* This content is AI generated. It is suggested to read the full transcript for any further clarity.
Ladies and gentlemen, to speak on Corporate Digital Responsibility and this relevant topic of responsibility in gaming, may I call upon the stage Mr Sumanta, Head of Public Policy, Digital Works. Ladies and gentlemen, he is Head of Corporate Affairs and Public Policy at HCW, a leading online gaming company in India. His mandate includes public policy advocacy, strategic external engagement, corporate affairs, and public relations. May I please call him on stage. Thank you.
Good morning everyone. First of all, thank you to the SKOCH Group for hosting this wonderful event and for inviting me to speak on responsible digital gaming.
Digital gaming, or online gaming: pick up the newspaper and you will most likely see a news article about it every other day. It is a very large sector. Almost 440 million people in India, that is over 44 crore, play some form of digital game. Between 10 and 12 hours a week are spent on digital games, and more than one lakh people are employed in the sector. So from an economic and social perspective, this sector is crucial not only to the growth story; it also has a social angle to it. That is something we want to tackle as part of a task force.
If we speak about why responsible digital gaming is actually needed, we have just heard about the size of the sector and the market potential. But digital gaming connects people across social, psychological, and financial streams, and therefore some responsible gaming framework is absolutely needed—to protect players from potential harm (social, psychological, and financial), to prevent underage gaming, to ensure games are fair and transparent, and to ensure the industry’s long-term sustainability.
While large platforms may already be following some of these parameters, the need is to establish a baseline that can be amplified through regulatory means, so that it becomes an industry standard. But before we get into that, we wanted to see whether it actually works. At the end of the day, you can tell consumers how they should behave, but do they actually listen?
So we analyzed data from some of the largest platforms to see how consumers respond to limits placed on them in a digital gaming environment. The first thing we checked was KYC norms. The government has very clear KYC norms, and we found that, on average, eight out of ten newly registered users are rejected by platforms because they fail to meet one or more KYC requirements.
If you look at the real-money gaming sector, many large platforms nudge customers to set limits on the amount of money they play with. On the chart, the blue bar represents the number of users who have set their own limits, while the orange bar represents the number of users who have come back to the platform asking for their limits to be increased. On average, only around 6% of players have asked for their limits to be increased.
Self-exclusion is a very important concept—not only for digital gaming, but for the entire digital ecosystem, whether it is OTT platforms or even social media. The aim is to ensure that people realize when they are spending more time than required on a digital medium, and therefore an option to self-exclude should be provided.
What we have seen is that, on average, among users who opted for self-exclusion, only 5% came back asking for the self-exclusion to be removed. Users who do not opt for self-exclusion are automatically analyzed by platforms for problematic behavior, and less than 0.5% of players who are mandatorily cooled off come back asking for a change.
What this shows is that customers actually respond positively to sensitization measures placed by digital platforms. So it is a matter of education, a matter of sensitization, and also a matter of establishing a baseline for the entire industry—not just for large legal operators, but also for smaller startups that are coming up, so that they have a reference point for their operations.
In this context, along with the SKOCH Group, a task force was set up comprising government representatives, academicians, think tanks, and civil society, to create a responsibility framework for digital gaming. Broadly, the Corporate Digital Responsibility framework for gaming includes parameters such as responsibility, integrity, gaming risk management, professionalism, marketing and advertising, social responsibility, mental health, and multilingual use.
We have also studied, across each of these parameters, where some of the largest developed economies stand—whether they have established baselines or not—and where India stands vis-à-vis those baselines. The rest of the framework is quite detailed, so I will not go into it here.
I would like to conclude by saying that Corporate Digital Responsibility is an extremely important governance framework for digital businesses in India, especially if they are looking to scale. What we have seen from existing data is that it absolutely works, and therefore we should encourage it and move forward with it.
Thank you.
Participants at the Indices for Viksit Bharat