The online gaming industry is booming. With an annual growth rate estimated at 12.1%, the global gaming market is projected to reach 435 billion USD by 2028. While the video game industry has been a vibrant market since the 1990s, the Covid-19 pandemic brought an unprecedented change: during lockdowns, online gaming became a major channel for people to connect with friends and strangers, transforming gaming from mere entertainment into a social experience. However, serious problems have also emerged in these new social spaces.
An overwhelming majority of the gaming community reports having encountered online harassment while gaming. More dangerously, extremist content finds new forums for propagation and mobilization in these channels, as seen in recreations of mass shooting scenes with multiple connected devices. It is therefore more important than ever to pay attention to online trust & safety in the gaming industry.
Challenges to online gaming regulations
Effective strategies for regulating user-generated or interactional content are largely missing in the traditionally self-regulated online gaming industry. Conventional regulations on video games – such as age-rating systems based on the degree of violence, strong language, sexual content, and other illicit practices – only apply to content released by developers and are yet to extend to user-generated content. A rating system works well for console games that usually do not have user-interaction features. However, for games that involve multiple connected players and allow real-time interaction among them, an ex-ante rating system cannot evaluate the risk of exposure to harmful or illegal content created by other gamers.
Lists of banned words and user report systems are widely implemented across games, but both have considerable limitations. Banned-word lists are easily circumvented by invented slang and can snowball into censoring content that is not actually harmful or illegal. Report systems, meanwhile, often suffer from overwhelming volumes, inconsistent decisions by human moderators, and algorithms that fail to grasp nuanced cases.
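To illustrate both failure modes, here is a minimal, purely hypothetical sketch of a word filter; the banned list and chat messages are placeholders and are not drawn from any real platform's policy.

```python
# Hypothetical sketch of the two failure modes described above: an exact
# banned-word list is evaded by invented spellings, while a broader substring
# match starts censoring harmless content. The word list is a placeholder.

BANNED = {"kill"}

def exact_match_filter(message: str) -> bool:
    """Block a message only if a token is exactly on the list."""
    return any(token in BANNED for token in message.lower().split())

def substring_filter(message: str) -> bool:
    """Block a message if any banned string appears anywhere inside it."""
    return any(word in message.lower() for word in BANNED)

print(exact_match_filter("i will kill you irl"))   # True: caught
print(exact_match_filter("i will k1ll you irl"))   # False: slang spelling evades the list
print(substring_filter("gg, nice skills"))         # True: harmless chat is censored
```

Either variant fails in its own way, which is why static lists tend to be paired with human review and, increasingly, context-aware classifiers.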
Apart from specific technical implementation issues, business considerations also affect content moderation in online gaming. A key problem is that gaming platforms demonstrate very different standards in content moderation and are not governed by a clear and consistent regulatory framework. For example, Nintendo is famous for its particularly strict content moderation policy due to its family-friendly brand, whereas other studios that produce mature-rated or adult-only games hold a relatively tolerant attitude towards deviant speech and conduct.
Future trends in regulating online gaming
To address the unique “interactional risk” of online games with social features, a major trend is to combine child protection law with legal prescriptions for online content moderation, since children, who are particularly vulnerable, have long been active participants in the industry.
Germany’s youth protection law, amended in April 2021, now integrates the in-game communication environment into the reformed age-rating standard for video games: titles with unrestricted chat functions receive a higher age rating. Similarly, the UK Draft Online Safety Bill published in May 2022 gives special focus to online content accessed by children, stating that platforms hosting user-generated content have tailored duties to minimize the presence of harmful content, to report abusive content targeting children, and to assist law enforcement where needed.
In the European Union, another crucial change is to place the online gaming industry under the general regulations for online platforms that provide hosting services. The recent European Regulation on preventing the dissemination of terrorist content online and the upcoming Digital Services Act (DSA) will also impact the online gaming industry, irrespective of where gaming companies are established.
Indeed, under the DSA, gaming companies will now be obliged to:
- Set up user report or flagging systems that enable the submission of detailed and precise information about the flagged content;
- Set up complaint-handling systems to process complaints against their content moderation decisions (small and micro enterprises are exempt from this duty);
- Disclose information about their content moderation policies, procedures, measures, and tools to users;
- Publish transparency reports at least once a year, including the number of cases processed, the number of complaints received, the types of measures taken against flagged content, etc. (a minimal aggregation sketch follows this list).
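As a purely illustrative sketch, the snippet below shows the kind of yearly aggregation such a transparency report could contain; the field names, case categories, and report structure are assumptions made for illustration, not the DSA's reporting template.

```python
# Hypothetical aggregation of moderation cases into yearly headline figures.
from collections import Counter
from dataclasses import dataclass

@dataclass
class ModerationCase:
    content_id: str
    source: str        # e.g. "user_report", "trusted_flagger", "automated"
    measure: str       # e.g. "removed", "account_suspended", "no_action"
    complaint_filed: bool = False

def yearly_transparency_summary(cases: list) -> dict:
    """Aggregate processed cases into headline figures like those listed above."""
    return {
        "cases_processed": len(cases),
        "complaints_received": sum(c.complaint_filed for c in cases),
        "measures_by_type": dict(Counter(c.measure for c in cases)),
        "notices_by_source": dict(Counter(c.source for c in cases)),
    }

example_cases = [
    ModerationCase("c1", "user_report", "removed", complaint_filed=True),
    ModerationCase("c2", "trusted_flagger", "account_suspended"),
    ModerationCase("c3", "automated", "no_action"),
]
print(yearly_transparency_summary(example_cases))
```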
More importantly, where illegal content is detected, gaming companies are now expected to assume further responsibilities, including the obligation to:
- Process notices from trusted flaggers with priority and without delay (illustrated in the sketch after this list);
- Suspend the accounts of frequent offenders, as well as of users who frequently submit unfounded reports of illegal content or complaints;
- Promptly inform the authorities and provide all relevant information if they become aware of any serious criminal offense.
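The sketch below shows, under stated assumptions, how the first two duties might shape a notice-handling pipeline: notices from trusted flaggers are reviewed first, and accounts confirmed to repeatedly post illegal content are queued for suspension. The Notice fields, the handle_notices helper, and the three-violation threshold are hypothetical simplifications, not requirements taken from the DSA text.

```python
# Hypothetical notice-handling pipeline: trusted-flagger priority plus
# repeat-offender tracking. All names and thresholds are illustrative.
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class Notice:
    notice_id: str
    reported_account: str
    from_trusted_flagger: bool
    confirmed_illegal: bool   # outcome of review, heavily simplified here

REPEAT_OFFENDER_THRESHOLD = 3   # illustrative value, not prescribed by the DSA

def handle_notices(notices: list) -> list:
    """Review trusted-flagger notices first and return accounts to suspend."""
    # Sort so trusted-flagger notices are processed before ordinary reports.
    ordered = sorted(notices, key=lambda n: not n.from_trusted_flagger)
    violations = defaultdict(int)
    to_suspend = []
    for notice in ordered:
        if notice.confirmed_illegal:
            violations[notice.reported_account] += 1
            if violations[notice.reported_account] == REPEAT_OFFENDER_THRESHOLD:
                to_suspend.append(notice.reported_account)
    return to_suspend

# Three confirmed violations against the same account trigger a suspension.
notices = [Notice(f"n{i}", "player42", i % 2 == 0, True) for i in range(3)]
print(handle_notices(notices))   # ['player42']
```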
For “very large online platforms”, there will be extra requirements for risk assessments, independent audits, transparency reporting, etc., which may affect major players in the market, such as Microsoft, Sony, Nintendo, and Steam, if the industry keeps expanding at its current rate. In response to the DSA, the European gaming industry is calling for more detailed and nuanced regulations to address the complex and diverse services in the ecosystem. However, one key trend is certain: online gaming platforms will no longer remain self-regulated without direct government intervention, and they will be held accountable if they do not invest enough effort in combating their users’ illegal speech and conduct.
Tremau Policy Research Team