Regulating Online Gaming: Challenges and Future Landscape

The online gaming industry is booming. With an annual growth rate estimated at 12.1%, the global gaming market is projected to reach USD 435 billion by 2028. While the video game industry has been a vibrant market since the 1990s, the Covid pandemic brought unprecedented change to the industry. During lockdowns, online gaming became a major channel for people to connect with friends and strangers, transforming gaming from pure entertainment into a social experience. However, serious problems have also emerged in these new social spaces.

An overwhelming majority of gamers report having encountered harassment while playing online. More dangerously, extremist content finds new forums for propagation and mobilization in these channels, as seen in recreations of mass shooting scenes shared across multiple connected devices. It is therefore more important than ever to pay attention to trust & safety in the online gaming industry.

Challenges to online gaming regulations

Effective strategies for regulating user-generated or interactional content are largely missing in the traditionally self-regulated online gaming industry. Conventional regulations on video games – such as age-rating systems based on the degree of violence, strong language, sexual content, and other illicit practices – apply only to content released by developers and have yet to extend to user-generated content. A rating system works well for console games that usually do not have user-interaction features. However, for games that connect multiple players and allow real-time interaction among them, an ex-ante rating system cannot evaluate the risk of exposure to harmful or illegal content created by other gamers.

Lists of banned words and user report systems are widely implemented across games, but both have considerable limits. Banned-word lists are easily circumvented by invented slang and can snowball into censoring content that is not actually harmful or illegal. Report systems, meanwhile, often suffer from overload, inconsistent decisions by human moderators, and algorithms that fail to understand nuanced cases.
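To make these trade-offs concrete, the minimal Python sketch below shows how an exact-match word filter is defeated by a simple character substitution, while a more aggressive filter that normalizes substitutions and matches substrings starts flagging harmless messages. The word list, substitution table, and function names are invented for illustration and do not reflect any real moderation policy.

```python
import re

# Hypothetical banned-word filter, for illustration only.
BANNED = {"rat", "noob"}                             # placeholder terms, not a real policy list
LEETSPEAK = str.maketrans("013457@$", "oleastas")    # e.g. "n00b" -> "noob"

def token_filter(message: str) -> bool:
    """Flag only exact word matches; trivially evaded by invented spellings."""
    return any(tok in BANNED for tok in re.findall(r"\w+", message.lower()))

def aggressive_filter(message: str) -> bool:
    """Normalize leetspeak and match substrings; catches evasions but over-blocks."""
    text = message.lower().translate(LEETSPEAK)
    return any(word in text for word in BANNED)

print(token_filter("what a n00b"))                   # False: the substitution slips through
print(aggressive_filter("what a n00b"))              # True:  normalization catches it
print(aggressive_filter("let's celebrate the win"))  # True:  "rat" inside "celebrate", harmless text blocked
```

Tightening the filter in one direction loosens it in the other, which is why banned-word lists alone cannot carry the burden of moderating live interaction.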

Apart from specific technical implementation issues, business considerations also affect content moderation in online gaming. A key problem is that gaming platforms demonstrate very different standards in content moderation and are not governed by a clear and consistent regulatory framework. For example, Nintendo is famous for its particularly strict content moderation policy due to its family-friendly brand, whereas other studios that produce mature-rated or adult-only games hold a relatively tolerant attitude towards deviant speech and conduct.

Future trends in regulating online gaming

Given the unique “interactional risk” of online gaming with social features, a major trend is to combine child protection law with legal requirements for online content moderation, since children – long active participants in the industry – are among the most vulnerable gamers.

Germany’s youth protection law, amended in April 2021, now integrates the in-game communication environment into the reformed age-rating standard for video games: titles with unrestricted chat functions receive a higher age rating. Similarly, the UK Draft Online Safety Bill published in May 2022 gives special attention to online content accessed by children, stating that platforms hosting user-generated content have tailored duties to minimize the presence of harmful content, report abusive content targeting children, and assist law enforcement where needed.

In the European Union, another crucial change is to bring the online gaming industry under the general regulations governing online platforms that provide hosting services. The recent Regulation on addressing the dissemination of terrorist content online and the upcoming Digital Services Act (DSA) will also impact the online gaming industry, irrespective of where gaming companies are established.

Indeed, under the DSA, gaming companies will be obliged to:

  • Set up user report or flagging systems that enable users to submit detailed and precise information about the flagged content (an illustrative sketch of such a notice follows this list);
  • Set up complaint-handling systems for processing complaints against their content moderation decisions (small and micro enterprises are exempt from this duty);
  • Disclose information about their content moderation policies, procedures, measures, and tools to users;
  • Publish transparency reports at least once a year, including the number of cases processed, the number of complaints received, the types of measures taken against flagged content, etc.
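As an illustration of the first obligation, the hypothetical Python sketch below models the kind of detailed and precise information a flagging system might collect with each notice. The field names and structure are assumptions made for this example, not language taken from the regulation itself.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import Optional

# Hypothetical notice model, for illustration only; fields and names are assumptions.
@dataclass
class ContentNotice:
    content_url: str              # exact location of the flagged content (URL, chat/message ID)
    explanation: str              # substantiated reason the content is considered illegal or harmful
    reporter_name: Optional[str]  # reporter contact details, where provided
    reporter_email: Optional[str]
    good_faith_statement: bool    # reporter confirms the notice is accurate and complete
    submitted_at: datetime = field(default_factory=datetime.utcnow)

# Example notice as it might be stored before review by a moderator.
notice = ContentNotice(
    content_url="https://example-game.chat/room/123/message/456",
    explanation="Message contains targeted harassment against another player.",
    reporter_name="A. Player",
    reporter_email="player@example.com",
    good_faith_statement=True,
)
```

Collecting structured notices in this way also makes it easier to produce the aggregate figures (cases processed, complaints received, measures taken) that the annual transparency report requires.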

More importantly, where illegal content is detected, gaming companies are now expected to assume further responsibilities, including:

  • Process notices from trusted flaggers with priority and without delay;
  • Suspend the accounts of frequent offenders, as well as those of users who frequently submit unfounded reports of illegal content or complaints;
  • Promptly inform the authorities and provide all relevant information when they become aware of any serious criminal offense.

For “very large online platforms”, there will be extra requirements for risk assessments, independent audits, transparency reporting, etc., which may affect major players in the market, such as Microsoft, Sony, Nintendo, and Steam, if the industry keeps expanding at its current rate. In response to the DSA, the European gaming industry is calling for more detailed and nuanced regulations to address the complex and diverse services in the ecosystem. However, one key trend is certain: online gaming platforms will no longer remain self-regulated without direct government intervention, and they will be held accountable for failing to invest enough effort in combating their users’ illegal speech and conduct.

Tremau Policy Research Team
