Emerging Technologies

How we can tackle toxicity to create a more inclusive gaming environment

Akash Pugalia
Global President, Media, Entertainment, Gaming and Trust and Safety, Teleperformance
Farah Lalani
Global Vice President, Trust and Safety Policy, Teleperformance
Maria Polchlopek Mercado
Social Media and CX Director, Teleperformance

  • The gaming industry could be worth $321 billion by 2026.
  • But the cost of toxic content and abuse is steep for players and the industry.
  • We must forge a path towards a video game community that celebrates diversity, fosters inclusivity, and puts an end to toxic behaviour.

We all remember the simpler days of gaming: offline games powered by the excitement of in-person encounters. Today's scene is dramatically different in how and where players interact, from online encounters to multimodal and on-the-go platforms. The gaming industry is tipped to maintain its recent rapid growth and could be worth $321 billion by 2026.

With this growth, there have also been several challenges for the gaming industry, not the least of which is toxicity. Reach3 Insights, a market research company, stated that “77% of women gamers experience gender-specific discrimination when gaming, including name-calling, receiving inappropriate sexual messages, gatekeeping and dismissiveness.” The cost of toxic content and abuse is steep for players and the industry. Another report found that “28% of online multiplayer gamers who experienced in-game harassment avoided certain games due to their reputations for hostile environments and 22% stopped playing certain games altogether.”

The gender bias in gaming was highlighted by Maybelline NY, the global cosmetics brand, which ran an experiment to see how male gamers would be treated when their voices and profiles were modified to appear female. The experience of the two regular gamers was dramatically different from what they were used to: some fellow gamers abandoned the game as soon as they heard feminine voices, while others started spewing inappropriate messages, including insults and demands that they go back to performing “female chores.”

While progress is being made when it comes to tackling such toxicity, challenges still exist. Enhancing trust and safety in gaming is critical for inclusive evolution.

The state of content moderation in online video games

Trust and safety are not new in the video gaming world; however, there is no standard practice across the industry, leaving a lot of room for policy development and interpretation. A good example is the debate over proactive versus reactive moderation, with organizations weighing how to monitor and control user actions without undermining the natural autonomy of the game. At the same time, a lack of proactive policy could mean exposing gamers to inappropriate content while a report or escalation is pending.

As content moderation practices develop in the industry, here are some of the opportunities our team identified:

1. Comprehensive, yet simple, articulation of policies and enforcement actions

Where content moderation policies exist in the gaming universe, some lack coverage, definition or clarity in areas such as dehumanizing speech, incitement to violence and extremism, or are written in a way that is not easy to understand. Stakeholders need specific and easily understandable reasons for any action a company takes on their account or their content so they can improve their behaviour in the future. According to the ADL, 59% of adult gamers believe that regulation is necessary to increase transparency around how companies address hate, harassment and extremism.

2. Looking beyond content policies

We need to move from a content focus to an environment focus that takes a broader view. In multimodal games, for example, we need to look at voice, account creation characteristics and other signals to focus efforts on the user groups responsible for most of the toxic behaviour in games.

Focusing on these signals can immediately remove a lot of hateful content in a game, given that this behaviour is concentrated in a small minority of users: typically, less than 3% of users are responsible for 30% to 60% of toxic content across all types of platforms. Once that small minority has been dealt with, lighter interventions, such as nudges, warnings and other forms of group behavioural modification, can be used with the rest of the player base to stop any remaining toxic behaviour.
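
As a rough illustration of how that concentration can be acted on, here is a minimal sketch in Python (with an entirely hypothetical violation log and account IDs) that finds the smallest group of accounts responsible for a given share of confirmed violations, so enforcement effort can be focused on them first:

```python
from collections import Counter

# Hypothetical moderation log: one entry per confirmed violation,
# keyed by the offending account's ID (illustrative data only).
violation_log = [
    "acct_102", "acct_417", "acct_102", "acct_889",
    "acct_102", "acct_417", "acct_233", "acct_102",
]

def top_offenders(log, share=0.5):
    """Return the smallest group of accounts responsible for at least
    `share` of all confirmed violations, most prolific first."""
    counts = Counter(log)
    total = sum(counts.values())
    selected, covered = [], 0
    for account, n in counts.most_common():
        selected.append(account)
        covered += n
        if covered / total >= share:
            break
    return selected

# Accounts to prioritize for review or stronger enforcement action.
print(top_offenders(violation_log, share=0.6))  # ['acct_102', 'acct_417'] in this toy data
```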

3. Operationalization

When it comes to safety, much of the user experience can be attributed to the way that video game policies are defined and operationalized. Moderation of video game content will improve if policies are available in all languages for the markets where the platform is active; if policies are applied consistently across all users; if user reports are promptly adjudicated; and if there are quick and effective mechanisms of redress for people who disagree with content moderation decisions. Platforms must also be nimble, modifying or augmenting their policies frequently given the velocity of change the field has seen in such a short period.

Opportunities for solutions: building a safer video game environment

The good news is that there are opportunities for growth and improvement. One of the first recommendations to new operations starting their trust and safety journey is to build a diverse moderation team that reflects the gamer base. According to a study by the International Game Developers Association (IGDA), diverse teams are better equipped to understand and address the unique challenges faced by different groups of players. “Unintended bias towards specific groups of people, topics or context may be due to representation deficiencies or to the lack of social diversity in the group of moderators.”

Diversity in moderation teams has a quantifiable benefit to platforms and the user experience. It has been established that the absence or under-representation of women in moderation teams has a direct impact on the experience of the end user, in this case, gamers.

As the user base of gaming platforms broadens, addressing implicit bias in the application of policies through diverse moderation teams will benefit users, increase gamer loyalty to platforms and further platforms’ growth. It is also a reminder that diversity must extend beyond content moderators and permeate all levels of the organization. Ultimately, it is the work of many that will ensure the reshaping of gender norms across the ecosystem: “Rule-setting is subjective and reflects the biases and worldviews of the rule-setters.”

As we delve into the natural evolution of content moderation operations, the first step is to set clear and transparent community standards and codes of conduct, which all gamers must agree to before participating. The core principles are transparency and strong communication, reinforcing gamers’ awareness of the consequences of their actions.

Mike Pappas, CEO and co-founder of Modulate, suggests any external-facing community standards or codes of conduct be crafted for a broad audience, saying: “Many kids or other players, especially in games that match folks from very different cultural backgrounds, may simply not know which behaviours are or aren't toxic. When we talk about toxicity, the image of a ‘troll’ comes to mind, but more than 50% of disruptive content actually comes from these kinds of honest mistakes. This means a clear, readable, non-legalese Code of Conduct can be an incredibly powerful tool for level-setting community behaviour expectations and cutting down on toxicity.”

This is also a crucial time to define reporting protocols, empowering gamers to report toxic behaviour and fostering a culture of accountability. According to a survey by the Fair Play Alliance, 85% of players believe reporting systems are crucial for a positive online gaming experience. Developing the right workflows for reporting violative content from the outset, as sketched below, gives new platforms a strong start and a pathway to sustained success.
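
A minimal sketch of such a workflow, assuming hypothetical report categories and field names rather than any platform's actual schema, is a structured report that is validated and placed in a triage queue the moment a player files it:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical report categories a platform might expose in-game.
REPORT_REASONS = {"hate_speech", "harassment", "cheating", "spam", "other"}

@dataclass
class PlayerReport:
    """One in-game report filed by a player against another player."""
    reporter_id: str
    reported_id: str
    reason: str
    evidence: str = ""  # e.g. a chat excerpt or match ID
    created_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

def file_report(queue, report):
    """Validate the report reason and place the report in the triage queue."""
    if report.reason not in REPORT_REASONS:
        raise ValueError(f"Unknown report reason: {report.reason}")
    queue.append(report)

triage_queue = []
file_report(triage_queue, PlayerReport("acct_233", "acct_102", "harassment", "match_8841"))
print(len(triage_queue))  # 1
```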

In parallel, designing and implementing effective penalties for offenders, such as temporary or permanent bans, can act as a deterrent. As mentioned before, categorizing different types of offences and deploying measures accordingly will educate the community and create shared accountability. Consistent enforcement of these penalties, from nudges, warnings and other forms of group behavioural modification through to suspensions or bans, sends a clear message that harassment will not be tolerated.
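
A graduated enforcement ladder of this kind can be sketched in a few lines; the offence categories, strike counts and steps below are illustrative assumptions, not any platform's actual policy:

```python
# Hypothetical ladder: repeat offences escalate from nudges and warnings
# through suspensions to a permanent ban.
ENFORCEMENT_LADDER = ["nudge", "warning", "24h_suspension", "7d_suspension", "permanent_ban"]

# Offences assumed severe enough to skip straight to the strongest action.
SEVERE_OFFENCES = {"violent_threat", "extremism"}

def next_action(prior_strikes, offence):
    """Pick an enforcement action based on offence type and prior history."""
    if offence in SEVERE_OFFENCES:
        return ENFORCEMENT_LADDER[-1]
    step = min(prior_strikes, len(ENFORCEMENT_LADDER) - 1)
    return ENFORCEMENT_LADDER[step]

print(next_action(0, "insult"))          # nudge
print(next_action(2, "insult"))          # 24h_suspension
print(next_action(0, "violent_threat"))  # permanent_ban
```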

Leveraging artificial intelligence (AI) and machine learning algorithms can assist in content moderation. Automated tools scan chat logs, voice communications and in-game actions to identify and flag inappropriate behaviour. AI tools may be integrated to screen out toxic content directly and route content for human intervention. However, these solutions should always work in tandem with human moderators to ensure context is considered accurately.

Trained moderators still outperform large language models, especially for edge cases or where additional information is needed. For example, in areas requiring more linguistic or cultural context, such as hate and harassment, there is still a performance gap of over 20% between AI and human moderators, according to one dataset from OpenAI. Some AI-based tools are designed to let humans and AI complement one another, with the AI helping prioritize where moderators should devote their attention while leaving moderators the final say on what, if anything, should be done.
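
One way such a human-in-the-loop arrangement might look is sketched below; the toxicity scores and thresholds are assumptions made for illustration. Clear-cut violations are removed automatically, ambiguous content is queued for moderators with the riskiest items first, and everything else is left alone:

```python
# Illustrative thresholds; a real platform would tune these per policy area.
AUTO_REMOVE_THRESHOLD = 0.95
HUMAN_REVIEW_THRESHOLD = 0.60

def route(message, toxicity_score, review_queue):
    """Decide what happens to a message given an AI toxicity score."""
    if toxicity_score >= AUTO_REMOVE_THRESHOLD:
        return "auto_removed"  # unambiguous violation
    if toxicity_score >= HUMAN_REVIEW_THRESHOLD:
        review_queue.append((toxicity_score, message))
        review_queue.sort(reverse=True)  # moderators see the riskiest items first
        return "queued_for_human_review"  # humans keep the final say
    return "allowed"

queue = []
print(route("gg, well played", 0.02, queue))        # allowed
print(route("borderline trash talk", 0.71, queue))  # queued_for_human_review
```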

Trust and safety at the core of online video game growth

As the gaming industry continues to grow, it's imperative that gaming companies invest in comprehensive content moderation strategies that prioritize player safety and well-being. By leveraging technology, community engagement and diversity, the industry can work towards a future where gamers of all backgrounds can enjoy their favourite titles without fear of harassment or abuse.

The responsibility to create a safer gaming environment lies with everyone involved, from players, developers and policy-makers to moderators. Together, we can forge a path towards a video game community that celebrates diversity, fosters inclusivity and puts an end to toxic behaviour.

