Content Moderation for Online Gaming Communities
When it comes to content moderation, gaming platforms are undergoing a tremendous paradigm shift. To combat the isolation of lockdown, many gamers have found a much-needed outlet for human connection through online games and the communities built around them. This has led to a ‘pandemic gaming boom’, as an astounding 4 out of 5 US consumers reportedly played video games in the last 6 months of 2020.
At a time when many sectors of the economy are struggling, the video game industry is thriving. Global revenue this year is expected to jump 20% to $175bn (£130bn). Studies have shown that playing video games can have a positive effect on mental health and can help keep the mind and reactions sharp, so it’s no surprise gaming has become many people’s favorite lockdown hobby. The old stereotype that video games are isolating and gamers antisocial has been swept away as people turn to online games to build friendships and stay connected. The exceedingly popular Nintendo game Animal Crossing has played host to online birthday parties and even a wedding that had to be canceled in the ‘real world’ due to social-distancing restrictions.
However, like many communities, online gaming has its share of those who seek to offend, spread hostility, or act in a way that’s harmful to others on the platform. It’s the job of gaming moderators to step in and ensure these communities are an enjoyable and safe space for gamers of all ages and are free from harassment and toxicity.
1. What is gaming content moderation?
Many gaming consoles today come with safety settings, more commonly called parental controls, that allow parents or guardians to set time limits, block inappropriate games, and determine with whom users can interact. However, many online multiplayer games, such as Fortnite, League of Legends, Call of Duty, and Overwatch, rely on interpersonal communication to coordinate strategies, much like a real-world sports team. The safety settings and controls on gaming consoles do not monitor the conversations within such games. While the vast majority of interactions will be entirely appropriate, there is the potential for gamers, particularly younger ones, to fall victim to various kinds of bullying, abuse, cheating, harassment, or even grooming.
The entire purpose of a platform and community is reflected in strong community guidelines. They are, above all, the expression of what that online environment stands for (Pure Moderation – gaming content moderation partner)
1.1 Gaming content moderation applies to gaming forums, live gaming, and video streaming services
The success of a gaming community hinges on the experience of its users and a large part of whether or not users have a positive or negative experience depends on the behavior of other players. Gaming should be a chance for people to escape their troubles and worries for a couple of hours, particularly in difficult times like these. Bullying, personal attacks, harassment, and racist or sexist comments are antithetical to that desire.
Gaming moderation ensures that users of a particular game or forum act in an appropriate, non-offensive manner, and abide by the rules and guidelines of the community. An effective moderation system can also identify and ‘boot’ hackers and cheaters to maintain the game’s integrity and make it possible for gamers to enjoy a fair, wholesome, and immersive gaming experience.
Gaming moderation can be applied to live gaming, gaming forums, and live video streaming services used by gamers such as Twitch, Periscope, or Vimeo. Broadcasters on these platforms employ the services of moderators to ensure their channel’s chat meets behavior and content standards by removing offensive posts and spam that detracts from the quality of their stream.
1.2 Chat Room Moderation
Many gaming platforms have an in-game chat option that allows players to speak and chat in real-time, in groups, or privately. Some gaming firms employ a proactive filter that, in real-time, tries to censor content associated with cyberbullying, hate speech, personally identifiable information, and sexual language. However, that is not a perfect system. Simply put, technology struggles to comprehend the nuance and social context of voice or text and, as a result, might misinterpret intent. Fake news, misinformation, satire, and other context-dependent content do not have simple definitions, and there are grey zones for each of them. Depending on one’s upbringing, personal ethos, or mood, one person’s definition may differ from another’s.
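A proactive chat filter of the kind described above can be sketched very simply: pattern matching censors clear-cut violations in real-time, while anything that matches is also flagged so a human moderator can review the grey-zone cases. This is a minimal illustration, not a production system; the patterns below are hypothetical examples, and a real filter would rely on curated word lists, machine-learning classifiers, and per-language rules.

```python
import re

# Hypothetical blocklist for illustration only. A production filter would use
# curated lists, ML toxicity classifiers, and per-language rules.
BLOCKED_PATTERNS = [
    re.compile(r"\b(?:idiot|loser)\b", re.IGNORECASE),  # example insults
    re.compile(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b"),   # US-style phone numbers (PII)
]

def screen_message(text: str) -> tuple[str, bool]:
    """Censor matching spans and flag the message for human review."""
    flagged = False
    for pattern in BLOCKED_PATTERNS:
        if pattern.search(text):
            flagged = True
            text = pattern.sub("***", text)
    return text, flagged

censored, flagged = screen_message("Call me at 555-123-4567, idiot")
print(censored)  # Call me at ***, ***
print(flagged)   # True
```

The key design point is the second return value: rather than silently deciding, the filter escalates anything it touched to a human, reflecting the limits of automated judgment discussed above.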
2. Things to consider when developing a gaming content moderation strategy
A well-moderated group chat is integral to a positive gaming experience and, therefore, is a key factor of user retention, which in turn can have a large impact on revenue.
In 2020, as lockdown restrictions took hold across the globe, chat among gaming platform users increased dramatically as more and more people discovered one of the only ways to connect with people was through technology and digital solutions.
Human-only moderation teams found the upsurge difficult to handle, and while AI moderation exists, it often has trouble identifying the nuances surrounding online abuse and harassment. So, what is the best approach to gaming content moderation?
As we’ve mentioned before, AI moderation can’t do it alone and human moderation can be greatly aided through the application of machine learning, particularly during periods of peak activity. That means, first and foremost, a partnership between tech and human moderation is required for an effective solution.
Next, before implementing any kind of moderation strategy on a platform, there will be a need to identify and understand what kind of community it should be. What are the demographics of the target user base? A fast-paced shooter like Call of Duty will usually have its fair share of heated discussions, hazing, and banter. In a more passive and constructive gaming environment, such as the aforementioned Animal Crossing, such talk might seem offensive or aggressive. Terms like hate speech, sexual harassment, and abuse need to be carefully defined and made clear to your community right from the start. It’s also important to continually follow up with users and track discussions in forums to evaluate how much they are enjoying the game and community. At the same time, the need for users to feel secure must not interfere with the flow of the game. The action needs to be thrilling and unfold in real-time without play being halted for every minor infraction.
Lastly, as our online gaming worlds become more and more complex and detailed and as we head into the future of ‘the metaverse’, gaming content moderators will need to be highly specialized and comprehensively up-to-date with the latest technology. There is also the very real need for moderators to be protected, emotionally and psychologically, and receive dedicated training for handling and coping skills.
3. The role of a gaming content moderator
Gaming content moderators are responsible for, first and foremost, ensuring that the forums, chat rooms, and the game itself have a welcoming atmosphere and that gamers will want to return to the platform time and again. As online multiplayer gaming becomes more popular, a series of best practices have come to the fore.
Moderators need to take a proactive approach to keep online communities safe from harmful content. This requires sophisticated moderation tools to help with increased workflows, such as in-house filters and AI-fueled content moderation tools, but with a real person giving the final judgment on sensitive situations.
When sanctions need to be applied, moderators must act quickly. A typical progression might include the following steps:
- Warning: a first-time offense can draw a warning, which may involve removing the offending post (if posted in a forum) and a reminder of the community guidelines.
- Mute: if a player is guilty of repeated infractions in chat rooms or on forums, moderators can take away their ability to communicate with other players, with the length of time dictated by the severity of the infringement.
- Suspension: players who fail to respond to repeated warnings can be suspended, or ‘kicked’, from the game. The length of the exclusion will be influenced by factors such as the severity of the transgression or the recalcitrance of the user. For more severe cases, a report may be filed and reviewed, and additional sanctions may be applied against the player’s account.
- Ban: moderators remove a player’s access to a game or their account for an extended period of time. Bans are issued for repeated or extreme violations of the terms and conditions or rules of conduct. Account bans are typically permanent.
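The escalation described above can be modeled as a simple ladder keyed on a player's prior offense count, with severe violations skipping straight to a ban. This is a minimal sketch under assumed rules; real systems weigh severity, recency, and human moderator judgment rather than a bare counter.

```python
from enum import Enum

class Sanction(Enum):
    WARNING = 1
    MUTE = 2
    SUSPENSION = 3
    BAN = 4

# Hypothetical escalation ladder: each repeat offense moves one rung up,
# while a severe violation jumps straight to a ban.
def next_sanction(prior_offenses: int, severe: bool = False) -> Sanction:
    if severe:
        return Sanction.BAN
    ladder = [Sanction.WARNING, Sanction.MUTE, Sanction.SUSPENSION]
    return ladder[prior_offenses] if prior_offenses < len(ladder) else Sanction.BAN

print(next_sanction(0))              # Sanction.WARNING
print(next_sanction(2))              # Sanction.SUSPENSION
print(next_sanction(1, severe=True)) # Sanction.BAN
```

Encoding the ladder in one place keeps sanctions consistent across moderators while still leaving the `severe` judgment call to a human.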
4. Your reliable gaming moderation partner: Pure Moderation
Every platform that handles user-generated content needs to have a moderation process in place. Gaming platforms are no different. The moderation process protects users and gaming communities and ensures the pleasurable escape that gamers yearn for. Human interactions are complex and nuanced, so this isn’t an easy task. For the first time in gaming history, we are witnessing multiple generations of gamers interacting online, each one, for better or worse, bringing its own norms and standards of conduct. Not all groups see things the same way or agree on what constitutes dangerous or toxic behavior.
This is why we ensure moderators at Pure Moderation are fantastic communicators, with solid baseline skills and experience working with gamers from disparate backgrounds. We provide our moderators with extensive training so they can deliver seamless moderation that adds to the gaming experience rather than detracting from it. Our moderators can also track user sentiment and produce highly actionable insights that can boost customer retention and referrals.
Hate speech, abuse, and violence should not be accepted as the price we pay to play games on the Internet.
To learn more about outsourcing your gaming content moderation processes and the value that it can bring your business, contact Pure Moderation for a free consultation and trial.
💬 Chat with us on our website for a free consultation and trial.
or Email: firstname.lastname@example.org