Toxicity is eroding video game communities. First-time players exposed to harassment in their first session can be up to 320% more likely to churn. In the following blog, I explain how you can protect your players and reap the benefits of doing so.
This article was originally published on The PX Hub, my personal blog specializing in content on Player Experience in game development. Find insights, best practices and handy tips to help you ensure your players stay happy and engaged.
Toxicity is eroding video game communities. When CD Projekt Red’s developers received death threats over another delay of Cyberpunk 2077, the media and our industry reacted with astonishment at the level of vitriol spouted online. It was not too long ago that angry trolls harassed the cast of The Last of Us Part 2 and Laura Bailey received death threats. This raises the question: if those who create games are on the receiving end of such online depravity, what do players experience?
Many might argue that toxicity only involves high-profile studios or games with extremely competitive gameplay. Unfortunately, the rabbit hole goes a lot deeper. Deviant behaviour in video game communities also has an impact on player retention and lifetime value (LTV). More strikingly, a 2014 study on League of Legends suggests that players exposed to abusive language or harassment in their first session are 320% more likely to churn immediately and never return; in other words, toxicity in a player's first session can make them more than four times as likely to quit.
To curb toxicity, we first need to understand it better.
A few months ago, Vice ran an article by Jess Morrissette on how games marketing invented toxic gaming culture decades ago: edgy advertisements that promoted trash talk sold toxicity and harassment as value propositions for online gaming, and studios, publishers and platforms all took part. Not even John Romero’s renowned Daikatana escaped the trend of smack-talk marketing, something he later deeply regretted and apologised for. Games marketing has fortunately taken a turn for the better since then. Online toxicity, however, has remained, and it is a real challenge that many great studios and Player Experience leaders face daily.
A recent study revealed that toxicity runs rampant in free-to-play games. The survey, conducted by the Anti-Defamation League (ADL) in collaboration with games analytics firm Newzoo, found that 74% of US online gamers have experienced some form of harassment when playing online. Among its other findings:
65% of people playing video games online have experienced “severe and/or sustained harassment, such as physical threats and stalking”.
53% of those who reported harassment said they were targeted for their “race, religion, ability, gender, gender identity, sexual orientation, or ethnicity”.
29% of people surveyed reported being doxed (having personal or identifying information published with malicious intent) at some point while playing games online.
23% of people surveyed reported “exposure to extremist ideologies and hateful propaganda”.
9% of people surveyed said they experienced “exposure to discussions about Holocaust denial”.
The expectation is clear: more than half of the players surveyed believe that video game studios are responsible for player safety, inclusiveness and content moderation, and should do more in all three areas.
Despite the efforts of Riot Games, which for years was “leading the charge” with a variety of solutions to curb toxicity in its community, reports still surface showing that toxicity remains a problem. And even though Valorant, Riot Games’ first big new game in over a decade, launched without robust anti-toxicity features, the company seems determined to find ways to make the online player experience more inclusive.
But Riot Games is not alone in this. Companies like Blizzard and Ubisoft are fighting back hard against toxic behaviour, both reactively and proactively. Valve too has stepped up its game, releasing auto-mute features for voice chat in CS:GO and turning to AI to moderate and contextualise online verbal abuse. And then there is Amazon, which caught everyone off-guard by patenting a mechanism for isolating players into separate pools based on their behaviour.
It comes as no surprise that building strong communities has a powerful influence on player retention. Facebook reports that online social connections and a sense of belonging affect not only retention but also spend. If that is the case, it stands to reason that anti-toxicity features improve the overall player experience and drive a higher LTV.
Two Hat, a company that provides AI-powered content moderation and online safety platforms, believes so. It released a research paper based on the in-game community data of a top-10 mobile action RPG (which you can download here). The paper states that players who participate daily in moderated chats show an LTV up to 20 times higher, play 4 times as many daily sessions, and have average session lengths 60% longer.
So what can gaming studios do to protect their community and brand while banking on more highly engaged players? According to research by Rebecca Chui, a lawyer and former Microsoft user experience designer, anonymous online communities are not toxic by nature; rather, online toxicity and a culture of harassment require community norms that allow for them. So how do we address that?
Determine the Community’s Voice early on and create Community Guidelines that set the tone for what is acceptable.
Make sure to provide a Parent Portal and Parental Controls when the audience requires parental guidance.
Invest in a moderation tool that proactively filters undesirable content when you allow user-generated content (names, profile pictures, chat) in the game (see the first sketch after this list).
Provide Block, Ignore, Mute or Report functionality as an added safety feature in games (see the second sketch after this list).
Invest in professional Community Managers and Moderators for prioritisation and triage of content queues, increased efficiency and improved ROI in Community Health.
Implement an endorsement or user reputation system (see the third sketch after this list).
Encourage positive and inclusive behaviour through game design and game experiences.
Team up with Influencers who exhibit positive behaviour.
Create a studio culture of Inclusion and Diversity. It will help you make community-related decisions with inclusivity in mind.
Join other video game studios at the Fair Play Alliance.
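To make a few of these recommendations concrete, here is a minimal TypeScript sketch of the proactive content filtering mentioned above. Everything in it is illustrative: the denylist, the leet-speak map and the normalisation rules are stand-ins, and a production system would lean on a dedicated moderation platform (such as Two Hat's) and human review rather than a hand-rolled filter.

```typescript
// Illustrative sketch only: a tiny proactive chat filter.
// Real moderation platforms use large, localised, continuously
// updated term lists plus ML classifiers; every rule here is a stand-in.

type FilterAction = "allow" | "block";

interface FilterResult {
  action: FilterAction;
  reason?: string;
}

// Stand-in denylist; real lists are curated by moderation teams.
const DENYLIST = ["kys", "uninstall and die"];

// Map common leet-speak substitutions back to letters before matching.
const LEET: Record<string, string> = {
  "0": "o", "1": "i", "3": "e", "4": "a", "5": "s", "7": "t", "@": "a", "$": "s",
};

function normalise(message: string): string {
  return message
    .toLowerCase()
    .replace(/[013457@$]/g, (c) => LEET[c] ?? c)
    .replace(/(.)\1{2,}/g, "$1$1"); // collapse "diiiiie" into "diie"
}

function filterMessage(message: string): FilterResult {
  const normalised = normalise(message);
  for (const term of DENYLIST) {
    if (normalised.includes(term)) {
      return { action: "block", reason: `matched "${term}"` };
    }
  }
  return { action: "allow" };
}

// The leet-speak evasion "ky5" is caught; spaced-out evasions are not,
// which is one reason hand-rolled filters fall short.
console.log(filterMessage("just ky5 already")); // { action: "block", ... }
```

A filter like this is only a first line of defence; evasions such as inserted spaces or Unicode lookalikes are exactly why dedicated tooling and human moderation queues exist.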
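In the same hedged spirit, here is a sketch of the Block, Ignore, Mute and Report plumbing. The SafetyService shape and its method names are hypothetical; the point is that delivery-time mute checks and structured report capture are cheap to build and feed directly into the moderator triage recommended above.

```typescript
// Illustrative sketch: per-player mute lists plus a report queue.
// All names (SafetyService, Report) are hypothetical.

interface Report {
  reporterId: string;
  targetId: string;
  reason: string;
  createdAt: Date;
}

class SafetyService {
  // playerId -> set of players they have muted or blocked
  private mutes = new Map<string, Set<string>>();
  private reportQueue: Report[] = [];

  mute(playerId: string, targetId: string): void {
    const muted = this.mutes.get(playerId) ?? new Set<string>();
    muted.add(targetId);
    this.mutes.set(playerId, muted);
  }

  unmute(playerId: string, targetId: string): void {
    this.mutes.get(playerId)?.delete(targetId);
  }

  // The chat layer calls this before delivering each message.
  shouldDeliver(senderId: string, recipientId: string): boolean {
    return !(this.mutes.get(recipientId)?.has(senderId) ?? false);
  }

  // Reports land in a queue for Community Managers to triage.
  report(reporterId: string, targetId: string, reason: string): void {
    this.reportQueue.push({ reporterId, targetId, reason, createdAt: new Date() });
  }
}
```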
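And finally, a sketch of an endorsement-style reputation system, loosely in the spirit of features like Overwatch's endorsements. The weights, thresholds, decay factor and starting score are invented for illustration; a tier output like this is also the kind of signal that behaviour-based matchmaking pools (as in the Amazon patent mentioned earlier) could key off.

```typescript
// Illustrative sketch: endorsement-based reputation with decay.
// Weights, thresholds and the starting score are invented numbers.

class ReputationTracker {
  private static readonly STARTING_SCORE = 10;
  private scores = new Map<string, number>();

  private score(playerId: string): number {
    return this.scores.get(playerId) ?? ReputationTracker.STARTING_SCORE;
  }

  // Teammates endorse positive behaviour after a match.
  endorse(playerId: string, weight = 1): void {
    this.scores.set(playerId, this.score(playerId) + weight);
  }

  // A verified report costs more than a single endorsement grants,
  // so recovering reputation takes sustained good behaviour.
  penalize(playerId: string, weight = 5): void {
    this.scores.set(playerId, Math.max(0, this.score(playerId) - weight));
  }

  // Periodic decay keeps the score reflecting recent behaviour.
  decayAll(factor = 0.95): void {
    for (const [id, value] of this.scores) {
      this.scores.set(id, value * factor);
    }
  }

  // Chat privileges or matchmaking pools can key off tiers.
  tier(playerId: string): "trusted" | "neutral" | "restricted" {
    const s = this.score(playerId);
    if (s >= 25) return "trusted";
    if (s >= 5) return "neutral";
    return "restricted";
  }
}
```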
When managed properly, social features and in-game communities can bring significant benefits in engagement and increased LTV. Left unchecked, however, they may pose a risk to your audience, brand and reputation.
What are you doing to keep your players safe? Let me know in the comments below!
If you enjoyed reading this article, make sure to head over to The PX Hub to read more of my thoughts on Player Experience, follow The PX Hub on LinkedIn or subscribe to the newsletter.