Voice moderation can cost significantly more than text moderation, so it’s fair to ask whether moderating voice chat brings enough value to offset those increased costs. But given the myriad negative impacts of toxicity on player retention, engagement, game reputation, and even legal liability, voice moderation turns out to be a strongly net-positive investment, often delivering a return in a matter of months. Let’s explore how the positive impacts of ToxMod add up to something truly transformative for games of all sizes!
Proactive moderation is crucial to flagging and reducing toxicity in online gaming environments. Developed by the prosocial voice technology experts at Modulate, ToxMod is gaming’s only proactive voice moderation solution designed with player safety and privacy in mind. ToxMod’s advanced machine learning models triage voice chat data to flag bad behavior, analyze the nuances of each conversation to determine the most likely instances of toxicity, and enable moderators to respond quickly to each incident by supplying relevant, accurate context. Ultimately, ToxMod unlocks a host of benefits for gaming communities by making voice chat safer and more immersive for everyone.
ToxMod supports some of the top titles across social and competitive genres, including Among Us VR, Virtex Stadium, Breachers, Call of Duty, PokerStars VR, and Rec Room. Across our supported titles, we track the impact of ToxMod deployment on player retention.
Based on ToxMod data pulled from across titles focused on detecting and responding to severe or targeted toxicity, we know that 35% of players exposed to toxicity leave the game, which translates to roughly 10% of a game’s total player base churning each month due to toxicity. This aligns with the ADL’s 2022 report, in which 28% of players said they had left a game after experiencing toxicity, and Unity’s 2023 survey, which found that 43% of players churn after being exposed to toxicity.
How does player retention improve with ToxMod? The widely cited industry standard for D30 retention (the percentage of players still active after 30 days) is 5-10%. But once games integrate ToxMod, many of the 10% of players who would otherwise leave due to toxicity end up sticking around, resulting in a 7-15% relative uplift to D30 retention. In other words, if your previous D30 retention was 9%, you could reach roughly 10.5% D30 retention within the first few weeks of turning on ToxMod!
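To make that arithmetic concrete, here’s a minimal back-of-envelope sketch in Python using the ballpark figures above (the monthly exposure rate is inferred from them, not measured directly):

```python
# Back-of-envelope retention math using the illustrative figures above.
# These are ballpark industry averages, not guarantees for any one title.

churn_given_exposure = 0.35    # 35% of players exposed to toxicity leave
monthly_toxicity_churn = 0.10  # ~10% of the total player base churns monthly

# Implied share of the player base exposed to toxicity each month:
exposure_rate = monthly_toxicity_churn / churn_given_exposure
print(f"Implied monthly exposure rate: {exposure_rate:.0%}")  # ~29%

# D30 uplift: a 7-15% relative improvement on a 9% baseline.
baseline_d30 = 0.09
for uplift in (0.07, 0.15):
    print(f"{uplift:.0%} uplift -> D30 of {baseline_d30 * (1 + uplift):.2%}")
# The top of the range lands near the ~10.5% figure quoted above.
```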
We also see previously churned players return to the game after ToxMod is deployed, along with a 2-5% boost in engagement among long-time players.
While the exact degree of toxicity varies between games, even in relatively safe games like Minecraft, 31% of adult players report feeling comfortable genuinely expressing themselves only “sometimes or rarely,” and 11% say they never feel safe expressing themselves at all. Of course, these numbers can be much worse in mature or competitive multiplayer games like League of Legends.
What does a reduction in toxicity mean for player engagement? Studies have repeatedly shown that players spend more time and money in safer gaming environments. For example, research by Take This found that 61% of players choose to spend less money in games after experiencing hate speech or harassment, and 39% of players report that they never or nearly never spend money in games after experiencing severe toxicity. A study by UC Irvine’s Constance Steinkuehler found that players spend an average of $21.10 in non-toxic games but only $12.09 in toxic titles of equivalent genres – meaning less-toxic games can see roughly 75% more in-game spending. That’s a big increase!
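That boost follows directly from the two averages in the Steinkuehler study; here’s the quick worked calculation:

```python
# Average spend per player in equivalent genres (Steinkuehler study).
non_toxic_spend = 21.10
toxic_spend = 12.09

# Relative boost, measured against the toxic-game baseline:
boost = (non_toxic_spend - toxic_spend) / toxic_spend
print(f"Spending boost in less-toxic games: {boost:.0%}")  # ~75%
```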
Because toxicity is a significant threat to gaming environments and to players’ safety and wellbeing, regulatory bodies are moving to penalize studios that fail to adequately moderate toxic and harmful behavior.
Both the U.S. Congress and Australia’s eSafety Commissioner have inquired into several game studios’ safety practices to inform stronger regulations. India and Singapore have passed more stringent internet safety laws that impose a stronger duty of care on platforms. Failure to comply with the EU’s Digital Services Act (DSA) can result in penalties of up to 6% of annual global turnover, and under the UK’s Online Safety Act penalties can reach 10% of annual global turnover – huge fines for a studio of any size.
These fines aren’t just theoretical or far off. In late 2022, Epic Games agreed to pay $275 million to settle Federal Trade Commission (FTC) allegations that it violated the Children’s Online Privacy Protection Act (COPPA). That penalty wasn’t just for mishandling children’s personal data – Fortnite’s lack of safety protections for minors was equally relevant.
To meet the emerging standards of international regulatory bodies, game studios need to demonstrate rapid analysis, clear and consistent moderation rules, and proactive detection of illegal harms. Modulate works closely with game studios, as well as regional and international regulatory agencies, to provide clients with up-to-date regulatory guidance alongside our state-of-the-art technology.
Human moderators face the impossible task of sifting through player reports, audio files, chatrooms, and forums to ensure that players adhere to community standards. Without a way to prioritize high-risk reports, moderators can sink time and energy into the wrong tasks. ToxMod sifts through audio data directly to bring the most egregious violations of a game’s code of conduct to the top of the queue, helping moderators take action on the worst offenses first – the severity-first triage idea sketched below.
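As a rough illustration of that severity-first idea (a conceptual sketch only, not ToxMod’s actual implementation – the report fields here are hypothetical):

```python
import heapq

# A sketch of severity-first triage: reports are popped in order of
# descending severity so moderators see the worst offenses first.
# (Conceptual illustration only - not ToxMod's actual code.)
reports = [
    {"id": "r1", "severity": 0.42, "summary": "borderline trash talk"},
    {"id": "r2", "severity": 0.97, "summary": "targeted hate speech"},
    {"id": "r3", "severity": 0.71, "summary": "repeated harassment"},
]

# heapq is a min-heap, so negate severity to pop the highest first.
queue = [(-r["severity"], r["id"], r) for r in reports]
heapq.heapify(queue)

while queue:
    _, _, report = heapq.heappop(queue)
    print(f"{report['id']}: severity {report['severity']:.2f} - {report['summary']}")
```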
ToxMod also lets moderators quickly dismiss unactionable player reports. Take, for example, a falsified report claiming that someone used hateful language in voice chat when the alleged offender never used voice chat at all. ToxMod can quickly surface these details so that moderators don’t waste time hunting for associated audio clips. All of this helps moderators review and action legitimate reports 5-10x faster.
Beyond the time needed to review problematic content and player reports, moderators are forced to read and listen to harmful and sometimes extreme content for hours at a time, which understandably takes a toll on their wellbeing. ToxMod helps moderators home in on the important details of a case more quickly, so they spend less time exposed to harmful and mentally draining content.
As technology to combat toxicity improves, players and media outlets are becoming less willing to tolerate toxic gaming environments, and they are looking to studios to provide safe, enjoyable gaming experiences.
While players have raised questions about voice moderation, Modulate’s ToxMod is a known and trusted brand. ToxMod customers routinely monitor for negative sentiment or player churn when they first integrate the proactive voice moderation technology. Rather than players leaving upon ToxMod’s launch, the opposite has been observed: every ToxMod-enabled title to date has seen neutral-to-positive player reactions, and many have even seen previously churned players return.
On top of these benefits, most studios earn back the full annual cost of ToxMod within months of deployment. Let’s break it down with a hypothetical:
For a game with 1 million monthly active users (MAU), ToxMod might cost approximately $10,000 per month, or $120,000 per year. This is just an illustration, of course – because ToxMod’s pricing is based on audio usage, these numbers are a ballpark estimate for a typical competitive game with 1M MAU. In this scenario, we estimate the game should expect approximately:
350,000 toxicity offenses each month from roughly 120,000 offenders.
400,000 players experiencing toxicity, with 125,000 of them subsequently churning.
At $1 average revenue per MAU, preventing those players from churning could save the studio $125,000 each month – more than ToxMod costs for a full year in this scenario. In other words: ToxMod quickly pays for itself.
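Here’s that hypothetical as a quick sketch, using only the illustrative figures above (actual pricing is usage-based):

```python
# Hypothetical ROI for a 1M-MAU competitive title (ballpark figures only;
# actual ToxMod pricing is based on audio usage).
toxmod_annual_cost = 120_000   # ~$10k/month in this scenario
arpu = 1.00                    # $1 average revenue per monthly active user
players_churning = 125_000     # exposed players who churn each month

# Revenue retained each month if that churn is prevented:
retained_per_month = players_churning * arpu
print(f"Monthly revenue protected: ${retained_per_month:,.0f}")  # $125,000

# Months of retained revenue needed to cover a full year of ToxMod:
payback_months = toxmod_annual_cost / retained_per_month
print(f"Months to cover the annual cost: {payback_months:.2f}")  # ~0.96
```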