Combat toxicity along with enemy players.
Starting today, the Call of Duty games will use AI to moderate voice chat behavior during multiplayer.
Voice chat moderation systems in Call of Duty: Modern Warfare II and Call of Duty: Warzone feature a beta version of Modulate's AI-powered ToxMod technology. Per the games' blog, ToxMod will "identify in real-time and enforce against toxic speech," such as discriminatory language and hate speech.
Following its beta period in North America for the two shooters, ToxMod will have its full global rollout (sans Asia) via Call of Duty: Modern Warfare III in November. Speaking to its use, the blog said ToxMod's inclusion will "bolster the ongoing moderation systems led by the Call of Duty anti-toxicity team."
Toxic behavior has been a part of online games (particularly shooters) for years and, in some ways, is part of Call of Duty's reputation. But as the blog notes, Infinity Ward has worked to curb toxicity within Modern Warfare II specifically: over 1 million accounts have had voice and text chat restricted due to conduct violations.
Xbox has made similar efforts to reduce player toxicity with more extensive reporting tools and twice-yearly transparency reports. However, those efforts have been handled by Xbox's human-operated Safety team.
That one of the biggest shooter franchises is adopting AI for player moderation suggests the technology can supplement game development work rather than outright replace it. Much depends, however, on how accurate the technology proves to be and how the Call of Duty developers use it going forward.