
Spirit AI and the promise of automated content moderation

At GDC this year, Spirit AI pitched their “AI for humans” to developers, with the promise of AI that could serve as a game mechanic or a content moderator.

Katherine Cross, Contributor

March 13, 2017


There’s been a bit of to-do about automated content moderation, particularly over whether AI can sniff out online harassment and address it, with accuracy and alacrity that would elude overworked human moderators. Such efforts have had mixed results at best; while the promise of such AI for gargantuan global platforms like Facebook or Twitter is there, we’re a long way away from effectively stamping out their most pestilent cesspools.

But what about for a (relatively) smaller, more socially homogenous space like an MMO?

At GDC this year, Spirit AI “de-cloaked,” in the phrase of one of its lead designers. From a sizeable booth positioned next to the IGF pavilion, they pitched their “AI for humans” to developers, with the promise of AI that could serve as a game mechanic or a content moderator.

The former involves natural language interaction similar to what one saw in Event[0]: an NPC that players can talk to in their own words by typing out comments and queries. They demonstrated this at GDC with an app called “Interrogation_”, where players questioned a robot accused of murder in a near-future sci-fi setting. While there were visible seams where the AI seemed to wheel spontaneously from one topic to another unrelated to the player’s input, there were more natural moments as well.

What I was really interested in, however, were the anti-harassment tools Spirit promised, and that was what I questioned Spirit’s developers about at length.

Ally is Spirit’s community moderation AI--if it lives up to its lofty promises (a big ‘if’, to be sure), it would precipitate a revolution in how massively multiplayer games confront toxic behavior. Spirit pledges that Ally will be able to respond in real time to cases of online abuse; an example I was shown involved a player being harassed by another who kept demanding a group invite. AllyBot popped into the victim’s messages asking if they were okay; the player responded ‘no’ and AllyBot proceeded to put the offending player on /ignore while offering a menu of other customizable moderation options, including filing a report to a human community moderator.
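To make that flow concrete, here is a rough sketch of the check-in sequence as it was described to me. Every class, function, and field name below is my own invention for illustration, not Spirit AI’s actual API.

```python
# Minimal sketch of the check-in flow described above. All names here are
# hypothetical illustrations of the described behavior, not Spirit AI's API.

from dataclasses import dataclass, field

@dataclass
class HarassmentCase:
    target: str                       # the player on the receiving end
    offender: str                     # the player sending unwanted messages
    actions: list = field(default_factory=list)

def handle_suspected_harassment(case: HarassmentCase, ask_player, apply_ignore) -> None:
    """Check in with the target; if they confirm, apply /ignore and offer further options."""
    answer = ask_player(case.target, "Are you okay?")
    if answer.strip().lower() == "no":
        apply_ignore(case.target, case.offender)          # the /ignore step
        case.actions.append("offered_moderation_menu")    # e.g. report to a human CM, block, adjust filters
```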

Ally is also supposedly more sophisticated than bots that simply match keywords, as those are easily circumvented or can lead to false positives. Instead, Ally promises to detect categories of language on the part of both the offender and the target. Refusal, for instance, is one of the most important categories Ally uses to moderate: what is a player saying “no” to, and why? Even if the offender is not speaking, or is attempting to circumvent keyword detection, if they are repeatedly told to stop doing something by someone else, Ally will take note of this and intervene.
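As a toy illustration of what keying off refusal (rather than the offender’s keywords) might look like in practice, the sketch below flags repeated “no”s from the target, however the offender worded their demands. The marker phrases, the crude substring matching, and the threshold are all my own assumptions.

```python
# Toy illustration of flagging on the target's language rather than the
# offender's keywords. Marker phrases and threshold are invented; the
# substring matching is deliberately crude and purely for illustration.

REFUSAL_MARKERS = ("no", "stop", "leave me alone", "go away", "stop asking")

def count_refusals(messages, speaker, addressee):
    """Count how often `speaker` refuses `addressee`, however the request was worded."""
    count = 0
    for msg in messages:
        if msg["from"] == speaker and msg["to"] == addressee:
            if any(marker in msg["text"].lower() for marker in REFUSAL_MARKERS):
                count += 1
    return count

def should_intervene(messages, target, offender, threshold=3):
    """Repeated refusals from the target trigger a check-in, whatever the offender typed."""
    return count_refusals(messages, target, offender) >= threshold
```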

This is also one of the cornerstones of its predictive analysis. A Spirit dev showed me a tool--which looks like a colorful, tabbed flowchart--that allows community managers to set their own triggers and rules for the AI monitor: modeling what racist harassment can look like, for instance, or how begging someone for gold might flow into outright harassment. But the AI can generate its own rules by looking at what players are refusing, and even if a refusal isn’t tied to a pre-existing category defined by the CMs, it may be something that Ally absorbs into its knowledge banks as a behavior to monitor. This, I was told, was what would allow Ally to adapt to the local cultures and distinct toxicities of specific games.
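My best guess at what one of those CM-authored rules might look like, expressed as data rather than a flowchart, is below; the category names, stages, and actions are purely illustrative, not anything Spirit showed me verbatim.

```python
# A guess at the shape of a CM-authored rule: a named escalation pattern and
# the response when it matches. Categories and actions are illustrative only.

GOLD_BEGGING_RULE = {
    "name": "gold-begging escalation",
    "stages": [
        {"category": "currency_request", "min_repeats": 3},   # repeated begging
        {"category": "target_refusal",   "min_repeats": 1},   # the target says no
        {"category": "insult_or_threat", "min_repeats": 1},   # begging turns hostile
    ],
    "on_match": ["check_in_with_target", "notify_human_cm"],
}
```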

Spirit’s developers told me that they were currently working with three live MMOs as partners, gathering data behind the scenes to beef up the AI’s threat-detection capacities. 

I asked if future MMO developers would be willing to integrate a tool that, in theory, might take months to ramp up to full effectiveness as it learns the patterns of a specific game. There’s no game for which the first month isn’t critical, but MMOs in particular must make a good impression that’ll keep players paying and playing in the game’s persistent world for the foreseeable future; does it help that cause if a centerpiece moderation tool isn’t effective until well after launch? Spirit’s devs insisted that Ally would be a great, well, ally, even at the point of launch; it’s been in development for eighteen months already, and Spirit’s MMO partners have been helping Ally learn for almost nine. 

We’ll just have to see what happens when Ally itself comes out of stealth-mode in these games. For the moment, there’s no bot interacting with players. Ally is simply operating behind the scenes learning about social patterns in MMOs and receiving rule-input from both Spirit’s developers and from the game’s community staff. 

Ideally, once AllyBot is running and interactive, it would facilitate the development of granular rules that are customized to the desires and needs of each player--up to and including players who don’t want to ever see or talk to AllyBot at all. But Ally would be constantly monitoring player behavior as a whole: not only how they talk to each other, but where and how they move in the game environment.
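One way to picture that per-player granularity is as a small preferences record each player carries, including a full opt-out from ever hearing from the bot. The field names here are my own shorthand, not Spirit’s schema.

```python
# Sketch of per-player moderation preferences, including an opt-out from the
# bot entirely. Field names are my shorthand, not Spirit AI's schema.

from dataclasses import dataclass

@dataclass
class PlayerModerationPrefs:
    allow_bot_contact: bool = True               # False = AllyBot never messages this player
    auto_ignore_confirmed_abusers: bool = True   # apply /ignore without asking again
    always_escalate_to_human: bool = False       # route every incident straight to a CM
```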

This is the basis for Ally’s other big videogame application: VR moderation. By tracking movement, it can detect when a player is making sexual gestures at another player and collate that with other physical data, such as the target trying to get away. Without having to track or identify a single word, Ally can, in theory, recognize that physical harassment is taking place and intervene.
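Here is a sketch of that word-free detection under my own assumptions about what the signals might look like: a flagged gesture plus evidence that the target is backing away. The gesture labels, sampling, and distance threshold are all invented for the example.

```python
# Sketch of combining movement signals without reading any chat: a flagged
# gesture plus evidence the target is retreating. Gesture labels and the
# distance threshold are assumptions for illustration.

import math

def is_retreating(target_positions, offender_positions, min_gain=2.0):
    """True if the target opened up distance from the offender over the sampled window."""
    start = math.dist(target_positions[0], offender_positions[0])
    end = math.dist(target_positions[-1], offender_positions[-1])
    return end - start >= min_gain

def physical_harassment_suspected(gesture_label, target_positions, offender_positions):
    flagged = gesture_label in {"groping", "crowding"}   # output of some gesture classifier
    return flagged and is_retreating(target_positions, offender_positions)
```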

I was told that Ally would have been able to put a stop to the much-publicized incident in QuiVR where a woman was sexually harassed by a male player who engaged in virtual groping, chasing her through the game world. As intriguing as that is, however, the real test will be whether Spirit’s AI can do more than react to yesterday’s toxicity. It will have to be able to predict and adapt to the novel ways people devise to engage in targeted harassment.

In the end this has always been the problem with automated moderation: enough people are determined to be hateful that they’ll game the system and exploit its weaknesses. The worst harassers on places like Twitter are often well-versed in where “the line” is and how to skirt its edges without quite going over. When sites like the Huffington Post rolled out automated threat detection, it bred a culture of nasty users spewing the same abusive words with characters swapped out to slip past the filters.
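As a toy illustration of that arms race: a filter that matches only exact keywords is defeated by trivial character swaps, and folding the swaps back is itself a guessing game that never quite keeps up. The substitution table and word list below are my own examples, not anyone’s production filter.

```python
# Toy illustration of the arms race around keyword filters: character swaps
# slip past an exact match unless they're folded back first, and the variants
# multiply faster than any table. Mappings and word list are examples only.

SUBSTITUTIONS = str.maketrans("013457@$", "oieastas")   # 0->o, 1->i, 3->e, 4->a, 5->s, 7->t, @->a, $->s

def normalize(text: str) -> str:
    return text.lower().translate(SUBSTITUTIONS)

BLOCKLIST = {"idiot"}   # stand-in for a real word list

def matches_blocklist(text: str) -> bool:
    return any(word in normalize(text) for word in BLOCKLIST)

# matches_blocklist("1d1ot") -> True, but "i d i o t" still slips through.
```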

What gives me some measure of optimism about Spirit’s approach is that it’s not selling a set-it-and-forget-it system, but one that relies on human moderators to both set and adjust its parameters, and to continue to exercise judgment about what constitutes a material violation of rules. Ally is capable of escalating an incident to the attention of a human moderator, and without them no action will be taken against a player’s account. In this regard it operates similarly to Twitch's fairly successful AutoMod protocol.
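A minimal sketch of that human-in-the-loop constraint, with invented queue, field, and action names: the bot can escalate and soften the target’s experience on its own, but anything punitive against an account waits for a moderator’s decision.

```python
# Sketch of the human-in-the-loop constraint: the bot can escalate an incident,
# but penalties against an account only follow a moderator's decision.
# Queue, field, and action names are invented here.

ESCALATION_QUEUE = []

def escalate(case, evidence):
    """Package the incident for a human CM; nothing punitive happens automatically."""
    ticket = {"case": case, "evidence": evidence, "status": "pending"}
    ESCALATION_QUEUE.append(ticket)
    return ticket

def resolve(ticket, moderator_decision):
    """Only a moderator's explicit decision leads to action against the account."""
    ticket["status"] = moderator_decision   # e.g. "warn", "suspend", "dismiss"
```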

The point of Ally seems to be to make CMs’ jobs easier rather than to replace them outright. But this also raises the question: what will it do that CMs aren’t capable of? How does it extend their reach or amplify their power?

The aggregation of data seems to be the key, presenting a kind of global data map that will allow for quick but granular responses to emerging problems. There is also plenty of merit in making the reporting and ignoring process as low-impact as possible for players. Ally might also be able to facilitate the quick silencing of spam bots in high traffic areas, like people advertising the illegal sale of in-game currency--though that might require giving Ally more autonomy to ban players than it seems to have at the moment. Another application might be for gathering research data--as a social scientist who looks at online gaming settings, I can’t begin to describe how valuable such a comprehensive big-picture look at a game’s social dynamics would be.

A good tool doesn’t replace its wielder so much as amplify their capabilities. If Ally can do this for that much beleaguered class of community managers, it’ll be all to the good. But Spirit will need more than a bit of its namesake to confront the many challenges that await this system.
