
How to reduce antisocial behavior in your game.

Reducing antisocial behavior in online environments can be difficult. In this post I describe how theory and research from social science can help game designers understand norms.

Travis Ross, Blogger

November 2, 2012

7 Min Read

Recently Bungie (Halo 4) and NCsoft (Guild Wars 2) have both taken a very aggressive stance on antisocial behavior in chat channels: both companies have begun to ban players who use hate speech. As far as I can tell, enforcing such bans can be difficult, and I haven't heard reports of how well it is going. Trolls and griefers can be crafty and may find ways outside chat to reduce the enjoyment of the community. Since these companies have taken some excellent steps toward getting rid of bad behavior, I wanted to help. To that end, I've compiled some suggestions for building more prosocial online communities, based on my own research and that of others in the field of social policy. Also, for more writing at the intersection of social science and games, check out my blog Motivate. Play. (shameless plug).

1: The overall goal should be to build community norms.

My dissertation focuses specifically on how game developers can use norms to reduce antisocial behavior in games' online communities. Norms are powerful in that they emerge from community behavior. When a norm exists, the community can reduce the cost of surveillance: a developer can rely on community members to report or even sanction (punish) players who are behaving badly. This reduces costs for developers and can empower community members, possibly increasing their feeling of self-determination.

2: There are two types of norms. Both are important.

That's right, there are two types of norms. The first are known as descriptive norms: these communicate what other players are doing. In other words, descriptive norms can be observed in the behavior of others (in chat channels they are broadcast). This information is extremely important for developers because humans have a propensity to copy others and to use relatively small amounts of social information to inform decisions and form scripts about their environments. This means that antisocial behavior can spread through a community, especially if it is intrinsically motivating, and that a small amount of bad behavior can set off a chain reaction of bad behavior. Think flame wars, white-knighting, etc. When people get mad at trolls and griefers, revenge or even good intentions can lead to more antisocial behavior.
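To make the chain reaction concrete, here is a toy simulation in the spirit of threshold models of collective behavior. Everything in it is made up for illustration: each simulated player starts misbehaving once the share of misbehavior they observe in chat crosses a personal tolerance, and a small seed of trolls can tip most of the population.

    # Toy threshold cascade: a small seed of misbehavior spreads because
    # players copy what they observe. All parameters are hypothetical.
    import random

    random.seed(42)

    N_PLAYERS = 1000
    SEED_TROLLS = 20           # 2% of players start out misbehaving
    SAMPLE_SIZE = 25           # chat lines each player observes per round

    # Personal tolerance: how much observed misbehavior it takes before a
    # player concludes "this is just how people act here" and joins in.
    tolerance = [random.uniform(0.05, 0.6) for _ in range(N_PLAYERS)]
    misbehaving = [i < SEED_TROLLS for i in range(N_PLAYERS)]

    for round_num in range(10):
        rate = sum(misbehaving) / N_PLAYERS
        for i in range(N_PLAYERS):
            observed = sum(random.random() < rate for _ in range(SAMPLE_SIZE))
            if observed / SAMPLE_SIZE >= tolerance[i]:
                misbehaving[i] = True   # sticky in this toy model
        print(f"round {round_num}: {sum(misbehaving)} / {N_PLAYERS} misbehaving")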

The second type of norm is called a social or injunctive norm. These are important because they put social pressure on individuals: they communicate what others expect. Yet games often lack clear communication of these norms, which draw their power from shame and social expectation. In addition, players in online environments seem less responsive to shame, as they probably recognize there are no lasting reputational implications. In the real world, social norms are generally also accompanied by sanctions for transgression.

I want to make an important note about these two types of norms: they should be treated as separate motivational forces. In addition, research indicates that descriptive norms outrank social norms. That is, if descriptive norms communicate that antisocial behavior is common, then social norms will not hold up; if enough people are behaving antisocially, no one will believe that other people expect them to be prosocial. Interestingly, existing social norms can unexpectedly collapse if the public perception of descriptive norms changes. Scott Page talks about how this can happen.

3: Sanctions

Given that the worst trolls and griefers actually like it when others respond to their behavior (see flame wars), social norms (the expectations of others) are not enough to stop them and may sometimes encourage them. To stop the worst trolls and griefers you must punish them by taking away what they enjoy. One option is for developers to pay staff to police and sanction deviants. This can work, but when there are many players or many games, policing can be expensive; telemetry and machine learning can certainly help.
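As a sketch of what the telemetry-and-machine-learning route might look like: assuming you already have chat logs labeled by human moderators, even a simple text classifier can triage the queue so humans review the worst messages first. The data, model choice, and threshold below are illustrative, not a production recommendation.

    # Minimal machine-learning triage for chat policing. Assumes a corpus of
    # moderator-labeled messages; the examples and threshold are hypothetical.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    labeled_chat = [
        ("gg everyone, fun match", 0),
        ("nice shot!", 0),
        ("<redacted slur> uninstall the game", 1),
        ("you are worthless, quit playing <redacted>", 1),
        # ...thousands more moderator-labeled lines in practice...
    ]

    texts, labels = zip(*labeled_chat)
    classifier = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
    classifier.fit(texts, labels)

    def needs_human_review(message: str, threshold: float = 0.8) -> bool:
        # The model only prioritizes the moderation queue; it should not
        # hand out sanctions on its own.
        p_toxic = classifier.predict_proba([message])[0][1]
        return p_toxic >= threshold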

Another option is for the players to do the sanctioning. After all, policing griefers and trolls isn't dangerous, and if norms of prosocial behavior are in place, players should feel expected to sanction others' bad behavior. Sanctioning systems can actually be complex and difficult to implement. Why? If they are too powerful and easy to use, they become a tool for griefing (ironic). If they are too weak or too costly, they won't be effective or won't be employed by community members. I won't go into all of the details of how to make a great sanctioning system; in fact, there are still many questions for community designers and researchers to address. However, I will give an example, followed by a code sketch after the list. Like any game design, this would need iterative testing:

Points for antisocial behavior

  1. Players can earn points for bad behavior.

  2. Other players can assign points.

  3. Points expire after an amount of time.

  4. Points are multiplied when multiple players in one session report a transgression.

  5. Players who cross a certain point threshold are hit with a graduated sanction (sanctions increase with the multiplier): first take away voice communication, then take away the game.

  6. Players can file an appeal within x days.

  7. Other players are given tools to research an appeal (telemetry data) and are paid in virtual currency for answering appeals. Three random players must review the appeal and come to a consensus; if they do not, it is passed to an actual customer service agent. Players earn trust ratings for arbitration.

  8. Players that lose arbitration hearings earn additional points.

  9. Players who sanction a player that wins an arbitration earn points that reduce their own ability to sanction, for a long period of time.
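Here is a minimal Python sketch of the system above. The thresholds, expiry window, and sanction tier names are placeholders, and the appeal flow (rules 6 through 9) is left out for brevity; like the list itself, all of it would need iterative testing.

    # Sketch of the reporting/point system above (rules 1-5). All numbers
    # are hypothetical placeholders that would need tuning.
    import time
    from dataclasses import dataclass, field

    POINT_LIFETIME = 30 * 24 * 3600    # rule 3: points expire (30 days here)
    SANCTION_TIERS = [                 # rule 5: graduated sanctions
        (10, "mute_voice_chat"),
        (25, "suspend_account"),
    ]

    @dataclass
    class Report:
        reporter_id: str
        session_id: str
        timestamp: float
        base_points: int = 1           # rules 1-2: players assign points

    @dataclass
    class PlayerRecord:
        reports: list = field(default_factory=list)

        def active_points(self, now: float) -> float:
            """Sum unexpired points, weighting same-session reports (rule 4)."""
            live = [r for r in self.reports if now - r.timestamp < POINT_LIFETIME]
            by_session = {}
            for r in live:
                by_session.setdefault(r.session_id, []).append(r)
            total = 0.0
            for session_reports in by_session.values():
                # Rule 4: distinct reporters in one session multiply the weight.
                multiplier = len({r.reporter_id for r in session_reports})
                total += sum(r.base_points for r in session_reports) * multiplier
            return total

        def current_sanction(self, now: float) -> str | None:
            """Rule 5: return the most severe tier the player has crossed."""
            points = self.active_points(now)
            sanction = None
            for threshold, action in SANCTION_TIERS:
                if points >= threshold:
                    sanction = action
            return sanction

For example, three players reporting the same match produces nine effective points rather than three, pushing the offender toward the first tier:

    record = PlayerRecord()
    now = time.time()
    for reporter in ("p1", "p2", "p3"):
        record.reports.append(Report(reporter, "match_42", now))
    print(record.active_points(now))     # 3 base points * 3 reporters = 9.0
    print(record.current_sanction(now))  # None: still below the 10-point tier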

4: Should sanctions be graduated?

One thing that is still uncertain in community management and research is whether players should be banned outright or whether sanctions should be graduated. There are arguments for both.

Ban Hammer

First, it seems that descriptive norms for antisocial behavior already exist in the chat channels of many online games: "That's just gamers being gamers," or "Antisocial behavior is normal in these games." Creating norms for players to sanction bad behavior may be difficult. And why would players sanction unless there are expectations that they should? There is something called the second-order free-rider problem, where group members don't sanction because sanctioning itself carries a cost. To get rid of the perception that antisocial behavior is normative or OK, the ban hammer may be required. In addition, sexism, racism, and foul language deserve severe punishment.

Graduated Sanctions

However, what if people can be rehabilitated? What if individuals are simply following the status quo for FPS games? In competitive environments it can be difficult to control one's emotions, and sometimes people get frustrated. Could this be a teaching moment? Do people deserve a warning? Could this actually help people be more prosocial in real life? One finding from social policy research is that very severe sanctions can actually be detrimental to a community: they don't allow for second chances and can frustrate people or create enemies. This is especially the case when descriptive and social norms for a behavior don't exist or are not clearly communicated; in other words, when players feel like they didn't get a warning or didn't understand what was expected of them, but still got the ban hammer. After all, antisocial behavior has been normative in these environments for some time.

Concluding

In conclusion, a significant amount of research exists that could help community managers build and sustain communities where prosocial behavior is normative. What I've talked about is really only the tip of the iceberg, and there are still many questions about behavior in online communities that need answering. It is up to game developers and researchers to keep trying to figure out how to construct societies that promote prosocial behavior.

If anyone thinks this is interesting and would like to apply it to their communities, I'd be happy to talk about it in more detail. Just leave a comment, tweet me, or shoot me an email.
