Online community and culture wars: What do we know?

What can we learn from the significant challenges the gaming community faces? Online gaming veterans Raph Koster, Gordon Walton and Rich Vogel share fascinating sociological lessons.

Leigh Alexander, Contributor

March 2, 2015

8 Min Read

What's been going on with the gaming community? Raph Koster, Gordon Walton and Rich Vogel are veterans of the online gaming space, and know a thing or two about how groups form and behave on the internet. At GDC, the three presented important findings for community managers about how to gain control over an increasingly depressing work environment.  

We now live in an age where the internet filters results for you based on assumptions about who you are, drawn from your geographic location and other patterns. This creates a phenomenon called a "filter bubble," says Koster, in which one's perception of the world is increasingly steered by online targeting. The average online user sees only those news sources -- and even political candidates -- that agree with their own views, and anyone who's ever Facebook-blocked a relative with offensive political views has become complicit in this sort of filtering.
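Koster doesn't describe any particular platform's algorithm, but the feedback loop he's pointing at is easy to sketch. Here's a toy Python model -- the scoring rule, numbers and names are invented for illustration, not any platform's actual ranking system -- in which a ranker favors stories close to a user's inferred leaning, and every click nudges that inference further in the same direction:

```python
# A toy sketch of engagement-driven filtering (not any real platform's
# algorithm). Each story has a "viewpoint" score in [-1, 1]; the ranker
# favors stories closest to the user's inferred preference, and every
# click pulls that preference further toward what was shown.
import random

def rank(stories, preference, top_n=3):
    """Return the top_n stories closest to the user's inferred preference."""
    return sorted(stories, key=lambda v: abs(v - preference))[:top_n]

random.seed(1)
stories = [random.uniform(-1, 1) for _ in range(200)]  # a mixed pool
preference = 0.1                                       # a mild initial lean

for day in range(10):
    feed = rank(stories, preference)
    # Clicking reinforces the inferred preference (a simple moving blend).
    preference = 0.8 * preference + 0.2 * (sum(feed) / len(feed))
    print(f"day {day}: feed={['%+.2f' % v for v in feed]} pref={preference:+.2f}")
```

Run it and the bubble forms by construction: within a few iterations, every recommended story sits close to the user's drifting preference, and the rest of the pool effectively disappears.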

In this climate, says Koster, the common context shared by disparate groups begins to erode, and homogenous groups crystallize. "As noble as we wish we are, we're not -- given the choice, people hang out with people like them," says Koster. 

"Given a limited population, over time, not only will we [form] groups that are like us, but the larger group will exterminate the other one," he says. "In simulations, that's what happens: They literally commit genocide, they literally chase everyone else out of the room. It's a distasteful fact about human nature, and if our definition about who we are is rigid, then you're going to have that conflict." 

People tend to attribute behavior to character, when in fact behavior is contextual, shaped by complex and deeply felt beliefs. The theory goes that people are most likely to treat another person well when they expect to see that person again: trust is established through repeated interactions.
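The talk doesn't name a formal model, but this is the standard iterated prisoner's dilemma argument: cooperation pays only when players expect to meet again. A minimal sketch, using the usual payoff matrix:

```python
# A minimal iterated prisoner's dilemma sketch of "repeated interaction
# builds trust" (the standard illustration; the talk names no model).
# tit_for_tat cooperates until crossed; always_defect never cooperates.
PAYOFF = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
          ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

def tit_for_tat(history):           # cooperate first, then mirror them
    return history[-1] if history else "C"

def always_defect(history):
    return "D"

def match(p1, p2, rounds):
    h1, h2, s1, s2 = [], [], 0, 0
    for _ in range(rounds):
        a, b = p1(h2), p2(h1)       # each sees the other's past moves
        pa, pb = PAYOFF[(a, b)]
        s1, s2 = s1 + pa, s2 + pb
        h1.append(a)
        h2.append(b)
    return s1, s2

print(match(tit_for_tat, tit_for_tat, 100))      # (300, 300): trust pays
print(match(tit_for_tat, always_defect, 100))    # (99, 104): exploited once
print(match(always_defect, always_defect, 100))  # (100, 100): mutual loss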

"This is how the world works, and some of this is uncomfortable. We doing community management have to deal with that, and part of the problem is that a lot of this has changed out from under the best practices we talked about 14 years ago," Koster says. "A lot of the things we take for granted as best practices just don't work anymore." 

For example, look at what free-to-play systems have done to the idea of persistent identity in games: the model is built on pulling in as many accounts as possible and churning through those that won't convert into paying users. Each time you log in you're part of a different group, with no attachment to your own online identity or anyone else's.

"Without friction of some type, you end up in a place where it's difficult to create a peaceful community," says Walton. 

"You can do anything you pretty much want, and that's a problem for communities in free-to-play," adds Vogel.

The disappearing best practices

Our previous ideas about managing online communities or players were rooted in that persistence, the barrier to entry for account creation and maintenance, and systems of reputation and reward for good practices. Those elements are the "good fences that make good neighbors," as the saying goes. If players don't need to be invested in what others think of them, they're less likely to do their part to keep a community healthy. 
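As a concrete illustration of that reputation-and-reward pattern, here's a minimal sketch; the point values, thresholds and tenure rule are invented for illustration, not drawn from any real system:

```python
# A minimal sketch of the reputation-and-reward pattern the panel
# describes: standing accrues slowly from positive contributions, drops
# sharply on abuse, and gates privileges -- so a persistent account is
# worth protecting. All names and numbers here are illustrative.
from dataclasses import dataclass

@dataclass
class Member:
    name: str
    reputation: int = 0
    joined_days_ago: int = 0  # barrier to entry: privileges need tenure too

    def record(self, action: str) -> None:
        points = {"helpful_post": 2, "accepted_answer": 10, "flagged_abuse": -25}
        self.reputation = max(0, self.reputation + points.get(action, 0))

    def can_moderate(self) -> bool:
        # Privileges are earned, so abandoning the account has a real cost.
        return self.reputation >= 100 and self.joined_days_ago >= 30

m = Member("vera", joined_days_ago=45)
for _ in range(6):
    m.record("accepted_answer")
m.record("helpful_post")
print(m.reputation, m.can_moderate())  # 62 False -> keep contributing
```

The design intent is the "good fence": once standing takes months to build and seconds to lose, players have a stake in what the community thinks of them.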

"In the real world, if you want to be a criminal you go to a big city, not in a town of sixty people," Walton adds. That's why large communities where users are most often anonymous tend to have the biggest culture problems. "What we've seen from the beginning is that scale matters. Everything we're talking about is fundamental human behavior." 

Without clear lines and well-bounded communities, people can become confused in ways that lead to conflict. With Kickstarter and Early Access games, for example, users turn hostile and controversies arise because the line between a "customer" and a "funder", between a creator and a user, has blurred; confusion about the different roles and relationships within the game industry gave rise to last year's "controversy that shall not be named."

When you look at modern communities, from Twitter and Reddit to Facebook and chan boards, all the best practices -- keeping identity persistent, maintaining a meaningful barrier to entry, defining specific roles, keeping groups well-bordered, and making full anonymity at least somewhat difficult -- have been thrown out the window, giving rise to toxicity online, the veterans say.

Facebook has remained steadfast against user demand for a "dislike" button, and Koster says this is actually a wise move: the ability to downvote can destroy communities, he believes. Heavily-censured users often abandon their accounts and start again without consequence or responsibility, and in bigger cases, as with Reddit's brigading wars, people band together to destroy communities they dislike, downvoting them into oblivion.

Mobbing behavior is also an issue online, from online harassment to call-out culture. Mobbing tends to single out those who are different, and working in fields with ambiguous standards increases one's risk of being mobbed. Of course, one person's harassment is another person's activism, and when people band together in opposition to negative behavior, it can be a positive experience for some. 

But for community managers, especially in games, mobbing is nearly always a bad sign, Koster warns: "It means you have a civil war on your hands, with heavy-duty consequences. Mobbing hurts a lot: Depression, sleep loss, weight loss, drug abuse, panic attacks... the thing to understand is how serious this can be for a victim," Koster says. "Think about the developers you work with and your own job, and this is part of why the churn rate for community managers is ridiculous. And what happens to your community if you allow this kind of thing to happen?"

What you can do

Perhaps there are no firm solutions just yet, but some things definitely work. Subdivide communities into manageable sizes and subcommunities -- no more than 150 regular commenters per forum, for example. Within subcommunities, minimize the reasons for different groups to be at odds, and offer "good fences" -- proper interfaces among communities, and meta-communities that serve to connect smaller ones. Offering a "honeypot," such as a single "general discussion" space that's allowed to be a nest of bad behavior, helps keep conflict-seeking users entertained and away from the others.
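Here's a toy sketch of that subdivision rule. The split-in-order policy is a deliberate simplification -- a real system would split along topic or social-graph lines -- and the cap echoes the ~150 figure from the talk:

```python
# A minimal sketch of "subdivide at human scale": once a forum's count of
# regular commenters passes the suggested ~150 cap, spin off sibling
# subforums. The naive in-order split is illustrative only.
DUNBAR_CAP = 150

def subdivide(forums: dict[str, list[str]]) -> dict[str, list[str]]:
    result = {}
    for name, regulars in forums.items():
        part = 1
        while len(regulars) > DUNBAR_CAP:
            result[f"{name}-{part}"] = regulars[:DUNBAR_CAP]
            regulars = regulars[DUNBAR_CAP:]
            part += 1
        result[f"{name}-{part}" if part > 1 else name] = regulars
    return result

forums = {"strategy": [f"user{i}" for i in range(400)], "lore": ["ann", "bo"]}
for name, members in subdivide(forums).items():
    print(name, len(members))  # strategy-1 150, strategy-2 150, strategy-3 100, lore 2
```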

Work around, not against, anonymity -- a "pseudonymity" with private reporting channels and persistent logins is preferable. Incentivize positive contributions through some kind of reward or reputation system.
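A minimal sketch of that pseudonymity pattern -- the storage and function names are illustrative stand-ins, not a real service's API. A stable public pseudonym is bound to a persistent private account, and reports flow through a channel only moderators see:

```python
# A minimal sketch of "pseudonymity": a stable pseudonym backed by a
# persistent account the service can hold accountable, plus a private
# reporting channel so grievances go to moderators rather than the mob.
# All names and storage here are illustrative stand-ins.
import hashlib

ACCOUNTS: dict[str, str] = {}        # pseudonym -> internal account id
REPORTS: list[tuple[str, str]] = []  # (reported pseudonym, reason), mods-only

def register(email: str, pseudonym: str) -> str:
    """Bind a public pseudonym to a private, persistent account id."""
    account_id = hashlib.sha256(email.encode()).hexdigest()[:12]
    ACCOUNTS[pseudonym] = account_id  # identity persists across sessions
    return pseudonym                  # only the pseudonym is ever shown

def report_privately(pseudonym: str, reason: str) -> None:
    """File a report that only moderators see -- no public pile-on."""
    if pseudonym in ACCOUNTS:
        REPORTS.append((pseudonym, reason))

register("vera@example.com", "NightOwl")
report_privately("NightOwl", "spamming the trade channel")
print(REPORTS)  # moderators act on this; the wider community never votes
```

The point of the design is that accountability and anonymity aren't opposites: the service knows who to sanction, while the community never learns anyone's legal identity and never gets a public downvote to weaponize.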

Encouraging downvoting, brigading or other negative feedback loops is a design mistake, the veterans warn. So is driving intentional tribalism -- you want communities to feel strong and proud, but avoid offering ways to create competition within the userbase itself. Instead, emphasize ecological thinking. Failing to provide identity structures that offer common ground among groups is also a "don't." 

While it may be difficult, put yourself in the other's shoes. "This is hard, but I think community managers in particular need to investigate the cultures of all the different constituencies that are coming into their game," advises Koster. "Because for better or for worse, you are the translators that need to speak all the languages, and tell them what is okay there."

"Softening language" can be helpful -- phrases like I'm afraid or in my opinion. Speak positively to frustrated people -- "we prefer this type of behavior" is a better tactic to reduce problems than "you're doing something wrong." 

Use "I" and "we" statements instead of "you" in general. Find ways to show the other person that you are truly listening to their point of view, like repeating their arguments back or saying "I hear you." 

Be firm: "You're in our house and there are rules." This is a way of dealing firmly with users who are acting out, without having to field their accusations of tone-policing. Be sure the rules are clear and visibly posted -- and be careful that moderating a user won't unnecessarily upset too many others and make a situation worse.

And finally, take precautions to protect your own personal information: unlist your phone numbers, get domain name WHOIS protection, disable your phone's geolocation features, and request removal from databases. This reduces your exposure if things go wrong and you become a target.

"I don't believe there's such a thing as a private social media presence anymore," Koster says. 

"Everything we do gets picked up by the press now, and they consider it PR," adds Vogel. "So there's a real balance about how you manage that internally, and getting approval for what you say on the forums. The time that takes is going to be a hurdle. You're a community now, not making single-product games, and it's a big learning curve for publishers." 

"I know this seems like a bummer in the end, and maybe these solutions aren't enough. We live in a rapidly-changing world, and I hope people [like Facebook and Twitter] realize things need to change, that these things have been mis-designed and encourage poor behavior," says Koster. "If you do end up on the pointy end of a mobbing, most of this is not going to be helpful enough. The best advice I have is to disconnect completely, and take care of yourself first. The battle will be there another day." 
