Web 1.0 Site Moderation is no longer viable in a Web 3.0 world. There is too much data and there are too many concurrent users to expect humans to keep up with massive content exchange 24/7. Community Management software and humans are needed to succeed.
Q. What has 100,000 legs, 500,000 fingers, and spends at least 13 hours a week playing minimally supervised online social games?
A. 50,000 concurrent gamers on a popular MMO.
The number of concurrent players logging in to popular online games has skyrocketed in the past 10 years. Unfortunately, the online moderation methods used on many sites have not kept pace with the industry, and for many years the industry offered no commercial moderation tools at all.
The Web 1.0 method of using swear filters and scheduling a handful of Game Masters or Moderators to work during peak hours can no longer meet the needs of 3,000 or more concurrent users in the online social gaming world. Even with a ratio of 1:2000 (one GM to every 2,000 concurrent users), a busy site with 50,000 concurrent players would need 25 GMs on duty every peak hour just to make a dent in addressing user needs. After reading this article, if you still think Web 1.0 moderation is acceptable you might want to consider getting your fur trimmed and your teeth sharpened. “Just sayin’…”
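For those who like to see the arithmetic, here is a quick back-of-the-envelope sketch in Python. The 1:2000 ratio and the concurrency figures are the same assumptions used above; this is illustrative only, not anyone's actual staffing model.

```python
# Back-of-the-envelope staffing estimate, assuming a 1:2000 GM-to-user ratio
# and the peak concurrency figures cited above (illustrative only).

def gms_needed(concurrent_users: int, users_per_gm: int = 2000) -> int:
    """Number of GMs needed on duty to hold the target ratio (rounded up)."""
    return -(-concurrent_users // users_per_gm)  # ceiling division

for peak in (3_000, 10_000, 50_000):
    print(f"{peak:>6} concurrent users -> {gms_needed(peak)} GMs on duty")
# 50,000 concurrent users works out to 25 GMs per peak hour.
```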
Online Gaming Industry Projected Revenue
Now a $52 billion a year industry, online gaming is expected to reach $86 billion by 2014. Young people are the largest demographic in online gaming, with a whopping 98% of children having access to a gaming console and 95% having home computer access. MMOGs, MMORPGs and online gaming services such as EverQuest (EQ), EVE Online, Habbo, Xbox Live, RuneScape, Club Penguin, Moshi Monsters and Miniclip see no fewer than 10,000 and in many cases up to 50,000 concurrent players during peak hours. In 2009, World of Warcraft claimed 1 million concurrent users on its English-language servers alone.
A 10-Second History of Web Moderation Tools
In the early days, the tools were few and primitive. There wasn’t much out there to help communicate with users individually or globally apart from e-mail, basic filtering software, Instant Messaging and System Alerts. The aforementioned methods are not effective or efficient, for 1,000+ reasons, and I’m out of time. See my blog post entitled The Not-So-Brief History of Web Moderation Tools (coming soon!).
Online Community Watch System = Web Community Management 1.0
The current moderation system used by many gaming and social sites depends upon users to report bad behavior. Statistics show that fewer than 17% of serious criminal abuse victims offline will report abuse. Though sites receive thousands of non-serious reports each day, they still depend upon community members or victims of serious online abuse to report the problems they experience online. This is basically an ‘online community watch system,’ and it is not terribly effective.
Young people generally won’t report serious abuse because they don’t want to lose their online social lifeline. They also don’t report serious abuse because they are humiliated or feel responsible for it. They report a fear of not being believed, or often cite “nobody will do anything” as a reason not to report serious crime and/or abuse.
Just because we don’t witness or hear about serious abuse on a site doesn’t mean it isn’t happening. And now we face the argument of “actual knowledge.” Many sites are advised not to use advanced moderation technology so they can legally say they have no “actual knowledge” of what is happening within their virtual walls. In addition to being an irresponsible business practice, the head-in-the-sand approach can come back to haunt the business. Luckily, the amount of online abuse and serious crime remains a fraction of offline serious crime and abuse. But it DOES occur, and it will create problems for a social gaming site eventually. It’s a numbers game. Sooner or later, everyone’s number comes up. If the site is prepared and considers user safety its responsibility, it will fare much better when the other shoe drops.
A word about law enforcement: working closely and cooperating fully with law enforcement is essential. Law enforcement sees and deals with cybercrime we’d prefer never to know about. It is the responsibility of every online social and/or gaming site to fully cooperate with and respect the needs of law enforcement regarding cybercrime. Burying our heads in the sand only creates victims and perpetuates fear. Though serious perpetration online is comparatively rare next to serious crime offline, using available technology to take every possible precaution to keep users safe is the responsible choice. Thankfully, our forward-thinking industry continues to develop commercial community management software, and its uptake among site operators is growing significantly.
20 Years On: Community Management Software
Fast forward to 2010. Today, after nearly 20 years of online commercial web services, third-party commercial vendors such as Crisp Thinking offer sophisticated community management software. In this article, I will reference Crisp's Community Management Platform (formerly called 'NetModerator').
Community management software can make the Moderator and Game Master’s jobs at least 80% more efficient. Here’s how it works:
User-Generated Problem Reports
Within the average online gaming site, users/gamers are able to report problems via the site’s reporting tools. The user-generated report is normally sent to Moderators or Game Masters, who process the reports. Once investigated, the report may or may not require action.
Based on professional experience and interviews with 10 industry experts, it is generally agreed that 90-95% of user-generated reports range from not serious to bogus or false. Examples of “not serious” reports include phrases such as “I lost my password!”, “What happens when I push this button?” or “How do I change my avatar?” We’ll refer to such reports as ‘non-actionable reports.’
For comparison, let’s look at two popular gaming sites: Habbo (virtual teen gaming and social sites) and MoshiMonsters.com (kids’ gaming and social site). Between 2001 and 2007, the Habbo teen sites averaged about 93% non-actionable reports daily. The kids’ gaming site MoshiMonsters.com averages approximately 94% non-actionable reports each day, receiving over 5,000 user-generated problem reports per day and about 10,000 per day on peak (weekend) days. MoshiMonsters numbers are used for the illustration below.
The Cost of User-Generated Problem Reports
Each problem report requires a staff member to read it, assess it and take some sort of action, even if that action is simply closing the report. Let’s assume the site has amazing, efficient proprietary admin tools and a Moderator can read and action 2 reports per minute. That’s 120 reports per hour. If we assume a busy site receives 4,000 reports per day, it would take approximately 33.3 labor hours per day to address the reports. If a site employs experienced, vetted staff members, the approximate cost to the company would be $399.60 per day (33.3 hours at roughly $12 per hour), or let’s just say $400.00. This assumes a low but legal pay rate and minimal overhead costs.
So, the cost of addressing the user-generated problem reports alone (not customer service, not payment issues, etc.) would average $12,000.00 per month, or $144K per year, if the site remains at the status quo, the user base doesn’t grow and there is never a need to add staff members. Of course, they won’t stay in business very long if they don’t grow, but for the sake of easy math, we’ll assume the above.
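If you’d like to sanity-check those figures, here is the same arithmetic as a small Python sketch. The $12-per-hour loaded labor rate and the 30-day month are my illustrative assumptions, chosen simply to reproduce the numbers above:

```python
# Rough daily/annual cost of hand-processing user-generated reports.
# Assumptions: 2 reports per minute per moderator, ~$12/hour loaded labor
# cost, a 30-day month -- illustrative figures only.

REPORTS_PER_DAY = 4_000
REPORTS_PER_MINUTE = 2          # per moderator, with good admin tools
HOURLY_COST = 12.00             # low but legal rate plus minimal overhead

labor_hours_per_day = REPORTS_PER_DAY / (REPORTS_PER_MINUTE * 60)   # ~33.3 h
daily_cost = labor_hours_per_day * HOURLY_COST                      # ~$400
monthly_cost = daily_cost * 30                                       # ~$12,000
yearly_cost = monthly_cost * 12                                      # ~$144,000

print(f"{labor_hours_per_day:.1f} labor hours/day, "
      f"${daily_cost:,.2f}/day, ${monthly_cost:,.0f}/month, ${yearly_cost:,.0f}/year")
```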
Why 'State-of-the-Art' Community Management Software is the Holy Grail
Let’s say we lease or purchase third-party community management software (or perhaps we build our own). What does this shiny new software do and what do I get for my money? Community management software processes and analyzes a site’s user-generated content (in this case, text), in real time, and based on algorithms and various rules, it organizes any suspect or flagged content/text into buckets. We’ll mark these buckets “junk,” “not serious,” “spam,” “harassment,” “grooming,” “phishing,” “scamming,” “vulgarity” and “false positives.” As the purchasing client, we determine the hierarchy and set the priority levels for each bucket. So, we’d likely put “junk” and “not serious” at the bottom of our list, and “grooming” and “phishing” at the top. The remaining buckets fall somewhere in between.
The software prioritizes the analyzed text and, in the case of forums or messages, flags it for our moderation team before most users would have a chance to witness and report the problem. With live chat, the software would likely flag the moderation team at about the same time a user might report the problem. The software also analyzes all the user-generated reports, looking for keywords and various strings and phrases, and prioritizes those reports into our buckets. It even recognizes duplicated reports. And it does all of the above in a fraction of the time of a highly experienced human moderator – and it does it more accurately, without emotion, fatigue or frustration.
The 90+% of non-actionable reports land in a “rainy day” bucket. The 3-10% of potentially actionable reports are flagged and prioritized so the moderation team can deal with them quickly and effectively. We just saved the company a nice sum in labor costs and likely caught serious problems that users wouldn’t report for fear of embarrassment, of being bullied, of being branded a tattletale – or, in many cases, that they just wouldn’t bother with. In fact, the player may have left and never come back, and we’d never know why. But now we know, now we can address the issue, and our good netizens know we can and will address problem behavior.
It’s a win-win. The site’s reputation with its users improves, and the problem users go away because they know they’re being monitored. The company saves money while becoming more efficient and effective. Then we ask the CEO for a nice 60% pay raise since we’re saving the company so much money. Piece of cake!
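No vendor publishes the internals of its platform, so purely as an illustration of the bucketing idea described above, here is a toy Python sketch: classify each piece of flagged text or each user report into a bucket, attach the client-defined priority, and surface the highest-priority items to the moderation team first. The bucket names, keyword rules and function names are all hypothetical.

```python
# Illustrative sketch of report/content triage into prioritized buckets.
# Real platforms use far more sophisticated analysis; the keywords, bucket
# names and priorities here are hypothetical, client-configurable values.
from dataclasses import dataclass, field
import heapq

BUCKET_PRIORITY = {            # lower number = handled first
    "grooming": 0, "phishing": 1, "scamming": 2, "harassment": 3,
    "vulgarity": 4, "spam": 5, "not_serious": 8, "junk": 9,
}

KEYWORDS = {                   # toy keyword rules standing in for real analysis
    "grooming":   ("how old are you", "don't tell your parents"),
    "phishing":   ("send me your password", "verify your account"),
    "harassment": ("everyone hates you", "you should quit forever"),
    "not_serious": ("lost my password", "change my avatar"),
}

@dataclass(order=True)
class Flagged:
    priority: int
    text: str = field(compare=False)
    bucket: str = field(compare=False)

def classify(text: str) -> Flagged:
    """Drop a piece of text into the first bucket whose keywords match."""
    lowered = text.lower()
    for bucket, phrases in KEYWORDS.items():
        if any(p in lowered for p in phrases):
            return Flagged(BUCKET_PRIORITY[bucket], text, bucket)
    return Flagged(BUCKET_PRIORITY["junk"], text, "junk")

# Moderators pop the queue and always see the most serious items first.
queue: list[Flagged] = []
for report in ("I lost my password!", "send me your password plz", "lol hi"):
    heapq.heappush(queue, classify(report))

while queue:
    item = heapq.heappop(queue)
    print(f"[{item.bucket}] {item.text}")
```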
You’ll have to do your own math to determine when it pays to spend money on a community management platform. We can only hope that additional third-party vendors will introduce competitive products to meet online gaming and community needs; that competition not only ensures the products consistently improve but also makes them more affordable.
Here’s a real-world example of the potential cost savings: at Cartoon Network, the virtual world FusionFall saw a 95% cost savings overnight when it implemented Crisp’s Community Management Platform. Automatic Behavior Management tools take community management software to the next level by generating auto-warns, auto-bans, auto-blocks, etc. for users who pass certain thresholds after X number of infractions (customizable and determined by the client). The system escalates from a basic warning to a permanent ban. GMs and Moderators will spend some time checking a percentage of the action records to be sure there are no false positives, but they no longer have to manually warn, ban or block a user from the game, or count user infractions with arcane tools or, worse, pencil and paper.
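Again, purely as an illustration (and not Crisp’s actual implementation), here is a small sketch of how an automatic escalation ladder might count infractions and apply warns, blocks and bans. The thresholds and penalty names are hypothetical, client-configurable values:

```python
# Sketch of an automatic behavior-management ladder. Thresholds and actions
# are hypothetical and would be configured per site; a percentage of actions
# would still be spot-checked by human moderators for false positives.
from collections import Counter

# infraction count threshold -> automatic action (checked in descending order)
ESCALATION = [
    (10, "permanent_ban"),
    (5,  "temporary_ban_72h"),
    (3,  "chat_block_24h"),
    (1,  "auto_warn"),
]

_infractions: Counter = Counter()

def record_infraction(user_id: str) -> str:
    """Record one confirmed infraction and return the automatic action taken."""
    _infractions[user_id] += 1
    count = _infractions[user_id]
    for threshold, action in ESCALATION:
        if count >= threshold:
            return action
    return "no_action"

for _ in range(6):
    print(record_infraction("player_42"))
# auto_warn, auto_warn, chat_block_24h, chat_block_24h, temporary_ban_72h, ...
```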
How Many Moderators Do I Need?
Habbo currently has approximately 220 outsourced Moderators to meet the needs of its 11 language versions of Habbo and 15+ million unique visitors per month. Club Penguin employs “about 300 Customer Service team members and in-game Moderators” to take care of 6 million unique visitors per month. MoshiMonsters.com has over 29 million registered users and employs a total of 15 part-time contractors and 8 full-time community staff members who cover normal game moderation, all aspects of content and customer service for membership and game play, and blog and forum moderation for approximately 2 million unique visitors per month (visitors to the blogs and forums, that is). The latest stats show that Moshi gains one new registered user every second.
One of the largest global gaming companies online (which has asked to remain anonymous) reports 50-100K concurrent users during peak hours and employs a total of 43 moderators (internal and outsourced). This company receives about 1.5 million unique visitors per month.
Tamara Littleton, CEO and founder of eModeration.com, a moderation and community management outsourcing company, works with several well-known global children’s gaming sites and says her company averages about 2-3 moderators per site during peak hours. One can’t help but notice that the anonymous company mentioned above, Moshi Monsters and a fair number of eModeration.com clients use community management software (CMS) and employ less than a quarter of the moderation and customer service staff of the gaming companies that don’t use such software.
To be fair to Habbo, micro-transactions as a business model will always cost more in customer service than subscriptions, and Habbo operates 11 language versions of its site. Also, according to summer 2010 comScore stats, the Habbo hotels collectively receive over twice the unique visitors Club Penguin receives and seven times the unique visitors the other top gaming companies receive.
The bottom line is that companies using Web 1.0 moderation methods – the method requiring staff members to sift through 90+% non-actionable reports to get to the valid, actionable ones – will always spend 70 to 90% more than necessary, assuming the company averages over 500K unique visitors per month.
The Moderation Wish List
The social gaming community professionals interviewed for this article all had their moderation wish lists ready. I have permission to share some of the names with you, so here goes:
Emma Monks of Sulake.com (Habbo) said she would like to find “a way to permanently exclude the minority of troublemakers so they don't end up generating the most work.” And an anonymous professional employed by one of the world’s largest online gaming companies shared a prudent observation. This professional was discussing the over-reaction to copyright issues and said “…There is no precedent for a user being sued for having the name ‘Luke Skywalker,’ so let's all relax and allow each other to use [the name]. If someone goes on to make money, fine, sue them.” The same person added another wish to their list regarding parent accountability. “I’d like to see a law putting more emphasis on parental control - more support for children would also be helpful. Parents should take more time in the monitoring and education of and participation with their children online. There's only so much any online game or website can do.”
Accountability for behavior online in gaming communities was a common theme among the 10 professionals interviewed. The Community Manager for a major worldwide entertainment company said they would like to see “some method of accountability, it is too easy to do whatever you want, whenever you want, to whomever you want with little comeback.”
We’ll Always Need Humans
As appealing and efficient as good technology is, the accountability discussion supports the fact that we will always need humans. Thankfully, a growing number of community professionals agree. Technology is essential…
No need to send me hate mail, GMs and Moderators. I’m in your corner, really I am. It’s just that we need to evolve or we’d all still be living in caves making animal sounds or drawing on walls for entertainment. As Tamara Littleton of eModeration.com said to me once, “We wouldn’t really be advancing in the industry if we didn’t develop technology to improve our line of work, would we?” That’s the spirit, Tamara!
In fact, with the introduction of community management tools, human professional GMs or Moderators are free to investigate and action potentially serious issues and free to build customer loyalty by engaging with and recognizing good customers. They can now spend their time with the contributors, creating and building the community. They can and should be actively involved in providing marketing and branding teams with customer feedback, gaming trends, and the like. Community people, GMs, Moderators – they are the POS people on the front line. They should have the time to focus on good customers who are good netizens but are often ignored because staff members are busy dealing with ill-mannered trolls, flamers, phishers, scammers… and other problem users who generate the majority of the trouble (and usually don’t spend a dime) on your site.
Human professional GMs and Moderators are also invaluable when it comes to evaluating the context of user-generated content. Perhaps it will be programmed someday, but for now we need humans for context and judgment, especially where emotional content is concerned. The CMS can make us more efficient and effective, but it can’t yet (if ever) replace the value of human judgment. For example, a CMS isn’t sophisticated enough to know that a gamer may be discussing how s/he was phished, scammed or harassed – they may simply be repeating the words used by someone else who attempted to scam or harass them. As of today, complete context can only be determined by human judgment. But who knows what the future of moderation and online gaming will bring?
If science fiction is any indication of what’s to come, our ‘potential’ behavior will be monitored by three floating “pre-cogs” and Tom Cruise will come crashing through our window to send us off to the land of the frozen. In the meantime, we can take advantage of technological advances to replace an antiquated system.
“There's nothing wrong with the system, it is perfect.”
“…perfect. I agree. But if there's a flaw, it's human. It always is.” – Minority Report
Rebecca Newton is the Chief Community & Safety Officer for Mind Candy Ltd. She also serves as the Lead Safety Advisor to CrispThinking.com. Rebecca is an internationally recognised expert in Child Safety and Online Community. She speaks about risk management, Social Networking, the web and Child Safety around the globe. TWITTER: RebeccaNewton LINKEDIN: linkedin.com/in/rebeccanewton