Reporting results from its congressionally mandated study on children and online virtual worlds, the U.S. Federal Trade Commission says it found objectionable material in 70 percent of the virtual worlds and MMOs it investigated.
Reporting results from its congressionally mandated study on children's access to explicit content in virtual worlds, the Federal Trade Commission says it found objectionable material in 70 percent of the online worlds it investigated.

Congress tasked the FTC with examining the matter in May of this year, prompted by concerns over minors easily accessing explicit content in virtual worlds. As part of the study, the commission scrutinized 27 popular virtual worlds and MMOs (e.g. Second Life, YoVille, Neopets, Habbo, Dofus, Runescape) as a cross section of worlds targeting children, teens, and adults.

Researchers for the agency registered in each world as adults, teens, and children, recorded their actions and the content they came across in each virtual environment, then rated each world's level of explicit content (based on factors measured against a subjective metric established by the FTC) as heavy, moderate, or low.

The commission says it found at least one instance of sexually or violently explicit content in 19 of the 27 virtual worlds it investigated. Five of those exhibited a heavy amount of explicit content, four presented a moderate amount, and ten displayed a low amount.

Of the 14 child-oriented worlds observed, seven contained no explicit content, six demonstrated a low amount, and only one showed a heavy amount. Notably, almost all of the explicit content found in the child-oriented worlds was accessible only when the researchers were registered as teens or adults, not as children. Most of the offending material in those instances was text-based, found in chat rooms, message boards, and discussion forums.

Among the teen- and adult-targeted worlds, 12 of the 13 studied contained explicit content -- five with a heavy amount, three with a moderate amount, and four with a small amount. Half of the explicit content in the teen/adult virtual worlds was text-based, with still graphics, moving graphics, and audio making up the other half.

The agency noted that most of the eight teen- and adult-oriented online worlds containing at least a moderate amount of explicit content featured age-screening mechanisms to keep minors out, and five prevented immediate attempts to re-register from the same computer with an older age after the system had rejected the minor moments prior. Three worlds also featured adult-only sections segregating younger users from age-inappropriate content.

Those eight teen/adult virtual worlds with moderate to heavy explicit material were also reviewed for conduct standards. The commission found that while most of the worlds restricted certain kinds of sexual, threatening, or abusive content, they used vague terms that provided inadequate guidance to users about what specifically was prohibited. The FTC comments that these rules of conduct are "insufficient to stem the creation of or exposure to explicit material."

The commission also said that while a portion of virtual worlds use language filters to prevent objectionable language from appearing in text communications, their success is limited. Of the three teen- and adult-oriented online environments with these filters in place, the commission found a heavy amount of explicit text in one, a moderate amount in one, and a low amount in the remaining world.
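To make the age-screening finding more concrete for developers, here is a minimal TypeScript sketch of the kind of gate the report describes: it rejects registrations below a cutoff age and remembers the rejecting device for a short window, so an immediate retry from the same computer with an older birth date also fails. The cutoff age, lockout duration, and device identifier are illustrative assumptions, not details taken from the FTC study.

```typescript
// Illustrative sketch (not from the FTC report): an age-screening gate that
// refuses registration below a minimum age and, as five of the studied worlds
// reportedly did, blocks an immediate retry from the same device with an
// older birth date.

type GateResult = { allowed: boolean; reason?: string };

const MIN_AGE = 13;                      // hypothetical cutoff for a teen world
const LOCKOUT_MS = 24 * 60 * 60 * 1000;  // hypothetical 24-hour retry lockout

// Devices (e.g. identified by a persistent cookie or fingerprint) that recently
// failed the age check, mapped to the time of the failed attempt.
const rejectedDevices = new Map<string, number>();

function ageFromBirthDate(birthDate: Date, now: Date = new Date()): number {
  const age = now.getFullYear() - birthDate.getFullYear();
  const hadBirthdayThisYear =
    now.getMonth() > birthDate.getMonth() ||
    (now.getMonth() === birthDate.getMonth() && now.getDate() >= birthDate.getDate());
  return hadBirthdayThisYear ? age : age - 1;
}

function screenRegistration(deviceId: string, birthDate: Date): GateResult {
  const now = Date.now();

  // If this device was rejected recently, refuse even an "older" birth date.
  const rejectedAt = rejectedDevices.get(deviceId);
  if (rejectedAt !== undefined && now - rejectedAt < LOCKOUT_MS) {
    return { allowed: false, reason: "recent underage attempt from this device" };
  }

  if (ageFromBirthDate(birthDate) < MIN_AGE) {
    rejectedDevices.set(deviceId, now); // remember the failure for the lockout window
    return { allowed: false, reason: "below minimum age" };
  }

  return { allowed: true };
}

// A rejected minor who immediately retries with an adult birth date is still
// turned away, because the device is inside the lockout window.
console.log(screenRegistration("device-42", new Date("2015-06-01"))); // rejected: underage
console.log(screenRegistration("device-42", new Date("1990-06-01"))); // rejected: lockout
```

In a real deployment the lockout state would live in a persistent cookie or server-side store rather than an in-memory map, but the flow mirrors the behavior the report credits to five of the worlds it studied.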
After presenting these findings in its full report [29MB PDF], the FTC offered several recommendations for virtual world operators to reduce the risk of youth exposure to explicit content, such as ensuring that age-screening mechanisms do not encourage underage registration, implementing or strengthening age-segregation techniques so that minors and adults interact only with their peers and view only age-appropriate content, and re-examining the strength of language filters. It also suggested employing specially trained moderators to take action against conduct violations, and providing greater guidance to community enforcers so that they can better review, rate, and report content, as well as report users who violate the world's terms of behavior. Recognizing that the responsibility of preventing children from accessing explicit content in these online communities doesn't lie only with virtual world operators, the commission recommends that parents and children alike "become better educated about the benefits and risks of youth participation in online virtual worlds".
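As a rough illustration of why the commission urges operators to re-examine filter strength, the hedged TypeScript sketch below shows a simple blocklist filter with light normalization for spacing and leetspeak; the placeholder terms and substitution table are invented for this example and do not come from the report or any specific world.

```typescript
// Illustrative sketch only: naive exact-match blocklists are easy to evade,
// which is one reason chat filters tend to have limited success. This
// hypothetical filter normalizes common evasions before matching.

const BLOCKLIST = new Set(["badword", "slur"]); // placeholder terms, not real data

const LEET_MAP: Record<string, string> = {
  "0": "o", "1": "i", "3": "e", "4": "a", "5": "s", "7": "t", "@": "a", "$": "s",
};

function normalizeToken(token: string): string {
  return token
    .toLowerCase()
    .split("")
    .map((ch) => LEET_MAP[ch] ?? ch)  // undo simple leetspeak substitutions
    .join("")
    .replace(/[^a-z]/g, "")           // drop punctuation used as padding
    .replace(/(.)\1+/g, "$1");        // collapse repeated letters ("baaad" -> "bad")
}

function containsBlockedTerm(message: string): boolean {
  // Check tokens individually and the whole message with spaces removed,
  // so "b a d w o r d" is still caught.
  const tokens = message.split(/\s+/).map(normalizeToken);
  const joined = normalizeToken(message.replace(/\s+/g, ""));
  return tokens.some((t) => BLOCKLIST.has(t)) ||
         Array.from(BLOCKLIST).some((term) => joined.includes(term));
}

console.log(containsBlockedTerm("badword"));        // true
console.log(containsBlockedTerm("b a d w o r d"));  // true
console.log(containsBlockedTerm("b4dw0rd"));        // true
console.log(containsBlockedTerm("hello there"));    // false
```

Even with this normalization, misspellings, implied phrasing, and non-English evasions would still slip through, which is consistent with the limited success the commission observed in the filtered worlds it reviewed.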