The growing popularity of Discord has made it easy for the platform to be used by predators to exploit its younger userbase.
Content note: This story contains discussions of children and teens being groomed, abducted, and sexually exploited.
Discord reportedly has a troubling history of predators using it as a means to sexually exploit young users. A new report from NBC News shows that over the last several years, the popular chat platform was cited in 35 prosecutions of adults who had communicated with minors on the platform prior to their convictions.
Such conduct is explicitly forbidden under Discord's rules and illegal nearly everywhere in the world. But as revealed by the National Center for Missing & Exploited Children (NCMEC), reports of child sexual abuse material (CSAM) on Discord jumped by 474 percent between 2021 and 2022.
As NBC notes, Discord isn't the first platform to grapple with online child exploitation. But its problems are compounded by how enmeshed it has become in video games, both through native apps on consoles and through developers moving their communities to the service.
There have reportedly been an additional 165 cases of adults prosecuted for sending or receiving CSAM via Discord. Of those prosecutions, 91 percent ended in convictions.
John Shehan, NCMEC's senior VP, acknowledged there has been "explosive growth" of CSAM on Discord. He called the problem "undeniable": "There is a child exploitation issue on the platform."
As further evidence, Shehan said NCMEC has "frequently" received reports from other online platforms in which users specifically point to Discord as a place for CSAM.
According to NBC's reporting, Discord's CSAM problem stems in part from the way it lets users of all ages mix together in community servers. Compounding that, there is no way to verify a user's self-reported age, a shortcoming common to online platforms overall.
By its own admission, Discord does not "monitor every server or every conversation. [...] When we are alerted to an issue, we investigate the behavior at issue and take action."
According to NCMEC, part of the reason for that growth is Discord's slowing response time to reports of such material. In 2021, responding to complaints took three days on average; in 2022, that figure grew to five. In other instances, Discord may have failed to respond to complaints at all.
That said, Discord's transparency report for 2022 shows it disabled over 37,000 accounts on the basis of child safety violations. And when it does work with law enforcement on complaints of CSAM, it reportedly provides material such as IP addresses, message logs, and account names.
Going forward, Discord said it plans to invest in "age assurance technologies" to diminish the amount of exploitative material on the platform. But in the eyes of watchdog groups and law enforcement agencies, the platform should have sorted out these problems long ago.
Denton Howard, executive director of Inhope, a hotline organization for exploited and missing children, put it plainly: "safety by design should be included from Day One. Not from Day 100."
NBC News' full report can be read here. The piece covers additional topics such as how adults lure young Discord users to their servers, other organizations that are trying to diminish the amount of CSAM in the world, and Discord's relationships with those various groups.
Game Developer has reached out to Discord for a comment on NBC News' report, and will update when a response is given.