When Kate’s 13-year-old son took up Minecraft and Fortnite, she did not worry. He played in a room where she could keep an eye on him.
But about six weeks later, Kate saw something appalling pop up on the screen: a video of bestiality involving a young boy. Horrified, she scrolled through her son’s account on Discord, a platform where gamers can chat while playing. The conversations were filled with graphic language and imagery of sexual acts posted by others, she said.
Her son broke into tears when she questioned him last month.
“I think it’s a huge weight off them for somebody to step in and say, ‘Actually this is child abuse, and you’re being abused, and you’re a victim here,’” said Kate, who asked not to be identified by her full name to protect her family’s privacy.
Sexual predators have found an easy access point into the lives of young people: They are meeting them online through multiplayer video games and chat apps, making virtual connections right in their victims’ homes.
Criminals strike up a conversation and gradually build trust. Often they pose as children. Their goal, typically, is to dupe children into sharing sexually explicit photos and videos of themselves — which they use as blackmail for more imagery, much of it increasingly graphic and violent.
Reports of abuse are emerging with unprecedented frequency around the country, with some perpetrators grooming hundreds and even thousands of victims, according to a review of prosecutions, court records, law enforcement reports and academic studies.
The New York Times reported earlier this year that the tech industry had made only tepid efforts to combat an explosion of child sexual abuse imagery on the internet. The Times has also found that the troubled response extends to the online gaming and chat worlds.
There are tools to detect previously identified abuse content, but scanning for new images — like those extorted in real time from young gamers — is more difficult. While a handful of products have detection systems in place, there is little incentive under the law to tackle the problem, as companies are largely not held responsible for illegal content posted on their websites.
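The gap described above comes down to how these tools work: they compare each uploaded image against a database of fingerprints ("hashes") of previously identified material, so a re-shared copy of a known image is flagged while a newly produced one matches nothing. A minimal sketch of the idea, using a simple exact-match cryptographic hash (production systems such as Microsoft's PhotoDNA use perceptual hashes, which also tolerate resizing and re-encoding; the byte strings below are placeholders):

```python
import hashlib

# Hypothetical stand-in for a database of hashes of previously
# identified abuse imagery maintained by a clearinghouse.
known_bad_hashes = {
    hashlib.sha256(b"previously-identified-image-bytes").hexdigest(),
}

def is_known_image(image_bytes: bytes) -> bool:
    """Return True only if these bytes match a previously identified image."""
    return hashlib.sha256(image_bytes).hexdigest() in known_bad_hashes

# A re-shared copy of a known image is caught...
print(is_known_image(b"previously-identified-image-bytes"))  # True

# ...but a brand-new image, like one extorted in real time from a
# young gamer, has no entry in any database and slips through.
print(is_known_image(b"brand-new-image-bytes"))  # False
```

Catching new imagery would instead require analyzing the content or the conversation itself as it happens, which is the harder and more privacy-sensitive problem the companies cited here are weighing.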
Six years ago, a little more than 50 reports of the crimes, commonly known as “sextortion,” were referred to the federally designated clearinghouse in suburban Washington that tracks online child sexual abuse. Last year, the center received more than 1,500. And authorities believe that the vast majority of sextortion cases are never reported.
There has been some success in catching perpetrators. In May, a California man was sentenced to 14 years in prison for coercing an 11-year-old girl “into producing child pornography” after meeting her through the online game Clash of Clans. An Illinois man received a 15-year sentence in 2017 after threatening to rape two boys in Massachusetts — adding that he would kill one of them — whom he had met over Xbox Live.
“The first threat is, ‘If you don’t do it, I’m going to post on social media, and by the way, I’ve got a list of your family members, and I’m going to send it all to them,’” said Matt Wright, a special agent with the Department of Homeland Security. “If they don’t send another picture, they’ll say: ‘Here’s your address — I know where you live. I’m going to come kill your family.’”
The trauma can be overwhelming for the young victims. An FBI study reviewing a sample of sextortion cases found that more than one-quarter of them led to suicide or attempted suicide.
It makes sense that many predators would gravitate to the gaming world. Almost every teenage boy in America — 97% — plays video games, while about 83% of girls do, according to the Pew Research Center.
In many states, gaming counts as a team sport. Colleges offer scholarships to elite gamers, and cities are racing to establish professional teams. The industry is enormously profitable, generating over $43 billion in revenue last year in the United States.
There are many ways for gamers to meet online. They can use built-in chat features on consoles like Xbox and services like Steam or connect on sites like Discord and Twitch. The games have become extremely social, and developing relationships with strangers on them is normal.
“These virtual spaces are essentially hunting grounds,” said Mary Anne Franks, a professor at the University of Miami School of Law and president of the Cyber Civil Rights Initiative, a nonprofit group dedicated to combating online abuse.
This fall, the FBI rolled out an awareness campaign in middle and high schools to encourage children to seek help when caught in an exploitive sexual situation. “Even if you accepted money or a game credit or something else, you are not the one who is in trouble,” material from the campaign explains.
Catching the Predators
New Jersey police departments were flooded with phone calls from parents and teachers alarmed about pedophiles lurking on game sites and in chat rooms. So law enforcement officials from across the state took over a building near the Jersey Shore last year and started chatting under assumed identities as children.
In less than a week, they arrested 24 people.
Authorities did it again, this time in Bergen County, a suburb close to New York City. They made 17 arrests. And they did it once more, in Somerset County, arresting 19. One defendant was sentenced to prison, while the other cases are still being prosecuted.
After the stings, officials hoped to uncover a pattern that could help in future investigations. But they found none; those arrested came from all walks of life.
When announcing the arrests, authorities highlighted Fortnite, Minecraft and Roblox as platforms where offenders began conversations before moving to chat apps. Nearly all those arrested had made arrangements to meet in person.
In a separate case in Ohio, the digital abuse of a young boy led to his physical abuse. The offender, Jason Gmoser, would encourage boys to show their genitals while on PlayStation, according to court records. Gmoser told detectives in 2014 that he spent years interacting with an 8-year-old, traveling to Missouri to visit the boy and his family, showering them with gifts and paying some of their bills. On at least one trip, he said, he sexually abused the child. He is now serving a life sentence in a separate case, for running a child sexual abuse site on the dark web.
In 2018, three men from across the country were convicted of running a yearslong sextortion ring that lured hundreds of children from social and video streaming platforms.
In another case, a girl attending high school in Tennessee thought she had made a new female friend on Kik Messenger. They chatted for six months. After the teenager shared a partially nude photo of herself, the “friend” became threatening and demanded that she record herself performing explicit acts. The girl told her mother, who called police.
The photo was never shared publicly, but she said in an interview with The Times that she was haunted by the experience years later.
“I thought for a long time that there was something wrong with me or that I was a bad person,” she said. “Now that I’ve gotten to college, I’ll talk to my friends about it, and there have been so many girls who have said, ‘That exact same thing happened to me.’”
An Industry Without Answers
There are a few seemingly simple protections against online predators, but logistics, gaming culture and financial concerns present obstacles.
Companies could require identification and parental approvals to ensure that children play only with people their own age. But many gamers have resisted giving up anonymity.
Microsoft, which owns Xbox and the popular game Minecraft, said it planned to release free software early next year that could recognize some forms of grooming and sextortion.
Marc-Antoine Durand, chief operating officer of Yubo, a video chat app based in France that is popular among teenagers, said it watches for grooming behavior with software from Two Hat Security, a Canadian firm. Yubo also disables accounts when it finds an age discrepancy, often requiring users to provide a government-issued ID. But users frequently object to providing documentation, and many children do not possess it.
A Facebook spokeswoman said it had made a variety of efforts to separate adults from children, including limiting how adults can message and connect with them. The Times was able to find minors by looking at users’ lists of friends and activities on pages popular with children.
Instagram, owned by Facebook, allowed users to send private messages to anyone, and a Times reporter was able to contact and video chat with a 13-year-old girl who had a private account (the girl and her parents gave permission to conduct the test). After The Times asked about its messaging policy, Instagram announced new features Wednesday that allow users to block messages from people they do not follow. The company said it would also require users to enter their age when signing up.
Other companies have taken a more hands-off approach, citing privacy concerns.
Discord said it scanned shared images for known illegal material and that moderators reviewed chats considered a high risk. The company does not, however, automatically monitor conversations for grooming, suggesting it would be unreliable and a privacy violation.
Some of the biggest gaming companies provided few, if any, details about their practices. Epic Games, creator of Fortnite, which has roughly 250 million users, did not respond to multiple messages seeking comment.
Sony, the maker of PlayStation, which had nearly 100 million monthly active users earlier this year, pointed to its tutorials on parental controls and tools that let users report abusive behavior.
The solution many game developers and online safety experts return to is that parents need to know what their children are playing and that children need to know what tools are available to them. Sometimes that means blocking users and shutting off chat functions, and sometimes it means monitoring the games as they are being played.
But Kristy Custer, principal at Complete High School Maize in Kansas, who has helped design the curriculum used by many high school esports teams, said parents should react carefully when their children report encounters with online predators. Punishing children — no more video games or social media, for example — could backfire by pushing them into even more dangerous places for their online activity.
“You just did exactly what that predator wanted them to do — and drove them into the darker space,” she said.