Roblox and Discord are the latest targets in a wave of lawsuits over social media addiction, in a case alleging an 11-year-old girl was exploited by a sexual predator while playing online video games.
The girl became acquainted with adult men through Roblox and Discord’s direct messaging services, which she believed had safeguards protecting her, according to a statement by her lawyers at the Seattle-based Social Media Victims Law Center, which has brought numerous other addiction cases.
Wednesday’s complaint in state court in San Francisco also blames Snap and Meta for the girl’s mental health difficulties and suicide attempts.
“These men sexually and financially exploited her,” the group said. “They also introduced her to the social media platforms Instagram and Snapchat, to which she became addicted.”
Meta declined to comment on the suit. Roblox, Discord and Snap didn’t immediately respond to requests for comment.
Meta and Snap have previously said they’re working to protect their youngest users, including by offering resources on mental health topics and improving safeguards to stop the spread of harmful content.
More than 80 lawsuits have been filed this year against Meta, Snap, ByteDance Inc.’s TikTok, and Google centering on claims from adolescents and young adults that they’ve suffered anxiety, depression, eating disorders and sleeplessness after getting hooked on social media. In at least seven cases, the plaintiffs are the parents of children who’ve died by suicide.
Discord, a gaming chat app with 150 million monthly active users that is popular with young people, was once known as a sort of wild-west space online, though the company has beefed up its moderation efforts over the past two years. In 2022, at least a half-dozen cases involving child sex abuse material or the grooming of children cited Discord, according to a Bloomberg search of Justice Department records.
Roblox is a gaming platform with more than 203 million monthly active users, many of whom are children. Young players have been introduced to extremists on the platform, who may move conversations to other services such as Discord or Skype. Roblox has robust moderation efforts, which include scanning text chat for inappropriate words and reviewing every virtual image uploaded to the platform.
The girl in the lawsuit, who’s identified only by the initials S.U., and her family are seeking to hold the social media companies financially responsible for the harms they allegedly caused. The family also wants a court order directing the platforms to make their products safer, which the Social Media Victims Law Center said can be done through existing technologies and at minimal time and expense for the companies.
S.U. said soon after she got an iPad for Christmas at age 10, a man named Charles “befriended” her on Roblox and encouraged her to drink alcohol and take prescription drugs.
Later, encouraged by the men she met on Roblox and Discord, S.U. opened Instagram and Snapchat accounts, initially hiding them from her mother, according to the complaint.
While she wasn’t yet 13 — the minimum age for accounts on Instagram and Snapchat under their terms of service — S.U. became so addicted to the platforms that she would sneak online in the middle of the night, leaving her sleep-deprived, according to the complaint.
In 2020, S.U. says, she fell victim to a Roblox user named Matthew, a 22-year-old from Missouri who convinced her to send sexually explicit images, which he allegedly sold online.
S.U. relied heavily on Snapchat’s “My Eyes Only” feature to hide what was happening from her mother, who continued to monitor S.U.’s social media use but didn’t know about My Eyes Only, according to the complaint.
She tried to take her own life in July 2020 and again in August 2020, and her parents went more than $10,000 into debt in 2021 from expenses related to her mental health crises, according to the complaint.
The case is C.U. and S.U. v. Meta Platforms Inc., California Superior Court, San Francisco County.