San Mateo-based Roblox and Discord sued over girl’s sexual, financial exploitation



By Joel Rosenblatt and Cecilia D’Anastasio | Bloomberg

Roblox Corp. and Discord Inc. are the latest targets in a wave of lawsuits over social media addiction, in a case alleging an 11-year-old girl was exploited by a sexual predator while playing online video games.

The girl became acquainted with adult men through Roblox's and Discord's direct messaging services, which she thought had safeguards protecting her, according to a statement by her lawyers at the Seattle-based Social Media Victims Law Center, which has brought numerous other addiction cases.

Wednesday’s complaint in state court in San Francisco also blames Snap Inc. and Meta Platforms Inc. for the girl’s mental-health difficulties and suicide attempts.

“These men sexually and financially exploited her,” the group said. “They also introduced her to the social media platforms Instagram and Snapchat, to which she became addicted.”

A Discord spokesperson declined to comment on pending litigation, but said the company “has a zero-tolerance policy for anyone who endangers or sexualizes children.”

“We work relentlessly to keep this activity off our service and take immediate action when we become aware of it,” the company said in a statement, adding that it uses a technology called PhotoDNA to find and remove images of child exploitation and engages with government authorities where appropriate.

Meta declined to comment on the suit. Roblox and Snap didn’t immediately respond to requests for comment.

Meta and Snap have previously said they’re working to protect their youngest users, including by offering resources on mental health topics and improving safeguards to stop the spread of harmful content.

More than 80 lawsuits have been filed this year against Meta, Snap, ByteDance Inc.’s TikTok, and Alphabet Inc.’s Google centering on claims from adolescents and young adults that they’ve suffered anxiety, depression, eating disorders, and sleeplessness after getting hooked on social media. In at least seven cases, the plaintiffs are the parents of children who’ve died by suicide.

Discord is a gaming chat app with 150 million monthly active users. Popular with young people, Discord was long known as a sort of wild-west space online, though the company has beefed up its moderation efforts over the past two years. In 2022, at least a half-dozen cases involving child sex abuse material or the grooming of children cited Discord, according to a Bloomberg News search of Justice Department records.

Roblox is a gaming platform with over 203 million monthly active users, many of whom are children. Young players have been introduced to extremists on the platform, who may then move conversations to other services such as Discord or Skype. Roblox has robust moderation efforts, which include scanning text chat for inappropriate words as well as reviewing every virtual image uploaded to the game.

The girl in the lawsuit, who’s identified only by the initials S.U., and her family are seeking to hold the social media companies financially responsible for the harms they allegedly caused. The family also wants a court order directing the platforms to make their products safer, which the Social Media Victims Law Center said can be done through existing technologies and at minimal time and expense for the companies.

S.U. said that soon after she got an iPad for Christmas at age 10, a man named Charles “befriended” her on Roblox and encouraged her to drink alcohol and take prescription drugs.

Later, encouraged by the men she met on Roblox and Discord, S.U. opened Instagram and Snapchat accounts, initially hiding them from her mother, according to the complaint.

While she wasn’t yet 13 — the minimum age for accounts on Instagram and Snap under their terms of service — S.U. became addicted to the platforms to the point that she would sneak online in the middle of the night, leading her to become sleep-deprived, according to the complaint.


