Judge Rules Social Media Companies Must Face Child Addiction Lawsuits


A judge has ruled that social media companies — including Meta, Google, Snap, and ByteDance — must face the hundreds of lawsuits blaming them for children’s addiction to their platforms.

On Tuesday, U.S. District Judge Yvonne Gonzalez Rogers rejected social media giants’ motion to dismiss the numerous lawsuits accusing the companies of illegally enticing and then addicting millions of children to their platforms.

According to Reuters, Judge Rogers ruled that Google (which owns YouTube), Meta (which runs Facebook and Instagram), ByteDance (which owns TikTok), and Snap (which operates Snapchat) must face the lawsuits filed against them over the last few years.

The decision covers hundreds of lawsuits filed on behalf of individual children who allegedly suffered negative physical, mental, and emotional health effects from social media use, including anxiety, depression, and, in some cases, suicide.

Last month, 42 states sued Meta for allegedly harming young people’s mental health by knowingly designing features on Instagram and Facebook to hook children to its platforms.

Meanwhile, numerous school districts across the U.S. have filed lawsuits against social media companies in recent months, also accusing them of harming young people’s mental health.

‘A Significant Victory’

“Today’s decision is a significant victory for the families that have been harmed by the dangers of social media,” the plaintiffs’ lead lawyers — Lexi Hazam, Previn Warren, and Chris Seeger — say in a joint statement regarding the ruling.

In her ruling, Judge Rogers states that the First Amendment and Section 230, which says online platforms shouldn’t be treated as the publishers of third-party content, do not shield Facebook, Instagram, YouTube, TikTok, and Snapchat from all liability in this case.

According to The Verge, Judge Rogers noted that many of the claims laid out by the plaintiffs don’t “constitute free speech or expression,” as they have to do with alleged “defects” on the platforms themselves.

That includes having insufficient parental controls, no “robust” age verification systems, and a difficult account deletion process.

“Addressing these defects would not require that defendants change how or what speech they disseminate,” Judge Rogers writes.

“For example, parental notifications could plausibly empower parents to limit their children’s access to the platform or discuss platform use with them.”

A Google spokesperson tells The Verge that the allegations in these complaints are “simply not true,” adding the company has “built age-appropriate experiences for kids and families on YouTube, and provide parents with robust controls.”

A TikTok spokesperson similarly tells Reuters it has “robust safety policies and parental controls,” while Snap and Meta did not respond to the news outlet’s request for comment.


Image credits: Header photo licensed via Depositphotos.
