Meta, TikTok and Snap were each hit with a new lawsuit accusing them of fueling mental health disorders in teenage users. The plaintiffs are among a wave of parents and their children who are taking social media platforms to court, arguing that the companies not only hook users but do so knowing the harms they pose.
The lawsuits — the latest in a string of cases linking social media to mental health problems in minors — assert product liability claims to get around Section 230 of the Communications Decency Act, a federal law shielding tech companies from liability arising from content produced by third parties. They advance a theory arguing that platforms like Facebook are essentially defective products that lead to injuries, including eating disorders, anxiety and suicide. At least 20 such lawsuits have been filed across the country citing the Facebook Papers, a trove of internal company documents leaked by whistleblower Frances Haugen last year, with dozens more expected to come.
“This is the business model utilized by all Defendants — engagement and growth over user safety — as evidenced by the inherently dangerous design and operation of their social media products,” states one of the complaints filed on Thursday in Los Angeles Superior Court. “At any point any of these Defendants could have come forward and shared this information with the public, but they knew that doing so would have given their competitors an advantage and/or would have meant wholesale changes to their products and trajectory. Defendants chose to continue causing harm and concealed the truth instead.”
Plaintiffs take aim at the platforms’ product features. They allege that the companies’ algorithms amplify dangerous content that prioritizes engagement over safety.
By steering clear of claims centering on the specific content that the platforms host, they sidestep potential immunity flowing from Section 230. The law has historically afforded tech companies significant protection from liability as publishers of third-party content. A major ruling concerning the law was delivered last year, when a federal appeals court found that Snap can’t invoke Section 230 to protect itself from a lawsuit claiming that the company’s design of a speedometer feature contributed to a fatal crash by encouraging speeding. “Plaintiff’s claims do not arise from third party content, but rather, Defendants’ product features and designs, including but not limited to algorithms and other product features that addict minor users, amplify and promote harmful social comparison, [and] affirmatively select and promote harmful content to vulnerable users based on their individualized demographic data and social media activity,” the complaint states.
The lawsuits claim that the lack of parental controls is a feature, not a bug, of the platforms. The minimum age to join TikTok is 13, yet the company reported in 2020 that over a third of its 49 million daily users were 14 or younger, according to the lawsuit. The plaintiffs allege that the platforms intentionally decline to verify users’ ages or the authenticity of email accounts.
Another allegation is that the platforms know teens are opening multiple accounts in violation of their terms of service but allow them to do so to drive growth. Snapchat’s failure to enforce its one-account rule promoted bullying among users, the lawsuit says.
“Each of Defendant’s products are designed in a manner intended to and that do prevent parents from exercising their right to protect and oversee the health and welfare of their child,” the complaint states. “Defendants’ products are meant to enable children to evade parental controls.”
The complaints allege strict liability, negligence, unjust enrichment and invasion of privacy. One of the plaintiffs, the mother of a child who committed suicide, is also pursuing a claim for intentional infliction of emotional distress against TikTok.
The lawsuits were filed by the Social Media Victims Law Center, which is representing numerous other plaintiffs in identical suits across the country.
Facebook, TikTok and Snap didn’t immediately respond to requests for comment.