Big Tech’s Favorite Legal Shield Takes a Hit

A Los Angeles judge found that Meta, Snap and TikTok can't wield Section 230 to escape claims. Similar lawsuits against tobacco and opioid manufacturers have led to bankruptcies and billion-dollar payouts.

A Los Angeles judge has declined to dismiss a series of blockbuster lawsuits against Meta, TikTok, Snap and Google arguing that their platforms are intentionally designed to addict teenagers and fuel mental health disorders, increasing the likelihood that the companies will have to face trial on the product liability claims or settle them for billions of dollars.

In the first order advancing litigation raising a novel public nuisance theory from hundreds of government officials and parents of minors, Los Angeles Superior Court Judge Carolyn Kuhl on Friday found that the companies can’t wield Section 230 — Big Tech’s favorite legal shield — to escape some claims in the case. She pointed to allegations that “the design features of the platforms — and not the specific content viewed” by users caused their injuries.


Thousands of plaintiffs across the country have sued social media companies, arguing their platforms are essentially defective products that lead to eating disorders, anxiety and suicide, among other mental health injuries. The lawsuits could lead to multibillion-dollar payouts, with similar public nuisance suits brought by government officials against opioid and tobacco manufacturers having resulted in massive settlements. By steering clear of claims centering on the specific content that the companies host, plaintiffs are trying to sidestep potential immunity under Section 230, which has historically afforded tech firms significant legal protection from liability as publishers of third-party content.

Advancing a claim for negligence, Kuhl found that the defendants can’t invoke the law to dismiss allegations revolving around allegedly defective design features since those allegations don’t concern third-party content. She pointed to a federal appeals court ruling last year that undercut application of Section 230 by concluding that Snap could be held liable in a lawsuit claiming that the company’s design of a speedometer feature contributed to a fatal crash by encouraging speeding.

“The features themselves allegedly operate to addict and harm minor users of the platforms regardless of the particular third-party content viewed by the minor user,” stated the ruling, which cited TikTok’s continuous scrolling feature and the inability to disable autoplay.

Other potentially problematic product features include lenses and filters, which have allegedly promoted body image issues among teenagers, and a lack of parental controls that allegedly encourages minors to create secret accounts to mask their usage. Kuhl said that Section 230 “does not provide immunity” when a “provider manipulates third party content in a manner that injures a user.”

According to the order, a jury will decide whether users’ addiction to the platforms was caused by third-party content or by the apps’ design features.

Additionally, Kuhl declined to dismiss allegations that Meta may have fraudulently concealed internal research demonstrating the negative impact Instagram can have on minors’ mental health — including data showing that “high time spent users” are disproportionately young and reports that teenagers cite Instagram as a source of increased anxiety and depression. Parents argued that they wouldn’t have let their children use the platform had they known.

“Meta is not protected from tort liability for its own failure to warn because these adverse effects that allegedly should have been disclosed result from Meta’s own conduct, not from any particular content displayed,” wrote Kuhl, who noted that Meta may have had a duty to warn of potential harms as a creator of features designed to maximize engagement for minors.

In a loss for plaintiffs, product liability claims were dismissed since such claims are typically reserved for “tangible products” that are mass manufactured and marketed.

Under the judge’s ruling, claims for strict liability, design negligence and negligent undertaking, among others, were dismissed. Plaintiffs were given the chance to amend their allegations.

In a statement, a Google spokesperson said the “allegations in these complaints are simply not true.” He added, “Protecting kids across our platforms has always been core to our work. In collaboration with child development specialists, we have built age-appropriate experiences for kids and families on YouTube, and provide parents with robust controls.”

TikTok declined to comment. Meta and Snap didn’t respond to requests for comment.

Plaintiffs’ attorney Brian Panish said in a statement that “this decision is an important step forward for the thousands of families we represent whose children have been permanently afflicted with debilitating mental health issues thanks to these social media giants.” He stressed that “tech companies like Meta, Snap Inc., ByteDance, and Google are immensely powerful and have little in the way of industry-specific regulation to keep them in check.”