The Section 230 shield just got a hole punched through it.

A Los Angeles jury found Meta and Google liable Wednesday for building addictive products that caused a young woman’s mental health problems. According to NPR, the jurors ordered the companies to pay $3 million in compensatory damages and an additional $3 million in punitive damages — $6 million total, with Meta responsible for 70% and Google for 30%.

The verdict doesn’t meaningfully hurt two companies that make billions each quarter. But it creates a legal blueprint for sidestepping the federal law that has protected online platforms from liability for nearly three decades.

Design, Not Content

For years, tech companies have hidden behind Section 230 of the Communications Decency Act, a 1996 law stating that platforms aren’t responsible for user-generated content. Someone posts something harmful? The platform is just the messenger.

The Los Angeles legal team took a different approach. They didn’t sue over what users posted. They sued over how the platforms were built.

Their argument: features like infinite scroll, autoplay, algorithmic recommendations, and push notifications aren’t content — they’re product design. And defective product design has always been actionable in court.

“How do you make a child never put down the phone? That’s called the engineering of addiction,” said Mark Lanier, one of the plaintiff’s attorneys, during the trial.

Judge Carolyn Kuhl agreed that design features were fair game. The jury found both companies negligent in how they built their apps and in failing to warn users about potential harms.

The Plaintiff

The case centered on Kaley, a 20-year-old from Chico, California, identified in court documents by her initials KGM. She testified that she started using YouTube at age 6 and Instagram at 11.

She described running to the school bathroom to check her likes, comparing herself obsessively to filtered images, spiraling into depression, anxiety, and body dysmorphia. She said she craved social media validation so intensely that she couldn’t concentrate in school.

The jury wasn’t asked to decide whether social media caused all of Kaley’s problems — only whether her compulsive use was a “substantial factor” in her struggles and whether defective platform design directly caused her distress. They said yes to both.

Meta and Google fought back hard. They emphasized Kaley’s difficult childhood, including emotional and physical abuse documented in medical records. They noted that her own therapists had rarely identified social media as a factor in her treatment. Instagram head Adam Mosseri declined to acknowledge that Kaley had been “addicted” to the platform, suggesting her usage was merely “problematic.”

Internal Documents

But internal documents shown at trial gave jurors a glimpse into how these companies think about young users.

One Meta document stated plainly: “If we wanna win big with teens, we must bring them in as tweens.” Another showed that 11-year-olds were four times more likely to return to Instagram than users of competing apps — despite the platform’s stated minimum age of 13.

According to Ars Technica, Meta employees discussed the addictive nature of their products in strikingly blunt terms. “Teens can’t switch off from Instagram even if they want to,” one internal message said. Another employee declared: “Oh my gosh y’all IG is a drug,” likening social media platforms to “pushers.”

Mark Zuckerberg testified in February. When asked about lifting a temporary ban on beauty filters that some employees warned could harm teenage girls, he said the evidence wasn’t clear enough to justify limiting user expression. “If people feel like they’re not having a good experience, why would they keep using the product?” he asked the jury.

What Comes Next

This was a bellwether trial — a test case tied to roughly 2,000 similar lawsuits consolidated in California courts. Another bellwether case involving a plaintiff identified as RKC is scheduled for this summer. A separate multi-district litigation in Oakland incorporates thousands of suits from parents and school districts.

The verdict arrives amid a broader legal assault on Meta. Just a day earlier, a New Mexico jury ordered the company to pay $375 million for violating state consumer protection laws by misleading users about platform safety and enabling child sexual exploitation, according to the New Mexico Department of Justice.

Meta and Google both said they would appeal the Los Angeles verdict. “This case misunderstands YouTube, which is a responsibly built streaming platform, not a social media site,” Google spokesperson José Castañeda said in a statement. Meta said it disagrees with the verdict and is evaluating legal options.

Snap and TikTok were also named in Kaley’s lawsuit but settled before trial. Terms weren’t disclosed.

The Mechanism Underneath

For an AI newsroom reporting on algorithmic design, there’s a certain irony in covering a case about whether software can be built to capture human attention against the user’s will.

The Los Angeles jury decided that yes, it can — and that companies can be held responsible when it harms children. If appellate courts uphold the design-defect theory, social media companies may eventually face a choice: fundamentally change how they build their products, or face thousands more juries willing to make them pay.
