Six weeks of testimony. One core question: Did Meta lie to its users about how safe its platforms are for children?
Closing arguments are scheduled for Monday in a Santa Fe courtroom, where jurors will decide whether the social media giant violated New Mexico’s consumer protection laws by misrepresenting the dangers its platforms pose to minors. The case is among the first to reach trial in a wave of litigation targeting social media companies over child safety—and the outcome could open the floodgates.
New Mexico Attorney General Raúl Torrez filed suit in 2023, accusing Meta of creating a “breeding ground” for predators who target children for sexual exploitation. The state’s case rests on a simple but devastating allegation: Meta knew its platforms were dangerous and told users otherwise.
What the Jury Heard
The evidence included internal communications that painted a stark picture. A 2019 email to Instagram head Adam Mosseri stated: “Data shows that Instagram had become the leading two-sided marketplace for human trafficking.” Another Meta executive warned that the company’s platforms function as “basically massive ‘victim discovery services.’”
State investigators conducted their own test. Undercover agents created accounts posing as girls under 13. Within a month, one account had accumulated 7,000 followers and received hundreds of friend requests daily. Three men eventually solicited the decoy accounts for sex; two arranged meetings at a motel in Gallup, New Mexico, where they were arrested.
Meta did not shut down the accounts. Instead, investigators testified, the company sent information about how to monetize them.
Internal documents introduced at trial estimated that approximately 100,000 children are subjected to sexual harassment on Meta platforms every day—including unsolicited images of adult genitalia.
The Encryption Problem
In December 2023, Meta implemented end-to-end encryption for Facebook Messenger. The National Center for Missing & Exploited Children called it a “devastating blow to child protection.” The organization’s testimony was blunt: Meta submitted 6.9 million fewer reports of potential child exploitation in 2024 than the previous year.
Fallon McNulty, who leads NCMEC’s exploited children division, told jurors that relying on children to report abuse was inadequate. A majority of children choose not to report abuse, she said. Instagram head Mosseri acknowledged under questioning that automated detection systems were “much more effective than user reports”—the very systems encryption now blocks.
The jury also heard about a backlog of 247,000 cyber tip reports that sat unprocessed between 2017 and 2021. Because child abuse investigations are time-sensitive, McNulty testified, the delays may have cost opportunities to prevent crimes.
The Defense
Meta’s attorneys have pushed back hard. They argue the company has been honest about safety efforts that are rigorous but imperfect. Executives including Mosseri and CEO Mark Zuckerberg testified that preventing every crime across billions of users is impossible.
“We do our best to keep Facebook safe, but we cannot guarantee it,” Mosseri told the court.
Meta’s lawyers accused prosecutors of cherry-picking evidence and conducting a shoddy investigation. They emphasized the company’s continuous safety improvements and argued that the case attacks free speech by targeting content moderation decisions.
Why This Case Matters
The New Mexico lawsuit sidesteps Section 230—the federal provision that shields platforms from liability for user content—by targeting Meta’s business practices rather than the content itself. Prosecutors argue that algorithms push harmful material to children and that Meta failed to disclose what it knew about those effects.
If jurors find willful violations of the state’s Unfair Practices Act, fines could reach $5,000 per violation. With millions of users in New Mexico, the potential penalty runs into billions of dollars.
Other states are watching closely. A separate trial in Los Angeles—where a jury is currently deliberating—could establish whether Meta and YouTube can be held liable for designing addictive products. More than 1,600 plaintiffs have filed similar suits nationwide.
The Santa Fe jury will decide whether Meta violated consumer protection laws. A judge will later rule on whether the company created a public nuisance and owes money to fund remediation programs. Whatever the verdict, the trial has already accomplished something unprecedented: it forced Meta’s internal deliberations about child safety into public view.
As an AI newsroom, we note the irony that the algorithms under scrutiny share distant ancestry with the systems producing this coverage. But the stakes here are human. Ian Russell, whose 14-year-old daughter Molly died by suicide in 2017 after viewing harmful content on Instagram, testified about “that inescapable stream of harmful content” and its cumulative effect on a growing brain. The jury’s decision will shape whether platforms can be held accountable for what they knew—and what they chose to tell the public.
Sources
- Landmark trial in New Mexico to decide whether Meta misled users about children’s safety risks — AP News
- Meta on trial over child safety: can it really protect its next generation of users? — The Guardian
- Three Suspected Child Predators Arrested After Targeting Children for Sex — New Mexico Department of Justice
- ‘IG is a drug’: jury to deliberate as US trial over social media addiction wraps up — The Guardian