$375 million. That’s what a New Mexico jury decided Meta owes for misleading the public about the safety of its platforms for children. For a company that reported over $62 billion in revenue last year, it barely qualifies as a rounding error.
But starting Monday, the real bill comes due.
Attorneys for Meta and New Mexico Attorney General Raúl Torrez return to a Santa Fe courthouse for a three-week bench trial that could force the company to fundamentally restructure how Facebook, Instagram, and WhatsApp operate in the state — and potentially well beyond it.
The Remedy Phase
Torrez isn’t just asking for more money. He’s asking a judge to order structural changes to Meta’s products: mandatory age verification for New Mexico users, a prohibition on end-to-end encryption for users under 18, a 90-hour monthly usage cap for minors, limits on engagement-boosting features like infinite scroll and autoplay, and a requirement that Meta detect 99 percent of new child sexual abuse material (CSAM) on its platforms.
“From the outset, our goal was to try and change the way the company’s doing business,” Torrez told The Verge. “I recognize that even at $375 million for a company this big and this profitable, it’s not enough in and of itself to change the way they’re doing business. In fact, there’s probably some folks in that company who think of it as the cost of doing business.”
Judge Bryan Biedscheid will decide the case without a jury, evaluating which proposals are relevant and feasible — a deliberation that could take considerably longer to resolve than the seven-week jury trial that produced March’s verdict.
Precedent in a Santa Fe Courtroom
The changes would formally apply only to Meta’s operations in New Mexico. But the implications are architectural. The company could implement them nationwide for operational simplicity. Or, as Meta has warned, it could simply withdraw from the state entirely.
A court order mandating specific product design changes would also send a signal to every social media company facing similar litigation — and there are thousands of such cases working through US courts. Meta and YouTube are currently standing trial in a separate products liability case in Los Angeles. States, municipalities, and school districts across the country are pursuing similar claims.
Meta warned investors last week that legal and regulatory blowback in the EU and the US “could significantly impact our business and financial results.”
The Encryption Problem
Several of Torrez’s proposed remedies touch on some of the most contentious debates in technology policy.
Mandatory age verification would almost certainly require collecting more personal information from both adults and minors — something privacy advocates have consistently warned can make users less safe. Don McGowan, formerly of the National Center for Missing and Exploited Children, said barring encryption on Messenger “is a great way to make sure that nobody uses Facebook Messenger anymore and just moves their activity to other platforms that aren’t touched by this lawsuit.”
Meta argued in a legal filing that the 99 percent CSAM detection mandate is effectively impossible to prove, since calculating the rate would require detecting 100 percent of CSAM just to establish the denominator.
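The measurement problem Meta is describing is the classic difficulty of computing recall: the detection rate is detected items divided by the true total, and the true total includes exactly the material the detector missed. A minimal sketch (illustrative only, with made-up numbers — not Meta’s systems or filings) shows why the same detection count is consistent with very different rates:

```python
# Illustrative sketch of the recall-denominator problem: a "99 percent
# detection" claim needs the TRUE total of offending material, but that
# total includes items the detector never saw. The numbers are invented.

def detection_rate(detected: int, true_total: int) -> float:
    """Recall: fraction of all offending items actually caught."""
    return detected / true_total

# A scanner flags 990 items. Is that 99% of a true total of 1,000,
# or 49.5% of a true total of 2,000? Without ground truth for the
# denominator, both readings fit the same observed count.
print(detection_rate(990, 1000))  # 0.99
print(detection_rate(990, 2000))  # 0.495
```

In other words, proving a 99 percent rate would require independently knowing the full universe of CSAM on the platform, which is the thing the detector exists to find in the first place.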
Peter Chapman, associate director of the Knight-Georgetown Institute, said there could be “significant tradeoffs” to an encryption prohibition. He pointed to evidence showing Meta’s own profile recommendations were connecting adults and minors — a feature with a clearer danger and a less defensible purpose. “There’s an opportunity to intervene at that level and try to prevent more of these harmful interactions from taking place without having to tackle encryption,” Chapman said.
A Turning Point or a Speed Bump
Meta spokesperson Chris Sgro said the state’s proposed mandates “infringe on parental rights and stifle free expression for all New Mexicans” and called the AG’s focus on a single platform “a misguided strategy.” The company says it has already launched 13 safety measures in the past year, including Teen Accounts on Instagram and parental alerts for self-harm content.
But the evidence that produced March’s verdict was damning. Internal Meta research found that 16 percent of Instagram users had reported being shown unwanted nudity or sexual activity in a single week, according to the BBC. Former engineering leader Arturo Béjar testified that his own daughter was propositioned for sex by a stranger on Instagram. The NM DOJ’s investigation uncovered repeated internal warnings about dangers on Meta’s platforms that went unheeded.
Torrez has broader ambitions than one company or one state. He recently traveled to Washington to advocate for an overhaul of Section 230, the federal law shielding platforms from liability for user-generated content. “If Section 230 were not something that these companies could hide behind, then it increases the chances that they’re going to have to actually make their case to a jury,” he said.
Whether regulation through litigation works remains an open question. But as Chapman noted, there is precedent for exactly this approach. “Whether that’s tobacco, opioids, e-cigarettes, there is precedent for legal action moving a broader policy conversation.”
The next three weeks in Santa Fe will determine whether that conversation reaches Meta’s engineering roadmaps — or stops at the courtroom door.
Sources
- Meta’s historic loss in court could cost a lot more than $375 million — The Verge
- New Mexico Department of Justice Wins Landmark Verdict Against Meta — New Mexico Department of Justice
- New Mexico trial citing ‘public nuisance’ laws against Meta, social platforms — South China Morning Post
- Meta hit with $375M verdict in New Mexico child safety case — Politico
- Meta told to pay $375m for misleading users over child safety — BBC News