A New Mexico jury ordered Meta to pay $375 million on March 24, 2026, after finding the company violated consumer protection laws by misleading users about platform safety and enabling child exploitation on Facebook, Instagram, and WhatsApp. One day later, a California jury found Meta liable in a separate social media addiction case, awarding $6 million in damages. Two verdicts in two days mark the first time U.S. courts have held Meta accountable at trial for harms to children.
Meta Knew Encryption Would Hide 7.5M Child Abuse Reports—Proceeded Anyway
Unsealed court documents reveal Meta executives warned CEO Mark Zuckerberg in 2019 that implementing end-to-end encryption on Messenger would hide millions of child sexual abuse reports from law enforcement. They did it anyway.
In March 2019, as Zuckerberg prepared to announce default encryption for Messenger, Monika Bickert, Meta’s Head of Content Policy, wrote: “We are about to do a bad thing as a company. This is so irresponsible.” Antigone Davis, then Global Head of Safety, warned that encrypted Messenger would be “far, far worse” for child safety than anything seen before.
A February 2019 internal briefing projected that child safety reports to the National Center for Missing & Exploited Children (NCMEC) would drop from 18.4 million to 6.4 million annually—a 65% decrease. Meta would be “unable to provide data proactively to law enforcement” in 600 child exploitation cases, 1,454 sextortion cases, 152 terrorist cases, and 9 threatened school shootings.
Meta proceeded with encryption. The New Mexico jury found these violations willful, not accidental—Meta knew the consequences and chose profits over child safety.
Operation MetaPhile: Undercover Sting Proved Platforms Enable Predators
New Mexico Attorney General Raúl Torrez didn’t rely on theory. In 2023, his office launched “Operation MetaPhile,” creating fake accounts posing as 13-year-olds on Facebook, Instagram, and WhatsApp. The accounts were “simply inundated with images and targeted solicitations” from child abusers.
The undercover operation led to real arrests. Three New Mexico men were charged, two arrested at a motel where they expected to meet a 12-year-old girl based on conversations with decoy accounts. This wasn’t theoretical harm—it was concrete evidence that Meta’s platforms facilitate, not just host, child exploitation.
The jury awarded $375 million under New Mexico’s Unfair Practices Act—$5,000 per violation. New Mexico became the first state to sue Meta over child safety and win at trial. A second trial phase begins May 4, where a judge will decide whether Meta created a “public nuisance” and should fund child safety programs.
California Verdict Same Week: Design Defect Theory Bypasses Section 230
One day after the New Mexico verdict, a California jury found Meta and Google liable for social media addiction in K.G.M. v. Meta, awarding $6 million in damages. The plaintiff, now 20, developed depression and suicidal thoughts as a minor due to platform design choices—infinite scroll, autoplay, algorithmic feeds.
The jury found platforms were “deliberately built to be addictive” and executives “knew and failed to protect youngest users.” Meta was ordered to pay $3.1 million, Google $900,000.
The legal breakthrough: design defect theory. Plaintiffs argued that infinite scroll, autoplay, and algorithmic recommendations are product design choices, comparable to a car manufacturer using defective brakes. Section 230 protects platforms from liability for user-generated content, but it doesn’t shield product design decisions. By targeting how platforms are built—not what users post—the California case bypassed Section 230 immunity entirely.
K.G.M. is the first of three California bellwether cases testing social media addiction law. More verdicts are coming, and they’re all using the same legal strategy: sue for product design, not content moderation.
Section 230 Immunity Has Cracks When Child Safety Is Involved
For decades, Section 230 of the Communications Decency Act has shielded platforms from lawsuits over user-generated content. Tech companies treated it as an absolute immunity. These two verdicts prove it’s not.
The New Mexico case bypassed Section 230 by suing under consumer protection law—Meta misled users about safety, a business practice violation. The California case targeted product design defects, arguing platforms engineered addictive features knowing they harm children. Neither approach challenges Section 230 directly, but both render it irrelevant when child safety is at stake.
Congress is paying attention, and bipartisan legislative pressure is mounting. Rep. Patronis introduced the PROTECT Act to repeal Section 230 entirely. Sen. Durbin’s Sunset Section 230 Act has bipartisan support. Sens. Cruz and Klobuchar’s Take It Down Act focuses specifically on protecting children from non-consensual intimate images. Child safety is the wedge issue cracking Section 230’s immunity, and courts are moving faster than Congress.
What’s Next: Appeals, More Lawsuits, and the Litigation Wave Meta Can’t Stop
Meta will appeal both verdicts. “We respectfully disagree with the verdict and will appeal,” a spokesperson said. Appeals will take years, but the precedent stands. New Mexico won, California won, and 42 other states are watching.
School districts are eyeing these verdicts closely, considering lawsuits for student mental health harms caused by platform addiction. K.G.M. provides a legal blueprint. Legal experts report administrators are consulting attorneys, viewing platform addiction as an educational crisis with a legal remedy.
For platform builders and developers, the implications are clear. Algorithmic recommendation systems are now a legal liability if they target minors. Age verification and child safety features aren’t optional—they’re legal requirements. Internal communications about known harms are discoverable in court, as Meta learned when its 2019 emails were unsealed. Engagement optimization can be deemed a product defect if it’s designed to addict children.
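To make the design-defect exposure concrete, here is a minimal, purely hypothetical sketch of what gating engagement features by age could look like. Every name and threshold below (`FeedSettings`, `settings_for`, the age-18 cutoff) is an illustrative assumption, not drawn from any statute, court filing, or Meta system:

```python
from dataclasses import dataclass

# Hypothetical illustration only: names, fields, and the age threshold
# are assumptions for this sketch, not legal requirements.
ADULT_AGE = 18

@dataclass
class FeedSettings:
    infinite_scroll: bool
    autoplay: bool
    algorithmic_ranking: bool

def settings_for(age: int) -> FeedSettings:
    """Disable engagement-maximizing features for accounts belonging to minors."""
    is_minor = age < ADULT_AGE
    return FeedSettings(
        infinite_scroll=not is_minor,
        autoplay=not is_minor,
        algorithmic_ranking=not is_minor,  # minors get a chronological feed instead
    )
```

A real system would also need a trustworthy age signal in the first place; age self-reporting is exactly the kind of design choice plaintiffs now scrutinize.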
$375 million is a rounding error for Meta, which posted $40 billion in profit in 2025. But legal precedent isn’t. Two verdicts in two days isn’t coincidence—it’s a reckoning. Courts are rewriting the rules for Big Tech and children, and Section 230 can’t save them anymore.