The New Mexico verdict against Meta handed the state a $375 million win. Jurors concluded the company misled the public about risks to children on its platforms, finding thousands of violations of the state’s Unfair Practices Act based on internal documents, employee testimony, and undercover accounts.
The verdict marks a major moment in state efforts to hold big tech accountable for youth safety, and it puts pressure on courts and regulators to treat platform design and safety claims as part of the product when harm to minors is alleged. The jury applied the maximum penalty for each violation the state proved, producing the $375 million figure that now stands as a formal judgment while Meta prepares an appeal.
Prosecutors focused squarely on what Meta built and what it told the public, not just the posts uploaded by users. They argued the company presented itself as safe for younger users even as internal work and warnings showed executives knew their systems could addict minors and expose them to sexual exploitation and other harms.
The state’s case drew on undercover accounts posing as children to document how quickly those accounts were shown sexual solicitations and explicit material. State investigators tracked the flow of content and highlighted gaps in the safeguards Meta touted, aiming to show that the company’s public statements didn’t match what its platforms actually did in practice.
“Meta executives knew their products harmed children, disregarded warnings from their own employees, and lied to the public about what they knew,” New Mexico Attorney General Raúl Torrez said after the verdict.
That statement landed where it mattered: in the trial record, where it will resurface any time Meta faces similar scrutiny. The state characterized design choices and safety claims as inseparable from the product itself, a legal framing that allowed the case to sidestep some of the usual protections platforms cite to shield themselves from liability for user-created content.
More than 40 state attorneys general have pursued related litigation and raised similar concerns about youth mental health and content control. New Mexico’s approach leaned hard on company memos and internal research, plus testimony from current and former employees who described how safety warnings were handled internally and what, if anything, was done in response.
Jurors concluded the company misled the public and took advantage of minors, findings that reject Meta’s insistence that it had been transparent about risks and was actively mitigating them. The verdict does not end the fight: Meta has announced it will appeal, and the case will move into later phases to decide remedies, including whether the company must fund mitigation programs or alter platform features.
Meta’s public response repeated the line that it discloses risks and invests in safety tools while acknowledging harmful content can still appear. That position failed to persuade the jury, which found the company’s statements and practices inconsistent with protections it claimed existed for younger users.
“We respectfully disagree with the verdict and will appeal,” a Meta spokesperson said, adding that the company works to remove harmful content and protect users across its platforms.
From a Republican perspective, the ruling underscores the need for meaningful accountability when large companies put engagement and growth ahead of child safety. Lawmakers and state officials pushing for stronger enforcement of consumer protections see this verdict as confirmation that platform choices and messaging about safety really matter—and that companies can be held responsible when those messages mislead.
The broader legal landscape is shifting as states test new theories of liability focused on product design and consumer deception rather than only on individual pieces of content. If judges uphold the legal theory used here, tech companies could face more direct exposure in state courts for the way their software recommends and amplifies material to young users.
For parents and communities watching this play out, the case highlights how regulatory action, investigations that include undercover testing, and public litigation can change the conversation about what companies owe children and families. It also raises questions about what short- and long-term remedies will look like when courts require platforms to make changes or fund mitigation.
The verdict now stands as a concrete determination that a major platform misled the public about child safety, and it will surface in future legal fights as a reference point and a matter of record. With appeals likely, the ultimate outcome and any mandated changes remain to be decided, but the jury’s finding will be part of Meta’s legal and public narrative going forward.

