New Mexico Finds Meta Liable on Every Count. The $375 Million Verdict That Reframes Platform Accountability.

On 24 March 2026, after nearly seven weeks of trial in a Santa Fe courtroom, a New Mexico jury found Meta liable on every count brought by the state's Attorney General. The jury awarded $375 million in civil penalties, the maximum available under New Mexico's Unfair Practices Act at $5,000 per violation, after finding that Meta had willfully engaged in unfair and deceptive trade practices and knowingly designed its platforms in ways that harmed children.

It is the first time a US state has prevailed at trial against a major technology company for harming young people. And it is the first time a jury has held Meta liable on these grounds.

The Investigation That Brought the Case to Trial

The lawsuit was filed in 2023 by Attorney General Raúl Torrez, following an undercover operation in which the state created fake social media profiles presenting as children, including one presenting as a 13-year-old girl.

The fake account was immediately inundated with sexual solicitations and requests for graphic material, and rapidly gained followers, predominantly adult men from across the United States. When that pattern triggered a platform response, the response was not a safety intervention. Meta sent the account information on how to grow its user base and how to monetise its following.

Three adult New Mexico men were subsequently arrested in May 2024 after contacting the fake child accounts. Two were arrested at a motel where they had arranged to meet a 12-year-old girl.

What the Trial Evidence Established

The trial ran from 9 February to 24 March 2026. Evidence included internal Meta documents, testimony from former Meta employees, law enforcement officials, and New Mexico educators.

One strand of evidence drew particular attention. Internal messages disclosed during proceedings showed that when Mark Zuckerberg announced in 2019 that Facebook Messenger would move to end-to-end encryption by default, the decision raised immediate concerns within the company. The documents revealed that the change was projected to eliminate visibility over approximately 7.5 million child sexual abuse material reports per year that would otherwise have been disclosed to law enforcement. A senior Meta content policy executive described the encryption plan, in writing, as "so irresponsible."

The jury found that Meta proceeded regardless.

The Legal Basis and Why It Matters

New Mexico did not bring this case under a technology-specific regulatory framework. It brought it under the state's general consumer protection law, arguing that Meta had misled consumers about the safety of its platforms.

The jury's finding that Meta willfully violated that law, combined with the internal evidence showing what the company knew about child safety risks, establishes that the gap between platform safety representations and platform safety reality constitutes deceptive trade practice under existing law.

This is a significant point. It does not require new legislation. Consumer protection statutes that already exist across every US state are capable of reaching platform design decisions when the evidence shows deliberate misrepresentation.

What Comes Next

The $375 million verdict does not conclude the legal proceedings. New Mexico will now argue a public nuisance case before the same judge, seeking additional orders that could require Meta to implement effective age verification, remove identified predators from its platforms, and make structural changes to how encrypted communications interact with child safety obligations.

The state had initially asked the jury for more than $2 billion in penalties. The $375 million represents the ceiling under existing civil penalty law at $5,000 per violation. The public nuisance phase allows the state to seek structural remedies that a civil penalty cannot provide.

Meta has said it will appeal.


HumanSafe Opinion

The following reflects HumanSafe Intelligence's position on this development.

The New Mexico verdict is not primarily about the scale of the penalty. It is about what the trial evidence established: that Meta's internal documentation contained explicit awareness of the child safety consequences of its design decisions, and that those decisions were made regardless.

The internal messages about end-to-end encryption are the clearest illustration. The company was told, in writing, by its own employees, that a specific design decision would eliminate visibility over millions of child safety reports per year. It proceeded. The jury found that this constitutes willful deceptive trade practice. Not negligence. Not oversight. Willful conduct.

The constitutional question this verdict raises is one courts are increasingly confronting: when does a design decision become a rights violation? The New Mexico case answers it in a specific context, with a specific evidentiary standard. A design decision made with internal knowledge of its harm, inconsistent with public representations about platform safety, and resulting in documented harm to identifiable individuals, meets that standard under existing consumer protection law.

Consumer protection legislation was not designed with platform architecture in mind. The question this verdict leaves open is whether state-by-state consumer protection claims, prosecuted case by case, are the appropriate mechanism for accountability at the scale at which these platforms operate, or whether the architecture of accountability itself needs to be as universal as the platforms it is meant to govern.

