On March 24, 2026, a New Mexico jury delivered a verdict that may well define a new era of sexual abuse litigation. After six weeks of testimony, the jury found Meta Platforms, Inc. liable for failing to protect children from sexual predators and misleading users about the safety of its platforms, ordering the company to pay $375 million in civil penalties. The verdict has been called historic, marking the first time a state has successfully sued Meta over child safety issues.
This article examines the legal theories behind the verdict and its broader implications, both for clients who have been harmed by social media platforms and for the legal landscape surrounding platform liability more generally.
Consumer Protection, Not Publisher Liability
One of the most significant features of the New Mexico case is the legal framework under which it was brought. Rather than attempting to hold Meta liable as a publisher of user-generated content, which would implicate the broad immunity of Section 230 of the Communications Decency Act, the State of New Mexico framed its claims under the New Mexico Unfair Practices Act (UPA). The complaint explicitly stated that it did not seek to hold Meta liable as a publisher, but rather for Meta’s “deceptive, unfair, unconscionable, unreasonable, and unlawful conduct in designing and maintaining its products” and for “making deceptive statements concerning Meta’s conduct, platforms and policies.”
The UPA prohibits unfair or deceptive trade practices and unconscionable trade practices in the conduct of any trade or commerce. The State advanced four counts against Meta: (1) unfair or deceptive trade practices, (2) unfair trade practices, (3) unconscionable trade practices, and (4) public nuisance. Each violation of the UPA carried a maximum civil penalty of $5,000, and the jury found thousands of individual violations, resulting in the $375 million total penalty.
This approach effectively sidesteps the Section 230 defense that has long shielded technology companies from liability. By recasting platform misconduct as a consumer protection issue—rooted in misrepresentations about safety, the knowing deployment of addictive design features, and the concealment of internal research—this case offers a replicable playbook for attorneys general and private plaintiffs alike.
The Role of Algorithms: From Passive Hosting to Active Harm
Central to the jury’s findings was the role of Meta’s recommendation algorithms. New Mexico argued that Meta’s platforms did not merely host harmful content but actively “steered” young users toward sexually explicit material, child sexual abuse material (CSAM), and even sex trafficking through its recommendation systems. As the complaint alleged, Meta’s algorithms operated to “search and disseminate” sexually exploitative materials and to create social networks connecting users looking to buy and sell such content and the children victimized by it.
By establishing that the design and operation of algorithms can itself constitute an unfair or deceptive trade practice, this verdict reframes the debate around platform liability. Companies can no longer argue that they are mere conduits for user content when their own systems actively curate and amplify harmful material to vulnerable users.
Corporate Knowledge and Misrepresentation: A Key Liability Driver
The evidence at trial painted a stark picture of a company that publicly proclaimed its platforms were safe while internally documenting the opposite. The complaint cited years of internal documents and testimony demonstrating that Meta knew about the harms its platforms caused and chose not to act.
Court documents unsealed during the case included an internal email warning that there could be as many as 500,000 cases of online sexual exploitation per day on Facebook and Instagram. Yet Meta’s public-facing “prevalence” metrics consistently reported low percentages of offensive content, figures the company’s own internal research contradicted: one internal study found users were “100 times more likely to tell Instagram they’d witnessed bullying in the last week than Meta’s bullying prevalence statistics indicated.”
This gap between public representation and internal knowledge, what New Mexico characterized as a pattern of misrepresentation, omission, and active concealment, formed the backbone of the consumer protection claims.
A New Era of Tech Accountability: Private Actions
The New Mexico verdict, combined with a contemporaneous Los Angeles jury verdict, signals what advocates are calling a new era for technology accountability. These verdicts may finally provide a path forward for individual victims who have long been stymied by Section 230 immunity.
A critical question is whether individual victims, not just state attorneys general, have a private right of action under consumer protection frameworks. Most state consumer protection statutes, including the New Mexico UPA, provide both public enforcement mechanisms (through the attorney general) and private rights of action for aggrieved consumers. Under NMSA 1978, § 57-12-10, any person who suffers actual damages as a result of conduct prohibited by the UPA may bring a civil action to recover actual damages or $100, whichever is greater, with treble damages available for willful violations, plus attorney fees and costs.
This suggests that children and families who have suffered harm from Meta’s platforms may be able to pursue their own claims using the same consumer protection framework that succeeded in the New Mexico state action. The New Mexico complaint itself noted that claims were brought under the UPA, which prohibits “unfair or deceptive trade practices and unconscionable trade practices in the conduct of any trade or commerce.” The jury’s findings that Meta’s conduct constituted deceptive, unfair, and unconscionable trade practices could provide significant support for private plaintiffs seeking to establish similar claims, where the facts show that such prohibited practices proximately caused the abuse of their child.
The UPA framework is particularly well-suited for private actions because it captures the full range of Meta’s alleged misconduct. The statute reaches conduct that “takes advantage of the lack of knowledge, ability, experience or capacity of a person to a grossly unfair degree”—language directly applicable to vulnerable minor users. Private plaintiffs can point to Meta’s own internal knowledge that its platforms took advantage of children’s developmental vulnerabilities and their “inability to self-regulate” as evidence satisfying this standard.
The Power of Per-Violation Penalties
The $375 million penalty, though substantial, was far below the $2 billion that New Mexico attorneys had originally sought. The total was reached because the jury found thousands of individual violations of the UPA, each carrying a maximum penalty of $5,000. The complaint also sought disgorgement of profits, injunctive relief, attorney fees, and pre- and post-judgment interest.
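To see how the per-violation structure produces a nine-figure award, a rough back-of-the-envelope check is instructive. The jury’s actual per-violation breakdown is not detailed in the public record, so the figures below are purely illustrative, assuming the statutory maximum was applied uniformly:

```python
# Illustrative arithmetic only: the jury's actual violation count and
# per-violation amounts are not public, so these figures are assumptions.
MAX_PENALTY_PER_VIOLATION = 5_000   # UPA statutory maximum per violation
TOTAL_AWARD = 375_000_000           # jury's total civil penalty

# Minimum number of violations needed to reach the total at the statutory cap
implied_violations = TOTAL_AWARD // MAX_PENALTY_PER_VIOLATION
print(implied_violations)  # 75000
```

At the $5,000 statutory cap, the $375 million total implies at least 75,000 separate violations; if the jury assessed less than the maximum on some counts, the number of violations found would be correspondingly higher.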
The per-violation penalty structure is particularly significant when projected onto future cases. Given the scale of Meta’s user base, with an average of 3.14 billion daily active users as of September 2023, the potential exposure in litigation involving millions of affected users is staggering. Any state with a comparable consumer protection statute could theoretically pursue similar claims, and the financial stakes could dwarf the New Mexico verdict.
The Broader Litigation Landscape
The New Mexico verdict does not exist in isolation. Meta was found liable in a separate case in Los Angeles, in which a young woman claimed she became addicted to platforms like Instagram and YouTube as a child because of how they were intentionally designed. That case focused on addiction-based harm rather than sexual exploitation, but it implicated the same corporate-knowledge and design-choice theories.
Beyond sexual exploitation, the New Mexico case included a public nuisance claim alleging that Meta’s platforms contributed to increases in youth suicide, depression, eating disorders, bullying, and social media addiction. The Surgeon General of the United States issued an advisory in May 2023 warning that adolescents who spent more than three hours per day on social media faced double the risk of experiencing poor mental health outcomes.
A successful public nuisance theory could empower state governments to seek abatement orders requiring platforms to fundamentally redesign how they serve minor users, a far more consequential remedy than monetary damages alone. The complaint sought injunctive relief, abatement of the public nuisance, and payment of monies to the State to abate the nuisance, in addition to the UPA penalties.
Conclusion
The New Mexico verdict against Meta represents a potential turning point in the law of platform liability and sexual abuse litigation by demonstrating that technology companies can be held accountable for the foreseeable harms of their design choices. This case lays the groundwork for a new era of litigation on behalf of children and families harmed by social media platforms. With thousands of similar cases pending nationwide and Meta’s appeal still to come, this is an area of the law that will demand close attention from practitioners and clients alike in the months and years ahead.