A New Mexico judge’s decision to let the state’s child-safety case against Meta proceed has become a defining legal setback for the company, with state officials calling it a major turning point in efforts to hold social platforms accountable for harms involving minors. The case centers on allegations that Meta misrepresented the safety of Facebook and Instagram while failing to curb child sexual exploitation and related risks. This article explains what the court decided, why the ruling matters, and what comes next in one of the most closely watched child-safety battles facing Big Tech.
New Mexico’s Department of Justice said Judge Bryan Biedscheid denied Meta’s motion to dismiss the state’s lawsuit, allowing claims under the state’s consumer protection framework to move forward. The department announced the decision on May 30, 2024, describing it as a significant legal victory in a case focused on whether Meta’s platforms enabled child sexual exploitation and misled users about safety protections.
⚠️ The ruling did not end the case, but it kept New Mexico’s claims alive. That matters because Meta had argued the lawsuit should be thrown out, including on Section 230 grounds, yet according to the state, the court allowed the case to continue. Source: New Mexico Department of Justice, May 30, 2024.
May 30, 2024 Ruling Keeps New Mexico’s Claims Intact
The legal dispute began in late 2023, when New Mexico Attorney General Raúl Torrez sued Meta, alleging the company’s platforms had become a place where predators could target children and where the company had not fully disclosed what it knew about those risks. Axios reported on December 6, 2023, that the complaint accused Meta of failing to protect children from sexual abuse and of designing products that harmed young users’ well-being.
When Judge Biedscheid denied Meta’s motion to dismiss on May 30, 2024, the court did not rule on ultimate liability. Instead, it found the case could proceed, which is often one of the most important early stages in complex platform-liability litigation. The New Mexico Department of Justice said the lawsuit focuses on Meta’s alleged role in enabling child sexual exploitation and on whether the company violated state law through deceptive or unfair practices.
Key Case Milestones
| Date | Event | Why It Matters |
|---|---|---|
| October 24, 2023 | Attorneys general from multiple states sued Meta over harms to children and teens | Broader regulatory pressure on Meta intensified |
| December 6, 2023 | New Mexico filed its own child-safety lawsuit against Meta | Focused specifically on exploitation and safety representations |
| May 30, 2024 | Judge denied Meta’s motion to dismiss | Case survived an early legal challenge |
| February 2026 | Trial opened in Santa Fe | Dispute moved from pleadings to evidence and testimony |
Source: New Mexico Department of Justice; Axios; AP | accessed March 25, 2026
Why New Mexico’s 2026 Trial Raised the Stakes
The case moved into trial in February 2026. AP reported on February 9, 2026, that New Mexico prosecutors argued Meta failed to disclose what it knew about harmful effects on children and that the state had built part of its case through undercover accounts posing as minors to document sexual solicitations and Meta’s responses. AP also reported that the trial focused on alleged violations of New Mexico consumer protection law tied to child sexual exploitation and youth harms.
That trial phase matters because it shifts the dispute from allegations in pleadings to testimony, internal records, and platform evidence. Wired reported in February 2026 that New Mexico argued Meta had misled the public for years about dangers on its platforms, while Meta sought to narrow what evidence could be shown in court. TechCrunch and Ars Technica separately reported in January 2026 that Meta tried to limit references to certain public-health materials, surveys, and other evidence before trial.
How the Case Escalated
December 2023: New Mexico sued Meta, alleging failures tied to child sexual exploitation and platform safety.
May 30, 2024: The court denied Meta’s motion to dismiss, allowing the case to continue.
January 2026: Pretrial fights focused on what evidence jurors or the court could hear.
February 2026: Trial began in Santa Fe, with New Mexico pressing claims that Meta misrepresented platform safety.
What Evidence Drove the “Watershed Moment” Framing?
The phrase “watershed moment” reflects the broader significance of the ruling and trial, not a final damages judgment already entered against Meta. The state’s theory is that Meta’s public safety claims can be tested under state consumer protection law, even as the company argues it has invested heavily in safety systems and content enforcement. That distinction is central: the case is not only about harmful third-party content, but also about what Meta allegedly represented to users, parents, and the public.
Time reported in late 2025 that court filings in related child-safety litigation alleged Meta downplayed risks to children and failed to act on internal warnings. Among the allegations described by Time were claims that millions of adult strangers contacted minors on Meta services and that harmful content involving eating disorders, suicide, and child sexual abuse was frequently detected but not consistently removed. Those are allegations from filings, not final court findings, but they help explain why this litigation has drawn national attention.
ℹ️ Meta disputes the allegations and says it has extensive child-safety systems. In a January 2026 company post, Meta said it supports parents, works with the National Center for Missing & Exploited Children (NCMEC), and disabled more than 2.6 million accounts in 2023 for violating child sexual exploitation policies, according to company statements and local reporting.
Meta vs. State Prosecutors: Two Competing Accounts of Platform Safety
Meta’s defense has been that it uses technology, expert teams, and reporting systems to detect and remove abusive material, and that online safety enforcement is an industry-wide challenge. Yahoo reporting published March 6, 2026, citing trial testimony, said one Meta witness described the company as an industry leader in identifying and removing child sexual abuse material. Meta has also said publicly that it cooperates with law enforcement and responds rapidly to emergency requests.
New Mexico’s position is different. The state argues that safety tools and public statements did not match what Meta knew internally about risks to children. AP reported that prosecutors said Meta engineered algorithms to keep young people engaged while knowing children were at risk of sexual exploitation on social media. That framing is important because it links product design, disclosure, and child safety into one legal theory.
Competing Claims in the Case
| Issue | New Mexico’s Position | Meta’s Position |
|---|---|---|
| Platform safety | Meta misrepresented risks to children | Meta says it has extensive safety measures |
| Child exploitation | Platforms enabled solicitation and abuse risks | Meta says it actively detects and removes violative content |
| Legal theory | State consumer protection and unfair practices claims apply | Meta has argued for legal limits on such claims |
| Evidence | Internal records and undercover testing support the case | Meta has sought to limit some evidence at trial |
Source: AP, Meta, TechCrunch, Ars Technica | accessed March 25, 2026
What March 2026 Means for Meta and Child-Safety Litigation
As of March 25, 2026, the most clearly verified development is that New Mexico’s case survived dismissal and proceeded to trial in 2026, making it one of the most advanced child-safety cases against Meta in state court. AP described it as the first stand-alone case by a state tied to an undercover investigation of Meta over child exploitation risks. That procedural posture gives the case outsized importance compared with many earlier complaints that remained stuck in pretrial stages.
The broader significance extends beyond New Mexico. Separate litigation in California over social media addiction and harms to children has also put Meta under pressure, with AP reporting in February 2026 that Meta and YouTube were the remaining defendants in a landmark Los Angeles trial after TikTok and Snap settled. Together, those cases show that courts are increasingly being asked to examine whether platform design and safety representations can create legal exposure when minors are harmed.
Frequently Asked Questions
Did Meta lose the entire child-safety case?
Not on the record verified here. What is confirmed is that Judge Bryan Biedscheid denied Meta’s motion to dismiss on May 30, 2024, and that the case proceeded to trial in February 2026. That is a major legal setback, but it is not the same as a final judgment on the merits.
Why is this being called a watershed moment?
The case is significant because New Mexico’s claims survived an early dismissal attempt and advanced to trial, where internal evidence, safety practices, and public representations could be tested in court. AP described the trial as a first stand-alone state case tied to an undercover investigation into Meta and child exploitation risks.
What does New Mexico accuse Meta of doing?
The state alleges Meta misrepresented the safety of Facebook and Instagram, failed to adequately protect children from sexual exploitation, and violated state consumer protection law. Those allegations were reported by AP and Axios and were echoed in the state’s own court-related announcements.
What is Meta’s response?
Meta says it has invested in child-safety systems, works with law enforcement and NCMEC, and removes large volumes of violative accounts and content. In January 2026, Meta said it would defend itself in court against claims it says misrepresent the facts.
When did the New Mexico trial begin?
AP reported that the trial began in February 2026 in Santa Fe, with one report dated February 9, 2026, previewing the proceedings and another describing the case as a seven-week civil trial.
Disclaimer: This article is for informational purposes only. Information may have changed since publication. Always verify information independently and consult qualified professionals for specific advice.