Meta Ordered to Pay $375 Million in New Mexico Child Safety Case

by Alexandra Agraz | Mar 25, 2026

A New Mexico jury has ordered Meta Platforms to pay $375 million after finding the company violated state law by misleading users about the safety of its social media platforms and failing to protect children.

The decision followed a six-week trial in Santa Fe. Jurors determined Meta engaged in unfair or deceptive practices, as well as conduct the law defines as unconscionable, concluding the company took advantage of users who lacked full information about risks on its platforms. The jury found 75,000 violations and set a $5,000 penalty for each, yielding the $375 million total.

The state argued Meta reassured users and parents about platform safety while failing to address known risks, including contact between adults and underage users and the spread of harmful content. Officials also pointed to the lack of safeguards, such as effective age verification.

According to state officials, the investigation involved accounts posing as children under 14 that received sexually explicit material and drew contact from adults seeking similar content, leading to criminal charges against several individuals.

Meta disputed the allegations during the trial and said it has invested heavily in safety systems. The company argued it provides disclosures about the limits of content moderation and works to remove harmful material at scale. In a statement after the verdict, a spokesperson said Meta disagrees with the decision and intends to appeal.

The case was brought under New Mexico’s consumer protection law, which prohibits companies from misleading the public or withholding information that could influence consumer decisions. State officials said Meta’s safety claims, together with gaps in its safeguards, gave users and parents an incomplete picture of the risks on its platforms.

Meta argued the case should be barred by Section 230 of the Communications Decency Act, which generally shields online platforms from liability for user-generated content, along with First Amendment protections. The judge rejected those arguments before trial, allowing the case to move forward based on allegations about the company’s conduct and platform design.

The state also challenged how certain features operate on the services, arguing that tools such as infinite scrolling and auto-play video were designed to increase engagement despite known risks to younger users. Meta disputed that characterization and pointed to changes it has introduced, including additional controls for teen accounts.

Meta faces similar lawsuits across the country over how its platforms affect younger users, with many focusing on design features and their potential impact on mental health. A separate jury in Los Angeles is currently considering related claims.

A second phase of the case is set for May, when a judge will hear claims without a jury that Meta created a public nuisance affecting the health and safety of residents. State officials have said they will seek additional financial penalties and request changes to the company’s platforms, including stronger age verification measures.


Alexandra Agraz
Alexandra Agraz is a former Diplomatic Aide with firsthand experience in facilitating high-level international events, including the signing of critical economic and political agreements between the United States and Mexico. She holds dual associate degrees in Humanities, Social and Political Sciences, and Film, blending a diverse academic background in diplomacy, culture, and storytelling. This unique combination enables her to provide nuanced perspectives on global relations and cultural narratives.