Jury slaps Meta with $375 million penalty in child-safety case
The case could reshape how social media companies are held accountable—and reignite debate over federal liability protections
A New Mexico jury delivered a major blow to Meta Platforms today, finding that the social media giant violated state consumer protection law by failing to adequately protect children from predators on its platforms.
The verdict, reached after weeks of testimony in a Santa Fe civil trial, found that Meta willfully engaged in unfair and unconscionable trade practices tied to the safety of its apps, including Facebook and Instagram. Jurors awarded $375 million in penalties, calculated from the number of violations presented by the state.
The decision marks one of the most significant legal setbacks yet for a social media company amid a growing wave of cases alleging harm to minors, and it could signal a broader shift in how courts and regulators approach platform accountability.
Undercover operation sparked the case
The lawsuit was brought in 2023 by New Mexico Attorney General Raúl Torrez after an undercover investigation in which investigators created a fake profile posing as a 13-year-old girl.
According to prosecutors, the account was quickly flooded with explicit images and solicitations from alleged child predators—evidence they argued demonstrated systemic failures in Meta’s safety systems.
State attorneys said the company misled users about the effectiveness of its protections, violating New Mexico’s Unfair Practices Act.
“The jury’s verdict is a historic victory for every child and family,” Torrez said after the ruling, accusing Meta of prioritizing engagement and profits over safety. He also alleged that company executives ignored internal warnings about risks to minors.
Meta vows appeal
Meta pushed back strongly against the verdict.
“We respectfully disagree with the verdict and will appeal,” a company spokesperson said in a statement, adding that Meta has invested heavily in safety tools and continues to improve its ability to detect harmful content and bad actors.
The company has consistently denied the state’s allegations, arguing that prosecutors selectively used internal documents to present a misleading picture of its efforts.
Meta also emphasized the broader challenge facing the industry: identifying and removing harmful content at scale while balancing privacy and encryption.
Internal documents take center stage
A key flashpoint during the trial involved internal company communications tied to a 2019 decision to expand end-to-end encryption in messaging services.
Prosecutors highlighted internal discussions suggesting the move could limit Meta’s ability to detect and report child sexual abuse material—potentially affecting millions of reports to law enforcement.
State lawyers argued that such decisions reflected a pattern of prioritizing user engagement and product growth over child safety. Meta countered that encryption is critical for user privacy and security.
What comes next: second phase and broader remedies
The case is far from over.
A second phase of the trial—set to begin May 4—will be decided by a judge and could carry even broader consequences. The court will consider whether Meta created a “public nuisance” and whether it should be required to fund programs addressing harm to children.
State officials are also seeking structural changes, including:
Stronger age verification systems
More aggressive removal of predators
Limits on encrypted communications that could shield abusers
Torrez has framed the case as a potential national model, suggesting that reforms imposed in New Mexico could influence policy across the U.S. and beyond.
A test of Section 230—and a new legal strategy
The case is part of a broader legal shift targeting how social media platforms are designed, rather than focusing solely on user-generated content.
Traditionally, companies like Meta have relied on Section 230 of the Communications Decency Act to shield them from liability for content posted by users. But New Mexico prosecutors said they successfully navigated around those protections by focusing on product design and business practices.
Torrez said the verdict could add momentum to calls in Washington to revisit Section 230, potentially narrowing or rewriting the law.
“I think juries awarding penalties … are an important signal to policymakers,” he said.
Part of a growing wave of lawsuits
The New Mexico case is one of several high-profile trials this year examining the impact of social media on children—often compared by legal experts to the tobacco litigation battles of the 1990s.
Other major cases include:
A Los Angeles trial involving Meta and YouTube, where jurors are weighing claims that platform design harmed a teenager's mental health
A forthcoming federal case in California involving multiple companies—including Meta, TikTok, Snap, and YouTube—brought by school districts and families
Together, these cases could define the next era of tech regulation, particularly around youth safety and platform design.
Why this matters
The verdict underscores a growing willingness by courts and juries to hold tech companies financially accountable for harms tied to their products—especially when children are involved.
For consumers and families, the case raises pressing questions about:
How safe major platforms really are for minors
Whether current safeguards are effective—or merely advertised as such
How much responsibility companies bear for foreseeable harms
For the tech industry, the stakes are even higher. If similar rulings spread, companies could face not only large financial penalties but also court-ordered changes to how their platforms operate.
Bottom line: The New Mexico verdict may be remembered as a turning point—one where juries began treating social media platforms less like neutral tools and more like products that must meet basic safety expectations.