Meta, Google await jury decision in crucial social media addiction trial
The outcome could open the floodgates for more litigation and enormous financial penalties
If they had fingernails, Meta and Google would be biting them. They’re awaiting a jury’s decision in a lawsuit that alleges they intentionally designed their social media products to be addictive and that consumers have suffered harm as a result.
The case is seen as a “bellwether” because its outcome could influence numerous similar cases around the country and, if the plaintiffs prevail, encourage additional lawsuits.
The lawsuit centers on a woman identified in court only as K.G.M. Now 20 years old, she claims she became “hooked” on social media after she began using Google’s YouTube at age 6 and Meta’s Instagram at 9. She said her compulsive use caused depression, anxiety, self-harm and body image issues.
The claim is not just that social media is harmful but that Google and Meta intentionally designed their products to be addictive, especially for children, through the use of:
infinite scroll
autoplay videos
push notifications
“likes” and algorithmic feeds
Lawyers argued that the companies had purposely “engineered addiction” into the programs. Evidence introduced in the case has included internal documents purporting to show that company executives were aware of the addictive effects.
Companies deny everything
Meta and Google executives have testified that there is no proof that their services were to blame for the problems, saying that mental health issues are complex and may stem from family, personal or environmental factors.
They argue that parental controls, age limits and moderation tools are built into the products, and that millions of users experience no such problems.
Stakes are high
Thousands of similar lawsuits have already been filed by families, schools and states. If the jury finds for the plaintiffs, it could open the floodgates for more adverse verdicts, potentially exposing the companies to enormous financial consequences and legal restrictions.
Some lawyers have compared the case to early litigation around tobacco’s addictive and harmful effects, which eventually resulted in a master settlement agreement of more than $200 billion in 1998.
VOICES: Social Media on Trial
Plaintiffs & Advocates: “Engineered addiction”
“Addicting the brains of children.” (NBC Los Angeles)
Trial lawyers argue platforms were deliberately designed to hook kids, comparing them to:
Casinos
Addictive drugs
Internal documents introduced at trial reportedly described Instagram as:
“like a drug… we’re basically pushers.” (NBC Los Angeles)
Broader lawsuits frame this as a systemic issue:
Companies knew kids were vulnerable
Engagement tools (likes, scroll, autoplay) were built to maximize time, not safety
Mental Health Experts: “Real risks, hard to prove”
Experts testifying in and around these cases say heavy use is linked to:
Anxiety
Depression
Body image issues
A recent global study found:
Algorithm-heavy platforms like Instagram are associated with lower well-being among young users (The Guardian)
👉 But the key caveat:
Proving direct causation (that platforms caused harm) remains scientifically difficult
Tech Companies (Meta & Google): “Not addiction”
Defense attorneys argue:
There is no consensus that social media is addictive
Usage varies widely — in this case, lawyers pointed to relatively modest daily use
In court, executives pushed back on the framing:
Instagram head Adam Mosseri suggested heavy use is more like:
“binging… like a Netflix show” (TechRadar)
Core defense message:
Platforms are tools, not causes
Personal, family, and mental health factors matter more
Regulators & Lawmakers: “Profit over safety”
State prosecutors and regulators argue companies:
Prioritized engagement and growth over child safety
Failed to act on known risks
In a parallel case, authorities allege platforms:
Misled users about risks
Enabled harmful content and exploitation
Put profits ahead of protections (AP News)
Critics & Whistleblowers: “They knew”
Former employees and internal research have fueled criticism that:
Companies were aware of harms to teens
Warnings were downplayed or ignored
Evidence presented in related trials includes:
Concerns about body image effects
Algorithmic amplification of harmful content
Weak enforcement of safety systems (The Guardian)
Industry Outliers: “Ban it for kids”
Some tech leaders are now breaking ranks:
Pinterest CEO Bill Ready called for:
A ban on social media for users under 16 (Reuters)
That’s a notable shift — from defending platforms to questioning whether kids should use them at all.
The Big Divide
Across all voices, the debate boils down to two competing narratives:
1. “Designed harm”
Platforms intentionally exploit psychology
Kids are uniquely vulnerable
Companies should be held liable
2. “Complex reality”
Mental health is multi-factor
Social media is one influence among many
Responsibility lies with users, families, and society
Why this matters
This isn’t just one case — it’s a test of a new legal theory:
👉 Can tech companies be held liable for product design that affects mental health?
That question is now in the hands of a California jury — and the answer could reshape the entire social media industry.