A Jury Just Found Meta and YouTube Liable for Teen Social Media Addiction. Here’s What That Means.
This week, a Los Angeles jury made history.
On March 25, 2026, jurors found Meta (Instagram) and Google (YouTube) liable for deliberately designing their platforms to be addictive — and for knowingly failing to protect the youngest people using them. The jury awarded more than $6 million in damages to Kaley, a 20-year-old from Chico, California, who started using YouTube at age 6 and Instagram at age 11.
Meta was assigned 70% of the responsibility. Google, 30%.
This is the first time a jury has decided that tech companies bear legal liability for the harms teens experience after sustained, compulsive use of their platforms. And it will not be the last. Over 2,000 similar lawsuits are currently pending.
What the Jury Actually Found
This case was not about whether social media can be harmful. That’s been debated for years. This verdict established something more specific: that Instagram and YouTube were deliberately engineered to maximize engagement, that the companies’ executives knew this caused harm to young users, and that they chose not to change it.
That distinction matters. It shifts the conversation from “is social media bad?” to “did these companies make a choice that harmed children, and should they be held accountable for it?”
The jury said yes — twice.
Why This Verdict Matters Beyond the Courtroom
The $6 million figure is symbolic compared to Meta and Google’s revenues. What is not symbolic is the signal this sends.
For years, both companies have operated under a legal shield: Section 230 of the Communications Decency Act, which broadly protects online platforms from liability for third-party content. This case found a path around that shield by focusing not on what content teens saw, but on how the products were designed — specifically, on features built to keep users scrolling longer, regardless of the cost to their mental health.
That opens a door. With 2,000+ cases still in the pipeline, and juries now willing to hold these companies accountable, the legal and regulatory environment around teen social media safety has shifted in a real and lasting way.
The Pattern Behind the Verdict
Kaley’s story is not unusual. It is, in fact, the norm.
Adolescents who use social media more than three hours a day are twice as likely to report poor mental health, including depression and anxiety, according to a 2023 U.S. Surgeon General advisory. The 2026 World Happiness Report concluded directly that “social media is not safe for adolescents.”
And the numbers on teen mental health have moved in one direction — downward — since the early 2010s, when smartphone adoption among teens accelerated.
Platforms respond to this data by pointing to their safety features: content filters, age restrictions, parental controls. But Meta’s own internal research has found that these controls have “minimal impact” without active teen buy-in. You cannot build trust through restriction.
What Courts Can’t Fix
Verdicts like this one matter. But courts move slowly. Cases take years. And in the time between the filing of a lawsuit and a jury decision, millions of teenagers log another hour on TikTok, another session on Instagram Reels, another late night on YouTube.
Legislation is moving too. The UK government this week announced a six-week pilot testing social media bans, time limits, and curfews with 300 teenagers. Australia has already passed a social media ban for under-16s. Regulators on multiple continents are paying attention.
But bans create a different problem, one that shows up in study after study on this topic: teens bypass them. Within days. Not because they're defiant. Because the bypass is easy, the platforms are compelling, and no one has given them a better alternative.
The Real Question
The verdict against Meta and YouTube is, at its core, a statement about design choices. Those companies chose to build products that prioritized engagement over wellbeing. A jury decided that choice had a cost.
The question for everyone — parents, educators, policymakers, and product builders — is what a different design choice looks like.
At Xaidus, we think it starts with putting teens in control of their own experience. Not locking them out. Not watching them. Giving them agency over what they consume, showing their parents a picture of progress rather than surveillance, and building in the kind of trust that doesn’t get bypassed.
The platforms were designed to take control away from users. We’re building the tool that gives it back.