
Meta and YouTube Found Liable: What the $6M Verdict Means for Parents and Children in the UK

Davis Davis Caesar · Digital Law
4 min read · March 25, 2026

A US jury has found Meta and YouTube negligent in a landmark social media addiction trial, awarding $6 million in damages to a young woman on 25 March 2026 — the first time a court has formally held a tech platform liable for designing addictive products that harmed a child.

What the Verdict Said

The case was brought by K.G.M., now 20, who alleged that Instagram and YouTube were deliberately engineered to keep her — and millions of children like her — scrolling for hours each day. She started using the platforms as a young child and argued that the compulsive behaviour triggered by their design led to depression, anxiety, body dysmorphia, and a prolonged loss of self-worth.

The Los Angeles jury found both companies negligent in their platform design. Meta was assigned 70% of the responsibility; Google's YouTube bore the remaining 30%. Compensatory damages of $3 million were awarded alongside $3 million in punitive damages — totalling $6 million. The plaintiff's legal team had sought $1 billion in punitive damages. Both companies said they would appeal.

What the Internal Evidence Showed

The most damaging evidence came from Meta's own documents. According to trial testimony, internal memos quoted CEO Mark Zuckerberg discussing strategies to attract users as young as tweens. One note reportedly stated: "If we wanna win big with teens, we must bring them in as tweens."

Meta's own research, presented to the jury on 25 March 2026, showed that 11-year-olds were four times more likely to keep returning to Instagram compared with competing apps — despite the platform's stated 13+ age limit. Researchers within the company studied the addiction mechanics and used their findings to refine engagement features rather than introduce safeguards.

According to the UK's Information Commissioner's Office (ICO), platforms operating in the UK are already required to comply with the Children's Code (the Age Appropriate Design Code) — meaning tech companies must design services that protect children by default.

Why This Matters in the UK

Although the trial took place in California, the legal and social implications extend to British families. The UK has already passed the Online Safety Act, which places new duties on platforms to protect under-18s from harmful content and addictive design patterns. Ofcom began consulting on platform accountability rules in 2025, and enforcement is accelerating.

This verdict sends a clear signal: internal knowledge of harm, combined with deliberate design choices to increase engagement, can constitute negligence — even in the face of decades of legal immunity arguments.

For UK parents who believe their child has suffered measurable harm — depression, self-harm, disordered eating — as a direct result of social media design, this precedent is significant. The case establishes that damages are recoverable when a platform knew about risks and chose engagement over safety.

The Scale of the Litigation

The Los Angeles verdict is not a one-off. More than 350 families and 250 school districts have filed consolidated claims in US courts, covering similar allegations about Instagram, YouTube, TikTok, and Snapchat. Legal teams on both sides of the Atlantic are watching closely.

In England and Wales, group litigation orders allow multiple claimants with similar cases to proceed together. A specialist solicitor in digital law or consumer litigation can advise whether a family's circumstances — documented diagnosis, timestamps of platform use, medical records — meet the threshold needed to bring or join a claim.

What Should UK Parents Do Now?

If your child has been diagnosed with anxiety, depression, or an eating disorder, and you believe compulsive social media use was a contributing factor, consider these steps:

1. Document everything now. Medical records, school absence records, screenshots of usage statistics (available from device screen-time reports), and any mental health referrals should be preserved. Evidence degrades over time.

2. Seek a medical opinion. A GP or child psychiatrist can provide a formal diagnosis and clinical opinion on causation. This is the foundation of any legal claim.

3. Consult a solicitor. A lawyer specialising in digital law or personal injury can assess whether the UK's legal framework — including the Online Safety Act and existing negligence principles — offers a viable route to compensation. Many work on a no-win, no-fee basis.

4. Report to the regulator. Even without a claim, families can report platforms to the ICO if they appear to be breaching their Children's Code obligations, or to Ofcom for failures under the Online Safety Act. Regulatory pressure shapes how quickly platforms change.

5. Talk to your child. The verdict confirms what many families already know from lived experience. Acknowledging the harm — without blame — is the first step in recovery.

The Bigger Picture

Meta and YouTube are facing mounting legal exposure on both sides of the Atlantic. The $6 million awarded on 25 March 2026 is modest compared with the punitive damages sought, but the liability finding itself is the legal landmark. Punitive damages in future, larger cases — with more plaintiffs and more documented harm — could be orders of magnitude higher.

For now, this verdict marks a turning point: tech platforms can no longer claim they did not know. The internal emails, the research memos, the age-targeting strategies — all of it is now part of the public trial record.

This article covers a legal matter. It does not constitute legal advice. If you are considering legal action, consult a qualified solicitor.
