In March 2026, a jury in a Los Angeles courtroom found Meta and YouTube liable and awarded damages in a case involving a teenage user’s depression. It was the first time a jury had heard the evidence and arguments and returned a verdict on whether the design of a social media platform contributed to a child’s mental health decline.
It was a bellwether trial, deliberately chosen to test the legal theories and plaintiff claims ahead of the much larger wave of cases behind it. That wave is substantial: as of May 2026, thousands of lawsuits were pending in federal multidistrict litigation. The bellwether was meant to signal how those cases might go. Meta did not like the answer it got.
| Category | Details |
|---|---|
| March 2026 LA Verdict | Los Angeles jury found Meta and YouTube negligent in a bellwether trial — awarded damages for a user’s depression; Meta responsible for the majority |
| New Mexico Judgment | Separate March 2026 verdict — Meta ordered to pay damages for failing to protect users from sexual exploitation; being appealed |
| Early Settlements | Snap Inc. and TikTok settled with plaintiffs before trials began in early 2026 |
| Legal Theory Shift | Plaintiffs arguing “design defects” — infinite scroll, algorithmic recommendations — as intentionally addictive features, not just harmful content |
| School District Claims | Hundreds of school districts joined federal MDL — arguing social media created an “unprecedented mental health crisis” forcing resource reallocation to counseling services |
| Key Internal Evidence | “Project Myst” — internal Meta research cited by plaintiffs as evidence the company knew about harm to young users and did not act |
| Defense Position | Tech companies cite Section 230 protections and safety feature development; argue responsibility lies primarily with parents, not platforms |
| Upcoming Trial | Federal bellwether trial for school district claims — June 2026; Breathitt County and other districts as lead cases |
| Scale of Litigation | Thousands of cases pending in federal court multidistrict litigation as of May 2026 |
| Further Context | U.S. Surgeon General’s Social Media Advisory, covering youth mental health and technology policy |
The legal strategy underlying these cases has shifted significantly. Early social media lawsuits tended to target specific harmful content (the post that promoted self-harm, the group that enabled abuse) and ran into Section 230 of the Communications Decency Act, which shields platforms from liability for user-generated material. That defense succeeded in many early cases and stalled the litigation for years. Plaintiffs’ attorneys regrouped.
The current wave rests on a different argument: the platforms did not merely host bad content, they deliberately built addictive features (infinite scroll, algorithmic recommendation loops, notification systems engineered to pull users back repeatedly) that amount to defective product design. That argument does not depend on any particular user’s post. It targets decisions made in engineering meetings and product reviews: choices about how to keep users engaged longer than is good for them.
The internal-document angle gives plaintiffs evidence that the companies knew about the harm. Meta’s “Project Myst” research, widely cited in filings, allegedly showed that the company’s own data scientists had identified harmful effects of Instagram use on teenage girls’ mental health.
Yet the research did not lead to meaningful product changes. When a company’s internal studies identify a problem and the business proceeds without substantially addressing it, the legal and moral posture shifts: it becomes much harder to argue the harm was unforeseeable. Meta is appealing the March verdict on several grounds, including Section 230, an argument that is harder to win in this posture but not an impossible one.
School districts represent a distinct front in the litigation. Hundreds have joined the federal MDL, claiming that social media platforms caused a mental health crisis visible daily in school counselors’ offices, citing rising demand for specialized services and money shifted from academic programming to crisis response.

Breathitt County is one of the lead cases in a federal bellwether trial for the school district claims, scheduled for June 2026. If districts can establish that platforms owe them damages for the institutional costs of managing a teenage mental health crisis the platforms helped create, the financial exposure for Meta and the others extends far beyond individual injury claims.
It is hard to ignore that Snap and TikTok settled before trial, which is what companies typically do when the risk of a jury verdict outweighs the cost of settling. Google and Meta chose to fight in Los Angeles.
They lost the first round, and the second major trial is already on the calendar. As this plays out in courtrooms through 2026, there is a sense that something has shifted in how these companies are held accountable: nothing permanent or definitive yet, but moving in a direction that appeals will likely soften without fully reversing.