Jury finds Meta and YouTube built ‘addiction machines’ that harmed a child, awards $3 million in landmark trial


A Los Angeles jury has found Meta and Google liable for intentionally building addictive social media platforms that harmed a young woman’s mental health, awarding her $3 million in compensatory damages in what is the first trial verdict of its kind in the United States. The decision, delivered on Wednesday after a five-week trial, found Meta responsible for 70 per cent of the plaintiff’s harm and YouTube for the remaining 30 per cent. Punitive damages, which could reach $30 million under California law, are still to be determined.

The plaintiff, identified as Kaley, is now 20 years old. She testified that she began using YouTube at age six and Instagram at age nine, encountering no age verification barriers on either platform. She described spending entire days on social media as a child, withdrawing from her family, and developing anxiety, depression, and body dysmorphia, a condition in which a person becomes obsessively preoccupied with perceived flaws in their appearance. She said she began using Instagram filters that altered her facial features almost as soon as she started using the platform.

The verdict landed one day after a separate jury in New Mexico ordered Meta to pay $375 million for violating state consumer protection law by failing to protect children from sexual predators on its platforms. Together, the back-to-back rulings represent the first time juries have held social media companies financially liable for the harms their products cause to young users.

What the jury decided

Kaley’s lawyers argued that features such as infinite scroll, autoplay, and algorithmically curated content feeds were deliberately designed to maximise engagement and that Meta’s internal growth targets explicitly sought to acquire young users because they were more likely to remain on the platform for longer periods. They presented testimony from former Meta executives and internal company research showing that Meta knew children under 13 were using its platforms and that its products were linked to negative mental health outcomes in teenagers.

When Mark Zuckerberg testified before the jury in February, he acknowledged the issue but said he “always wished” the company had made faster progress on identifying underage users. He maintained that Meta had reached “the right place over time.” Adam Mosseri, the head of Instagram, was presented with data showing Kaley’s longest single-day session on the platform lasted 16 hours. He declined to call it addiction, describing it instead as “problematic.”

Lawyers for Meta argued that while Kaley had experienced genuine mental health struggles, her use of Instagram did not cause or meaningfully contribute to them. Meta said it “respectfully disagrees with the verdict” and is evaluating its legal options. Google called the case a mischaracterisation of YouTube, describing it as “a responsibly built streaming platform, not a social media site,” and said it plans to appeal.

The cases behind this one

Kaley’s was the first of more than 1,500 similar cases consolidated in federal multidistrict litigation against Meta, Google, Snap, and TikTok. Both Snap and TikTok reached undisclosed settlements with Kaley before the trial began, leaving Meta and Google as the two defendants that chose to fight the case in court.

The New Mexico verdict, though legally separate, reinforced the same underlying claim: that Meta knew its platforms endangered children and chose not to act. That case originated from a 2023 undercover operation by New Mexico Attorney General Raúl Torrez, who created a fake profile of a 13-year-old girl and found it was quickly inundated with sexually explicit material and contact from predators. The jury found Meta liable on all counts, including for willfully engaging in unfair, deceptive, and unconscionable trade practices.

Another federal trial is scheduled for June in California, and hundreds of additional cases brought by school districts and state attorneys general are queued behind it. Legal commentators have compared the litigation wave to the tobacco industry lawsuits of the 1990s, which ultimately resulted in a $206 billion settlement and fundamentally reshaped how cigarettes were marketed and regulated in the United States.

What changes and what does not

The immediate financial impact on Meta and Google is minimal. A $3 million compensatory award, even if punitive damages push it toward $30 million, is trivial for companies with a combined market capitalisation exceeding $3 trillion. The $375 million New Mexico verdict is larger but still represents a fraction of a single quarter’s revenue for Meta.

The significance is precedential, not financial. The Los Angeles verdict establishes that a jury of ordinary citizens, presented with internal documents, expert testimony, and the companies’ own research, concluded that social media platforms were intentionally designed to be addictive and that this design caused measurable harm to a specific individual. That finding will be cited in every one of the 1,500 pending cases. It shifts the burden: Meta and Google now enter each subsequent trial not as defendants facing novel claims but as companies a jury has already found liable.

Mike Proulx, a research director at Forrester, described the verdicts as a “breaking point” between social media companies and the public. Whether they also mark a breaking point in how these platforms are built remains to be seen. The features Kaley’s lawyers identified as harmful (infinite scroll, autoplay, algorithmic feeds, and engagement-maximising recommendation systems) are not incidental design choices. They are the business model. Removing them would require Meta and Google to become fundamentally different companies, which is something no jury verdict, however large, can compel.
