News | 16.04.2026 | Social Networks

Can social media impact your health? Case against Meta and YouTube reignites debate

Researcher compares the impact of Big Tech to that of the tobacco industry and advocates for new regulatory framework to protect children and adolescents

[Image: two young women relaxing, one reading a book, the other smiling at a tablet | Unsplash] The business model of digital platforms, built on capturing attention, may come into direct conflict with principles of health promotion, experts say.

A Los Angeles jury found Meta—the parent company of Facebook, Instagram, and WhatsApp—and YouTube liable for damage to the mental health of a 20-year-old woman who alleged she developed severe disorders after years of heavy use of the platforms during her childhood.

The verdict ordered the payment of $6 million in damages: $3 million in compensatory damages and $3 million in punitive damages, with 70% of the liability assigned to Meta and 30% to YouTube, which is owned by Google.

Experts view this decision as a potential turning point for the technology industry, one that could pave the way for similar lawsuits. The case has mobilized both legal and technological communities, further fueling scientific debate over the limits of corporate liability in the health sector.

Understanding the case: Meta, YouTube, and adolescent mental health

What was at issue?

A Los Angeles jury reviewed a lawsuit filed by a 20-year-old woman who alleges she developed severe mental health disorders as a result of heavy use of Instagram and YouTube during her childhood.

What was the verdict?

The jurors found Meta and YouTube liable and ordered them to pay $6 million in damages. Meta will cover 70% of the amount, while YouTube will cover the remaining 30%.

Why is this case considered a precedent?

Decisions of this kind are rare and could pave the way for new lawsuits against platforms for psychological harm caused to young users.

What does the science say?

Researchers such as Ilona Kickbusch argue that the platforms’ business model, based on capturing attention, is structurally incompatible with the promotion of health, particularly among children and adolescents.

What historical comparison do experts draw?

The debate recalls the regulatory evolution of the tobacco industry, which operated for decades without facing effective accountability for the harm it caused to public health.

Social media as a public health risk

The verdict echoes arguments raised in an opinion piece published on April 7 in the journal BMJ by researcher Ilona Kickbusch, director of the Digital Transformations for Health Lab (DTH-Lab) at the University of Geneva, in Switzerland.

In the article, Kickbusch argues that the practices of major digital platforms constitute new health risks, particularly among young people, by influencing behavior, mental well-being, and consumption patterns.

According to the author, the issue calls for an update to regulatory and legal strategies, similar to what occurred decades ago with the tobacco industry.

“Seeing major tech corporations being held legally accountable for negative health impacts marks a new phase in the relationship between public health, regulation, and corporate power,” Kickbusch wrote in the BMJ.

An architecture designed to keep users connected

Kickbusch points out that, unlike other industries historically associated with health risks, digital platforms operate in an environment that remains largely unregulated, one in which negative impacts tend to be diffuse, cumulative, and difficult to measure. 

The business model of these companies, based on capturing attention and maximizing screen time, is, according to the researcher, at the core of the problem.

Mechanisms such as constant notifications, recommendation algorithms, autoplay videos, and intermittent reward systems are not neutral elements, but rather components of an architecture deliberately designed to keep users engaged. 

In the case of children and adolescents, this design can amplify preexisting vulnerabilities.

This analysis aligns with data that help quantify the extent of this exposure: adolescents spend, on average, about two and a half hours per day on social media, according to the World Happiness Report produced by the United Nations (UN).

The rise of judicial rulings such as the one in Los Angeles signals a broader paradigm shift.

Digital platforms have already become a permanent feature of the social fabric; however, their impact on public health is no longer treated as an inevitable side effect, but is increasingly recognized as a central issue of regulation, ethics, and corporate responsibility.

* This article may be republished online under the CC-BY-NC-ND Creative Commons license.
The text must not be edited and the author(s) and source (Science Arena) must be credited.
