A jury has found that social media platforms Meta and YouTube contributed to the mental health harm of a young woman who used their platforms as a child.

The case centred on claims that platform design features, such as infinite scrolling, autoplay, and constant notifications, encouraged addictive use. Lawyers argued these features played a key role in keeping young users engaged for extended periods, ultimately contributing to harmful outcomes.

Jurors concluded that both companies were negligent and failed to adequately warn users of potential risks, with Meta bearing the greater share of responsibility.

A Shift in Responsibility

Experts say the ruling represents a major turning point. Traditionally, responsibility for managing the risks of social media has largely fallen on individuals and parents. However, this decision recognises that platform design itself can contribute to harm.

New Zealand-based academics and specialists highlighted that the verdict acknowledges social media companies as active participants in shaping user behaviour, not just passive hosts of content.

Why This Case Matters

This case is widely seen as a test for thousands of similar lawsuits currently underway. Because the ruling focuses on product design rather than user-generated content, it may bypass traditional legal protections that have historically shielded tech companies from liability.

If upheld, the decision could open the door for more legal action and force major changes in how social media platforms are built, particularly in relation to features that encourage prolonged use.

Global Ripple Effects

The verdict has already sparked international debate. Governments and regulators are increasingly exploring stricter rules around social media use, especially for children.

In New Zealand, discussions are ongoing about restricting access for younger users, including proposals to limit or ban social media use for those under 16.

What Happens Next?

While both companies are expected to appeal the decision, the case signals a growing willingness to hold tech companies accountable, not just for what appears on their platforms, but for how those platforms are designed.

The ruling reframes the conversation: from individual responsibility to shared accountability between users, families, and the tech companies.
