A jury has found that social media platforms Meta and YouTube were responsible for contributing to the mental health harm of a young woman who used their platforms as a child.

The case centred on claims that platform design features, such as infinite scrolling, autoplay, and constant notifications, encouraged addictive use. Lawyers argued these features played a key role in keeping young users engaged for extended periods, ultimately contributing to harmful outcomes.

Jurors concluded that both companies were negligent and failed to adequately warn users of potential risks, with Meta bearing the greater share of responsibility.

A Shift in Responsibility

Experts say the ruling represents a major turning point. Traditionally, responsibility for managing the risks of social media has largely fallen on individuals and parents. However, this decision recognises that platform design itself can contribute to harm.

New Zealand-based academics and specialists highlighted that the verdict acknowledges social media companies as active participants in shaping user behaviour, not just passive hosts of content.

Why This Case Matters

This case is widely seen as a test for thousands of similar lawsuits currently underway. Because the ruling focuses on product design rather than user-generated content, it may bypass traditional legal protections that have historically shielded tech companies from liability.

If upheld, the decision could open the door for more legal action and force major changes in how social media platforms are built, particularly in relation to features that encourage prolonged use.

Global Ripple Effects

The verdict has already sparked international debate. Governments and regulators are increasingly exploring stricter rules around social media use, especially for children.

In New Zealand, discussions are ongoing about restricting access for younger users, including proposals to limit or ban social media use for those under 16.

What Happens Next?

While both companies are expected to appeal the decision, the case signals a growing willingness to hold tech companies accountable, not just for what appears on their platforms, but for how those platforms are designed.

The ruling reframes the conversation: from individual responsibility to shared accountability between users, families, and the tech companies.
