In a first-of-its-kind lawsuit in the US, a jury has found Meta and YouTube responsible for mental health harm to a young woman.
Her lawyers linked the design features of their platforms, such as ‘infinite scroll’, to her social media addiction as a child.
Jurors have decided the companies were negligent and acted with malice – an outcome that could affect thousands of similar lawsuits against social media companies.
The SMC has gathered comments from NZ experts in digital communication, law, wellbeing, and psychology.
Dr Melissa Gould, Senior Lecturer – Critical Media Studies, AUT, comments:
“This decision marks a significant shift in how we understand young people’s relationship with technology, by recognising the responsibility of platforms for the harms social media can cause. Until now, responsibility for managing these harms has largely been placed on young people, and their parents. However, this judgement acknowledges that Meta and YouTube have designed addictive products. This shifts the focus away from individual behaviour, such as how young people use their phones, to the wider systems, organisations, and design choices behind these platforms. In doing so, it recognises that social media is not neutral, but intentionally built to shape how it is used.
“This decision also highlights that simple, one-size-fits-all solutions are unlikely to address the complexity of social media harms. Instead, it strengthens the case for a ‘safety by design’ approach, where platforms are required to consider the wellbeing of users in how their technologies are developed.”
Conflict of interest statement: “No conflict of interest.”
Dr Alex Beattie, Senior Lecturer in Media and Communication, Victoria University of Wellington, comments:
“This verdict is a landmark moment. For years, the harms of social media have been treated as a matter of individual responsibility: parents told to monitor, young people told to log off, and policymakers flirting with outright bans rather than changing how platforms are allowed to operate. This ruling flips that logic. It pushes us towards treating addictive platform design as a product safety issue, not a parenting failure.
“Platforms like Instagram and YouTube are engineered to keep us hooked. Their algorithms learn what captures our attention and feed it back to us: faster, louder, more emotionally charged. Features like infinite scroll, autoplay, and push notifications aren’t neutral; they are business decisions that maximise time-on-platform and, in doing so, can undermine young people’s autonomy over how they spend their time.
“By holding Meta and Google liable, the ruling sends a clear message: you cannot design for addiction, profit from it, and then blame users’ lack of self-control when things go wrong. Responsibility has to sit with those who build and profit from the system, not with individuals, and especially not with adolescents.
“This is a win for anyone who has ever found themselves doomscrolling at midnight or unable to stop swiping, even when they want to. It’s an early but important sign that the law is beginning to catch up with the realities of our attention-driven digital economy.”
Conflict of interest statement: “Dr Alex Beattie is a recipient of a Marsden Fast Start from the Royal Society of New Zealand in 2024 for a project called Saving Screen Time: How People with ADHD Disconnect from the Internet.”
Rachel Tan, Lecturer in Law (Cyber Law), University of Waikato, comments:
“This decision is significant because it moves beyond treating social media platforms as neutral hosts of user content, and instead recognises that platform design itself can create and amplify harm.
“Features such as infinite scroll, autoplay, and algorithmic recommendations are not passive. They are designed to maximise user attention, and in doing so can expose children to prolonged and sometimes harmful content in ways that are difficult to avoid.
“What this case highlights is that harm can arise not just from what children see, but from how long, how often, and how intensely they are exposed to it. Importantly, the jury’s finding focuses on system design as a substantial factor in harm. This reflects a broader legal shift, where attention is moving away from individual user behaviour and towards the responsibility of platforms for the environments they create.
“For regulators, this raises pressing questions about whether existing laws are equipped to deal with these risks, and whether stronger obligations, such as a duty of care or safer design requirements, are needed to better protect younger users.”
Conflict of interest statement: “I have no conflicts of interest to declare.”
Dr Cassandra Mudgway, Senior Lecturer Above the Bar, Faculty of Law, University of Canterbury, comments:
“This is a significant moment. A US jury has found Meta and YouTube liable not for content on their platforms, but for their design.
“The case sidesteps platform immunity by focusing on features such as infinite scroll, autoplay, and notification systems, i.e. tools alleged to drive compulsive use. A finding that these design choices can amount to negligence, and even ‘malice,’ opens a new legal pathway: treating platform architecture itself as harmful.
“For New Zealand, it is time for reflection. Our current framework, particularly the Harmful Digital Communications Act 2015, focuses heavily on content and individual acts of harm. It does not meaningfully address platform design or systemic risk. This case highlights that gap.
“It also strengthens the argument for a regulatory shift already emerging internationally, namely, away from reactive, complaint-based systems, and toward duties on platforms to proactively assess and mitigate risks, especially for young people. It is less about taking down harmful posts, and more about redesigning systems that purposefully amplify harm.
“However, caution is needed. This is a jury verdict in a specific legal context, not a binding precedent here. New Zealand courts would face different statutory settings, and proving causation between platform use and harm remains complex.
“Even so, the broader signal is clear enough, in my opinion. Overseas courts are increasingly willing to scrutinise the business models underpinning social media, not just the speech they host. For New Zealand, the question is whether regulation will keep pace or continue to treat harm as an individual problem, rather than a structural one.”
Conflict of interest statement: “I do not have any conflicts.”
Associate Professor Stephanié Rossouw, School of Social Sciences & Humanities, AUT, comments:
“This landmark verdict shows the law is starting to catch up with what the 2026 World Happiness Report’s new social-media work has been warning about: for many young people, heavier social media use is associated with lower well-being, including more depression, anxiety, stress and poorer body image, especially on visual platforms.
“The key lesson is that this is not about saying all online connection is harmful. The evidence is more nuanced: active, communicative use can be less harmful or sometimes beneficial, but heavy, passive or addictive use is where the risks to youth well-being become much sharper. It really is time for the New Zealand government to listen and act accordingly.”
Conflict of interest statement: “The World Happiness Report 2026 and its accompanying media release can be found at this link. Associate Professor Stephanié Rossouw was not involved in data collection or processing for the report, however she was a co-author of one of the chapters, Social media use and wellbeing in the Middle East and North Africa.”
Dr Ryan San Diego, Registered Psychologist, Registered Addictions Clinician, and Lecturer at the University of Auckland, comments:
“Why social media is like addiction: the current design and features of social media are equipped with reinforcements that trigger behavioural responses – these may be as simple as notifications and rapid, buzzing messages designed to capture the attention of kids and teenagers. Coupled with the desire to socialise, or to be part of the group, young people develop habitual checking behaviours because these reinforce a sense of belonging. To the extent that this results in distraction from the non-virtual world (engaging in sports, helping with household chores, socialising with family members, or focusing on their natural talents and abilities), the process itself can be considered habit forming – addictive.
“What is the impact when it becomes a habit? There are studies to support that significant prolonged virtual engagement of this kind may result in brain changes specific to teenagers and may have a devastating impact on the emotional, motivational, and relational aspects of their lives. This has significant impact not just on the individual, but also as a public health concern, with many teenagers now struggling after being exposed to toxic virtual environments, cyberbullying, racial discrimination, harassment, body image issues, a poor sense of self, exposure to misinformation, and other internalising problems such as anxiety and depression.
“How can we prevent it: the issue can be considered a modifiable problem; however, we need to consider effective harm reduction strategies such as the introduction of blocker apps, time-out apps, and online family memberships for interconnected technologies at home. There is also growing advocacy for timely disconnection from devices and for the promotion of critical conversation and awareness regarding human-technology interaction.”
Conflict of interest statement: “No conflict of interest.”
Professor Kirsty Ross, School of Psychology, Massey University, comments:
“From a psychological perspective, the term “addiction” in this case is being used in a behavioural rather than strictly clinical sense. It does not necessarily refer to a formally diagnosable disorder in the way substance dependence is defined in manuals like the DSM-5, but instead describes a pattern of compulsive engagement that persists despite harm.
“What is central here is not simply high use, but loss of control. An individual continues to engage with the platform beyond their intentions, finds it difficult to disengage, and experiences the activity as increasingly dominant in their daily life. Over time, this pattern can begin to show features commonly associated with behavioural addictions: a growing preoccupation with the activity, escalating use to achieve the same psychological effect, distress when unable to access it, and continued engagement despite clear negative consequences for mood, sleep, or functioning.
“In the context of this case, the psychological argument extends beyond the individual and considers the interaction between human vulnerability and platform design. Social media platforms are structured around reinforcement principles well established in behavioural science. Features such as infinite scrolling, algorithmically curated content, and intermittent social rewards—likes, comments, notifications—operate on what is known as an intermittent or variable reward schedule, which is particularly effective at sustaining behaviour. This is the same underlying mechanism that makes gambling systems difficult to disengage from for some people.
“Importantly, these processes and mechanisms are interacting with developmental factors. Adolescents, in particular, are more sensitive to social feedback, reward, and peer evaluation, while still developing the executive functioning and cognitive processes required to regulate behaviour. This creates a context in which repeated engagement is not simply a matter of choice, but is psychologically reinforced at multiple levels—neurological, emotional, and social.
“There is ongoing debate within psychology about whether this constitutes a true “addiction” in the clinical sense, or whether it is better understood as problematic or dysregulated use. Regardless, the functional outcome is the same: the behaviour becomes difficult to control and is associated with measurable harm.
“What is significant in this case is the shift in framing. Rather than locating the problem solely within the individual—lack of discipline or poor self-regulation—the focus moves toward the platforms themselves and how they are designed and constructed. The argument is that these systems are intentionally designed to maximise engagement by leveraging predictable aspects of human psychology. In that sense, the “addiction” is not simply about the user, but emerges from the relationship between the user and a highly optimised (and intentional) digital environment. The problem is the problem, rather than simply the individual being considered to ‘have’ the problem.
“This case points to “addiction” reflecting a pattern of compulsive, reinforced, and difficult-to-regulate behaviour, shaped by both individual vulnerability and platform design, and maintained despite negative psychological consequences. It highlights that we need to consider the sociocultural influences on individual behaviour and take a whole-of-problem approach, rather than an individualistic one.”
Conflict of interest statement: Not yet received.
