From Likes to Lawsuits: Social Media’s Legal Challenges

  • Market Insight 18 December 2024
  • North America, UK & Europe

  • Technology risk

Social media platforms have to date attracted over 5.22 billion users worldwide. Despite increasing concerns about the impact of social media on young people, underscored by cases like Molly Russell’s, children and adolescents continue to engage with these platforms extensively.

In November 2017, London teenager Molly Russell took her own life after viewing content about suicide, self-injury, and depression on Instagram and other social media platforms. At the inquest into her death in the UK, the coroner concluded that harmful online content, promoted by social media algorithms, contributed to the 14-year-old’s death.[2]

Molly’s tragic death has also been cited in litigation in the US, with social media companies facing allegations of personal injury, negligence, public nuisance, and violations of online privacy laws designed to protect children’s data.

With legislatures and judiciaries worldwide working to improve online safety for children, Australia has taken a bold step by banning social media for individuals under 16 years old, with potential fines of up to $32.5 million / £25.7 million for non-compliance by technology companies. Though Australia’s parliament has approved the law, the ban will not take effect for at least 12 months.[3] Concurrently, heightened regulatory scrutiny has resulted in substantial fines for social media companies in the US[4], with the UK poised to follow suit through the introduction of regulatory fines under the Online Safety Act, set to come into force next year.

US Litigation

On 6 October 2022, 28 separate actions against various defendants, including Meta Platforms, Inc., Instagram LLC, Snap, Inc., TikTok, Inc., ByteDance, Inc., YouTube LLC, Google LLC, and Alphabet Inc., were consolidated into a federal Multi-District Litigation (“MDL”) in the Northern District of California.[5] These actions allege that the defendants failed to warn underage users and their parents about the risks of addiction and the potential for serious injuries, including suicidal ideation, self-harm, eating disorders, anxiety, and depression. This conduct is alleged to have resulted in various emotional and physical harms, including death.

The MDL has since grown to encompass hundreds of actions brought by personal injury plaintiffs, school districts, local government entities, and state attorneys-general, all addressing the allegedly addictive designs of social media.

The claims can be categorised into three main groups:

i. Claims by state attorneys-general;

ii. Public nuisance claims; and

iii. Personal injury claims.

i. Claims by state attorneys-general

These claims, filed by multiple state attorneys-general, are directed solely at Meta. The allegations include that Meta falsely represents its platforms as safe for children and misrepresents the prevalence of harmful content. Meta is also accused of violating the Children’s Online Privacy Protection Act of 1998 (“COPPA”) by collecting personal information from children under 13 without parental consent, an issue the plaintiffs argue is exacerbated by the company’s ineffective age-gating protocols.

The plaintiffs also contend that the addictive design features of these platforms, particularly those targeting young users, such as infinite scroll and autoplay, cause serious harm by promoting compulsive use. They allege that mental health issues, including anxiety, depression, self-harm, and eating disorders, are linked to children’s use of the defendants’ platforms.

ii. Public nuisance claims

These claims, filed in December 2023 by school districts, counties, district attorneys, and municipalities, allege that the defendants (Meta, Snap, TikTok and Google) design their platforms to be addictive to children, particularly through automated recommendation systems. In their complaint, the plaintiffs argue that children are uniquely susceptible to harm from these platforms because their prefrontal cortex is still developing. Further, it is argued that the defendants’ actions unreasonably interfere with rights common to the general public, including the health, safety, peace, comfort, or convenience of the community, amounting to a public nuisance.

As with all three categories of claims in the MDL, the plaintiffs also allege that the defendants’ design choices have contributed to a children's mental health crisis.

iii. Personal injury claims

These claims, also filed in December 2023, were brought on behalf of children (and the representatives of their estates, in cases of death) who are alleged to have suffered personal injuries as a result of the defendants’ (Meta, Snap, TikTok and Google) products.

The allegations are largely similar to those in the public nuisance claims and those filed by state attorneys-general. The defendants are accused of deliberately designing their platforms to be addictive to children, thereby contributing to a children’s mental health crisis. In their complaint, the plaintiffs cite the death of Molly Russell in the UK and allege that, despite this tragic incident, Meta “did nothing to stop harm to its young users”.

Wider implications

With the first bellwether trials in the MDL set for October 2025, insurers have already begun pre-empting the issue of coverage by arguing that the underlying claims in the MDL do not allege damage “because of” bodily injury or personal and advertising injury[6]. Insurers have argued that, in respect of the public nuisance claims, the underlying plaintiffs seek damages for “general, non-derivative economic losses that are not based on allegations that plaintiffs treated (or paid for treatment of) injuries to any specific individual”.

We note that insurers successfully raised similar arguments when seeking to avoid defending underlying opioid litigation[7], perhaps indicating that insurers are well positioned even if liability is established against defendant social media companies in the MDL public nuisance claims. It seems unlikely, however, that insurers will have such success if liability is established against defendants in the personal injury claims.

In such cases, it will be interesting to see how a court allocates liability among defendants, especially when the allegations against them are not identical. For example, the plaintiffs in the personal injury claims allege that Snap and TikTok in particular encourage dangerous challenges that pose serious risks to physical safety, such as the “Blackout Challenge”, in which young people are encouraged to make themselves faint by holding their breath.

The evolving regulatory landscape should also be considered. In 2019, the Federal Trade Commission (“FTC”) fined Facebook $5 billion, the largest penalty ever imposed on a company for violating consumers’ privacy rights[8]. With the Online Safety Act set to come into force next year in the UK, insurers should also be aware that under section 9 of the Act, in-scope companies[9] will be required to carry out an illegal content risk assessment in respect of their platforms, in the form prescribed under Schedule 3 to the Act. Section 27 of the Act will further require the removal of illegal content, and under section 29, platforms that “are likely to be accessed by children” must also minimise the risk of content that is legal but harmful to children (as defined under section 60 of the Act). Failure to do so can result in fines of up to £18 million or 10% of worldwide revenue, whichever is greater.
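For illustration, the Act’s maximum fine reduces to a simple “greater of” calculation. The sketch below is a minimal example, assuming the percentage limb is assessed against annual worldwide revenue; the function name and revenue figures are hypothetical, chosen only to show both limbs of the cap.

```python
# Minimal sketch of the Online Safety Act fine cap described above:
# the greater of GBP 18 million or 10% of worldwide revenue.
# All revenue figures are hypothetical.

FIXED_CAP_GBP = 18_000_000   # fixed limb: GBP 18 million
REVENUE_SHARE = 0.10         # percentage limb: 10% of worldwide revenue

def max_osa_fine(worldwide_revenue_gbp: float) -> float:
    """Return the statutory maximum fine for a given annual revenue."""
    return max(FIXED_CAP_GBP, REVENUE_SHARE * worldwide_revenue_gbp)

# Smaller platform: 10% of GBP 50m is GBP 5m, so the GBP 18m floor applies.
print(max_osa_fine(50_000_000))       # 18000000

# Large platform: 10% of GBP 20bn is GBP 2bn, which exceeds the floor.
print(max_osa_fine(20_000_000_000))   # 2000000000.0
```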

Developing Technology

As technology advances, legal complexities increase. Beyond the MDL, there have been cases such as a lawsuit against Google, in which a woman claims her 14-year-old son took his own life after becoming addicted to a Large Language Model (“LLM”) chatbot developed by Character.AI. Google is alleged to have co-created this AI startup; its founders are former Google employees whom the plaintiff alleges were instrumental in developing Google’s AI technology. While Google disputes this claim, the broader implications are worth considering.

With the rise of LLM-based products like OpenAI’s GPT-4 and Microsoft’s Copilot, the scope for potential litigation could extend far beyond social media companies. Insurers will likely be most interested in whether courts find social media companies liable for personal injuries, as this could expand the scope of liability to other technology companies, including those developing AI, gaming platforms, and other digital services that engage young users.

Insurers should remain cognisant that successful personal injury claims against social media companies could establish new precedent for proving harm caused by digital platforms. This includes demonstrating the link between platform design and user harm, which could be applied in future cases involving other technologies. Insurers may need to develop new strategies to manage the heightened risk associated with digital platforms, ensuring they are prepared for the potential financial and legal impacts of these emerging technologies.

 

[2] Regulation 28 report to prevent future deaths (Molly Russell - Prevention of future deaths report - 2022-0315).

[5] In Re: Social Media Adolescent Addiction/Personal Injury Products Liability Litigation (4:22-md-03047).

[6] Hartford Casualty Insurance Co. et al. v. Instagram LLC et al., (N24C-11-010) (Delaware Superior Court).

[7] Ace American Insurance Co. et al. v. Rite Aid Corp. et al., (339, 2020) (Delaware Supreme Court).

[9] Companies operating platforms where: (i) people can create and share content or interact with each other; (ii) people can search other websites (including search services); or (iii) businesses publish or display adult content.


Additional authors:

Fayash Butt, Trainee Solicitor
