Lawyers for social media companies will be working overtime in the coming weeks as several major trials get underway over the potential harms popular sites and apps pose to children.
At the same time, efforts to deflect at least one major future case have fallen short, increasing pressure on tech giants to agree to an independent assessment of how they protect teen users. Together, these developments create a potential perfect storm for the industry, one that could result in both financial damages and changes to the algorithms that encourage users to keep scrolling for ever longer stretches.
Much of the focus is on a bellwether trial in Los Angeles that seeks to hold Meta and Google responsible for harms suffered by children who use their products. Plaintiffs allege that services like Instagram and YouTube are deliberately designed to keep users, especially kids, engaged for as long as possible. Opening statements were held Monday, with the plaintiffs’ lawyer arguing that Meta and Google have “engineered addiction in children’s brains.” The case is widely seen as a test for the roughly 1,500 pending lawsuits that make similar claims.
Meta and Google deny the allegations. TikTok and Snap were also named as defendants but settled before the case went to trial.
As that suit began in Los Angeles, opening statements were also heard in Santa Fe in a case brought against Meta by New Mexico Attorney General Raul Torrez in December 2023. The lawsuit accuses the company’s platforms of serving as a breeding ground for sexual predators, a claim Meta denies.
That trial, expected to last seven weeks, will determine whether Meta violated the state’s consumer protection laws. “If we can win in this action and force them to make their product safer in this state, it changes the narrative completely about what they say is possible for everyone else,” Torrez said.
Meanwhile, a judge in the U.S. District Court for the Northern District of California denied a motion for summary judgment filed by Meta, Google, Snap, and TikTok in a case brought by Kentucky’s Breathitt County School District. That case is part of a consolidated multidistrict litigation seeking to hold social media companies accountable for engineering addictive features that harm student mental health.
Section 230
At the heart of all these cases is how far courts are willing to extend the protections of Section 230, the federal law that shields social media companies from liability for content posted by their users. Plaintiffs in the Los Angeles trial, like those in the upcoming Northern California case, argue that jurors should be allowed to consider whether the companies’ algorithms, and not just the content shown on users’ screens, are responsible for mental health harms.
Perhaps as a preemptive measure, TikTok, Snap, and Meta have agreed to undergo a series of tests overseen by the National Council for Suicide Prevention to evaluate how effectively they protect the mental health of teen users.
Among the issues to be examined are whether the platforms force users to take breaks and whether they offer a way to turn off endless scrolling. Companies that perform well will receive a badge signaling that they offer a pathway to mental health support.
Potential ramifications
This is hardly the first time social media companies have been taken to court over mental health claims, but to date none of those cases has resulted in a major overhaul of how the platforms operate. Efforts in Washington and in state governments to regulate the industry have likewise fallen short. Further complicating matters, the scientific community has yet to reach a consensus on whether social media is, on the whole, harmful to teens and kids.
Still, wins for the plaintiffs in these cases could force companies to change how people interact with their platforms, potentially reshaping the social media landscape, and could expose the companies to significant liability payouts for harms linked to their services.