‘Wave’ of litigation expected as schools fight social media companies
Districts are joining a complaint against Meta, Snapchat, TikTok and YouTube, but some doubt the firms can be blamed for teen mental health struggles.
K-12 Dive | By Kara Arundel | June 1, 2023
About 40 school districts across the country, with more joining, are suing social media companies over claims that their apps are addictive, damage students’ mental health, and strain schools and other government resources.
Many of these lawsuits, which were originally filed in a variety of court jurisdictions, were consolidated into one 281-page multidistrict litigation claim filed March 10 in the U.S. District Court for the Northern District of California. Plaintiffs in the case include school districts, individuals and local and state governments. In total, there are about 235 plaintiffs.
The product liability complaint seeks unspecified monetary damages, as well as injunctive relief ordering each defendant to remedy certain design features on its platforms and to warn youth and parents that its products are “addictive and pose a clear and present danger to unsuspecting minors.”
Attorneys representing plaintiff school districts said this master complaint allows districts to share legal resources for similar public nuisance claims against social media companies in an attempt to recoup money spent addressing the youth mental health crisis.
Individual district lawsuits describe actions taken by school systems to address student mental well-being, such as hiring more counselors, using universal screeners and providing lessons on resiliency building. In its lawsuit, California’s San Mateo County Board of Education also explains how it had to reallocate funding to pay staff to address bullying and fighting, hire more security staff, and investigate vandalism.
Schools are on the front lines of this crisis, said Lexi Hazam, an attorney with Lieff, Cabraser, Heimann & Bernstein and co-lead counsel for the plaintiffs’ consolidated complaint.
Districts “are often having to divert resources and time and effort from their educational mission in order to address the mental health crisis among their students,” said Hazam. Students’ mental health struggles are caused largely by social media design features that “deliberately set out to addict” youth, she said.
The design features, the multidistrict litigation said, “manipulate dopamine delivery to intensify use” and use “trophies” to reward extreme usage.
But major litigation like this is likely to take many years to resolve, according to legal experts. The lawsuit is in its early stages, and the court will soon consider motions to dismiss. If the case proceeds, it will move into the discovery phase, where opposing parties can request documents and information that may not already be available.
One legal expert said getting involved in the case may actually make school districts vulnerable to legal action by parents who cast blame on them for not doing more to support students’ mental well-being. The case also discounts the positive aspects of teens’ social media use, said Eric Goldman, law professor and co-director of the High Tech Law Institute at Santa Clara University School of Law.
“Here’s the reason why not every school district is going to sign up — first, because I think at least some school districts realize that social media may not be the problem. In fact, it may be part of the solution,” Goldman said.
The more likely reason districts won’t participate, Goldman said, is that joining would mean schools are “admitting to their parents that they aren’t doing a good job to manage the mental health needs of their student population.”
Reducing risks
The lawsuit — known as the Social Media Adolescent Addiction/Personal Injury Products Liability Litigation — was filed against Meta Platforms Inc., which operates Facebook and Instagram, as well as the companies behind Snapchat, TikTok and YouTube.
There’s no cost to school systems to join the litigation, since the plaintiffs’ law firms are working on contingency, meaning they’re paid only if they prevail, according to several plaintiffs’ attorneys.
Per the lawsuit, the social media platforms exploit children by having “an algorithmically-generated, endless feed to keep users scrolling.”
The result, the complaint said, is that youth are struggling with anxiety, depression, addiction, eating disorders, self-harm and suicide risk. Individual school district cases folded into this litigation also claim the social media companies’ platforms have contributed to school security threats and vandalism.
“Defendants’ choices have generated extraordinary corporate profits — and yielded immense tragedy,” the master complaint declares.
The lawsuit notes the widespread use of social media among teens and details troubling statistics showing increases in youth suicide risk, anxiety and persistent sadness.
In response to a request for an interview or statement, Meta Head of Safety Antigone Davis emailed, “We want to reassure every parent that we have their interests at heart in the work we’re doing to provide teens with safe, supportive experiences online.”
The other defendant companies did not respond to requests for interviews or statements.
Davis’ email said Meta has developed more than 30 tools to support teens and their families, including ones that verify age, allow parents to decide when and for how long their teens use Instagram, automatically set new Instagram accounts to private for those under 16, and send notifications encouraging teens to take regular breaks.
Meta has also invested in technology that finds and removes content related to suicide, self-injury or eating disorders before it is reported by users. On the company’s Safety Center webpage, it states that it has never allowed people to celebrate or promote self-harm or suicide. Meta also removes fictional depictions of suicide and self-harm, as well as content that shows methods or materials.
“We do, however, allow people to discuss suicide and self-injury because we want Facebook and Instagram to be places where people can share their experiences, raise awareness about these issues, and seek support from one another,” the webpage says.
Davis said, “These are complex issues, but we will continue working with parents, experts and regulators such as the state attorneys general to develop new tools, features and policies that meet the needs of teens and their families.”