Seattle’s public school system is unhappy.
The district, tasked with teaching approximately 50,000 students across 100 schools, is confronting what it calls a “mental health crisis.” And it has placed blame for this crisis on four social media companies: Meta, the parent company of Facebook, Instagram, and WhatsApp; ByteDance, the parent company of TikTok; Alphabet, the parent company of YouTube; and Snap, the company behind Snapchat.
So the school district filed a lawsuit (pdf) against the four firms — which are among the most valuable companies in the world with a combined market capitalization of about $2 trillion — for wreaking havoc on its student population.
“From 2009 to 2019, there was an on-average 30 percent increase in the number of students at plaintiff’s schools who reported feeling ‘so sad or hopeless’ almost every day for two weeks or more in a row that [they] stopped doing some usual activities,” lawyers for the school district wrote in a 92-page complaint, filed in the federal district court in Seattle on Jan. 6.
Seattle Public Schools alleges that the defendants violated Washington State’s public nuisance law, a vague statute more commonly invoked for noise complaints than for corporate malfeasance.
The lawsuit is a head-scratcher. Legal experts told Quartz it rests on a weak causal link between the actions of social media companies and the degraded mental health of the district’s students, and on a claim of harm that would be better brought by affected students or parents than by a school district. The suit also makes a legal argument about Section 230 of the Communications Decency Act, an important but controversial statute that gives websites some legal liability protection for user-generated content. But that argument could soon be made irrelevant by the US Supreme Court, which is tackling the question of Section 230 in its current term.
Seattle Public Schools says the design choices and algorithms that power the named social media platforms are responsible for the suffering of its students.
Social media use does have well-documented harms. In 2021, a Wall Street Journal investigation, spurred by a whistleblower, found that Meta’s own internal research showed Instagram negatively affected teenage girls’ body image. But in a legal context, experts say it’s difficult to isolate the cause of mental health problems in individual children, let alone the collective thousands who attend a major metropolitan school district. And, of course, social media might harm some students while helping others find community and friends online.
“There’s no doubt that teens are under extraordinary psychological pressure, perhaps unprecedented pressure, but there’s so many reasons why that might be the case,” said Eric Goldman, a professor at Santa Clara University School of Law, who stressed the drastic effects of the pandemic. “It’s hard to isolate what’s causing the stress among Gen Z … because it’s all interrelated.”
Jennifer Granick, an attorney at the American Civil Liberties Union, said in an interview that she was unsure how the school district can prove harm to its entire student body, justify that it’s the right plaintiff, and isolate the causes of the mental health crisis to social media.
“How are they going to show causation — that that crisis is due to social media and not due to any one of the millions of other causes, like stress and poverty and the pandemic?” she asked.
Goldman also took issue with the school district bringing the case. “I am puzzled by the idea that the school district is the right plaintiff,” he said. “Think about all of the things going on in society that are manifesting in the school environment. Can school districts sue for all of those other problems?”
Social media is the latest in a long list of bogeymen that parents and officials have blamed over the years for corrupting the country’s youth. It’s a list that has included heavy metal, video games, and marijuana.
“Can the school districts sue the record labels that publish heavy metal or the game publishers who publish video games or the pot dealers who sell pot?” Goldman asked. “Where does that start and stop? It just makes no sense.”
The lawsuit brings only one claim: a violation of the state’s public nuisance law. But Goldman said that public nuisance laws typically apply to physical spaces, not digital ones.
“It’s an unusual claim,” he said. “If there’s a drug dealer setting up shop across the street, that would be the kind of thing that might be a public nuisance. Calling software that’s not connected to the school in any way a public nuisance is just weird.”
A spokesperson for Seattle Public Schools did not respond to a request for comment.
The most salient legal claim in the lawsuit is that Section 230 of the Communications Decency Act does not protect the four defendants from legal liability for the design decisions that control their respective algorithms.
Section 230 has become a lightning rod for social media criticism in recent years. The statute, passed into law in 1996, essentially shields the owners of websites that host third-party content from users—such as web forums, comment sections, and social media—from certain liability.
Scholars widely credit Section 230 as intrinsic to the growth and maturation of the modern web: It’s given website owners breathing room to exercise their legal right, protected by the First Amendment, to moderate content as they see fit. And it’s propped up Silicon Valley’s largest companies who depend on user-generated posts, photos, videos, comments, reviews, recommendations, and much more, to grow.
But tech’s critics have searched far and wide for a way to rein in social media companies that they feel have grown too powerful, and are abusing that power. American politicians as ideologically varied as Republican senator Josh Hawley and Democratic senator Amy Klobuchar have introduced bills to amend Section 230. This week, president Joe Biden wrote in the Wall Street Journal that Congress must “fundamentally reform Section 230 of the Communications Decency Act, which protects tech companies from legal responsibility for content posted on their sites.”
In its lawsuit, Seattle Public Schools claims that Meta, Alphabet, Snap, and ByteDance cannot hide behind Section 230 for what their algorithms have chosen to promote, effectively asking a federal court to weigh in on liability for “recommending and promoting harmful content to youth,” as the complaint puts it.
That question is worth asking, but there’s one caveat: it’s already an issue before the Supreme Court. (In a troubling sign for the law, justice Clarence Thomas has previously signaled that he thinks Section 230 needs to be re-evaluated.) Two cases related to Section 230, Gonzalez v. Google and Twitter v. Taamneh, have put the statute’s protections front and center before the Court’s justices this term.
The two cases largely assess whether Section 230 protects Google-owned YouTube as well as Twitter from legal liability under US anti-terrorism laws, with a focus on the role their algorithms play in recommending ISIS recruiting content to users.
Granick said that while algorithms can be controversial, they are inseparable from how social media companies exercise their legal right to moderate content. “An algorithm is just an automated way of putting policies in place,” she said. “That’s how they take down hate speech. That’s how they take down misinformation. That’s how they take down harassment.”
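To make Granick’s point concrete, here is a minimal, hypothetical sketch of a written policy turned into an automated rule. The policy categories, keyword lists, and the moderate_post function are illustrative assumptions, not anything drawn from the defendants’ actual systems, which rely on far more sophisticated classifiers and human review.

```python
# Minimal, hypothetical sketch of "an algorithm as an automated way of
# putting policies in place." All names and keywords below are made up;
# the point is only that a written policy becomes a rule applied
# automatically to every post, with no human in the loop.

# Each policy maps to a crude keyword filter standing in for a real classifier.
POLICIES = {
    "hate_speech": {"slur_example_1", "slur_example_2"},
    "misinformation": {"miracle cure", "vaccines cause"},
    "harassment": {"kill yourself", "doxx"},
}

def moderate_post(text: str) -> list[str]:
    """Return the list of policies a post violates, if any."""
    lowered = text.lower()
    return [
        policy
        for policy, phrases in POLICIES.items()
        if any(phrase in lowered for phrase in phrases)
    ]

# A post that trips the (made-up) misinformation rule gets flagged automatically.
print(moderate_post("This miracle cure works better than medicine"))
# ['misinformation']
```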