Supreme Court allows Reddit mods to anonymously defend Section 230

Over the past few days, dozens of tech companies have filed briefs in support of Google in a Supreme Court case that tests online platforms’ liability for recommending content. Obvious stakeholders like Meta and Twitter, alongside popular platforms like Craigslist, Etsy, Wikipedia, Roblox, and Tripadvisor, urged the court to uphold Section 230 immunity in the case or risk muddying the paths users rely on to connect with each other and discover information online.

Out of all these briefs, however, Reddit’s was perhaps the most persuasive. The platform argued on behalf of everyday Internet users, who it claims could be buried in “frivolous” lawsuits for frequenting Reddit if Section 230 is weakened by the court. Unlike platforms that hire content moderators, Reddit says the content it displays is “primarily driven by humans—not by centralized algorithms.” Because of this, Reddit’s brief paints a picture of trolls suing not major social media companies but the individuals who get no compensation for their work recommending content in communities. That legal threat extends both to volunteer content moderators, Reddit argued, and to more casual users who collect Reddit “karma” by upvoting and downvoting posts to help surface the most engaging content in their communities.

“Section 230 of the Communications Decency Act famously protects Internet platforms from liability, yet what’s missing from the discussion is that it crucially protects Internet users—everyday people—when they participate in moderation like removing unwanted content from their communities, or users upvoting and downvoting posts,” a Reddit spokesperson told Ars.

Reddit argues in the brief that such frivolous lawsuits have been lobbed against Reddit users and the company in the past, and that Section 230 protections have consistently allowed Reddit users to “quickly and inexpensively” avoid litigation.

The Google case was brought by the family of Nohemi Gonzalez, a woman killed in a Paris bistro during a 2015 ISIS terrorist attack. Because ISIS allegedly relied on YouTube to recruit before this attack, the family sued to hold Google liable for allegedly aiding and abetting terrorists.

A Google spokesperson linked Ars to a statement saying, “A decision undermining Section 230 would make websites either remove potentially controversial material or shut their eyes to objectionable content to avoid knowledge of it. You would be left with a forced choice between overly curated mainstream sites or fringe sites flooded with objectionable content.”

Eric Schnapper, a lawyer representing the Gonzalez family, told Ars that the question before the Supreme Court “only applies to companies, like Reddit itself, not to individuals. This decision would not change anything with regard to moderators.”

“The issue of recommendations arises in this case because the complaint alleges the defendants were recommending ISIS terrorist recruiting videos, which under certain circumstances could give rise to liability under the Anti-Terrorism Act,” Schnapper told Ars, noting that the question of that liability is the subject of another SCOTUS case involving Twitter, Meta, and Google.
