Oct 6, 2022 - Technology

Supreme Court's liability case could scramble the online world


Illustration: Brendan Lynch/Axios

The Supreme Court's Monday announcement that it would rule on a pair of challenges to a foundational law governing online speech set off internet experts' earthquake alerts.

Why it matters: A decision by the court to alter or strike down the law, Section 230 of the 1996 Communications Decency Act, would rock the legal landscape for every company or organization whose work involves contributions from users — not just social networks but online marketplaces, review sites, neighborhood groups and more.

Driving the news: The Supreme Court's current term, which began Monday, will include two cases that mark the court's first broad review of social media companies' immunity from lawsuits over moderation practices and content posted by users.

  • In the case of Gonzalez v. Google, the court will look at whether the law protects internet platforms when algorithms target users with recommended content.
  • Twitter, Inc. v. Taamneh seeks a ruling on whether platforms can be held to violate anti-terrorism laws if they have policies against pro-terrorist content but fail to remove all such messages.

The big picture: Congress has considered many laws modifying Section 230, but it has only passed one narrow 2018 carveout intended to curb online sex trafficking.

  • Critics of the law now hope the court's right-wing supermajority will accomplish what the legislature has not.

Yes, but: The hunger to punish platforms for "censorship" could end up harming free speech online.

  • "If the Court were to substantially narrow Section 230 in a way that made online services potentially liable for third-party content, then that might result in significantly less ability for people to speak freely online," Samir Jain, director of policy at the Center for Democracy & Technology, told Axios.
  • "Section 230 really has been a critical piece of the flowing of free speech on the Internet, particularly through social media and other online services," said Jain, who was one of the litigators in a 1997 case, Zeran v. America Online, Inc., which found Section 230 gave Internet providers broad immunity from lawsuits.
  • "The court could easily take this, and then rule in ways that affect big questions not actually raised by the case," Daphne Keller, platform regulation director at the Stanford Cyber Policy Center, told Axios. "It could mean news feeds get purged of anything that creates fear of legal risk, so they become super sanitized."

The intrigue: The idea that Section 230 needs to be revisited or scrapped altogether for the digital age has supporters on both sides of the aisle.

  • Democrats tend to believe platforms should do more to limit the proliferation of dangerous content online. Republicans believe they are unfairly censoring conservative speech online.
  • President Joe Biden has floated amending the law, but hasn't made it a priority.
  • Justice Clarence Thomas said in 2020 that "in an appropriate case, we should consider whether the text of this increasingly important statute aligns with the current state of immunity enjoyed by Internet platforms."
  • "It's time that social media platforms are held accountable for the content it recommends to kids and families," tweeted Jim Steyer, CEO of Common Sense Media, who's been pressing the Biden administration to reel in Big Tech. "Until SCOTUS rules on the scope of Section 230, Big Tech will continue to act with impunity."

The catch: Changing Section 230 has bipartisan support, but any change carries significant risks.

  • Experts say that a court decision finding that platforms can be sued for how they filter, amplify or make content recommendations could result in varying outcomes, with some companies leaving more harmful content up while others take down more material than necessary.
  • “Allowing suits to go forward for the behavior challenged in this case wouldn’t automatically make the platforms liable," Matt Wood, president and general counsel of Free Press, said in a press statement. "It would merely allow plaintiffs to proceed on the difficult path of proving in court that a platform knowingly provided substantial assistance to a terrorist organization."
  • But the prospect of suits could have a "chilling effect," he said. Such effects would land even harder on smaller businesses that lack extensive legal resources and budgets, per an analyst note from New Street Research.

Between the lines: The Gonzalez case specifically asks whether federal law protects a platform when its algorithm targets a user with recommended content, with Gonzalez alleging that Google aided ISIS recruitment via YouTube.

  • Whatever the court rules, the implications will go far beyond recommendation algorithms.
  • "You can't parse out amplification from other trust and safety operations," Matthew Schruers, president of the Computer & Communications Industry Association, which represents Big Tech companies, told Axios.
  • Neither Twitter nor Google would comment on the cases.

What's next: The Supreme Court will hear arguments this term, with a decision likely by next summer. 
