Feb 21, 2023 - Technology

Justices weigh liability protections for online content

People wait in line to listen to oral arguments at the U.S. Supreme Court on Feb. 21. Photo: Drew Angerer/Getty Images

The Supreme Court heard arguments Tuesday in a landmark tech case, Gonzalez v. Google, grappling for the first time with whether to make big changes in a 1996 law protecting service providers and publishers from being sued over content their users post.

Why it matters: The court's decision could have a far-reaching impact on any website that hosts content created by third parties.

  • The court's ruling could also prompt Congress to act, particularly if the justices either decline to decide the case, saying it's up to Congress to set the scope of the law, or send it back to a lower court.

The big picture: Since its inception, Section 230 of the Communications Decency Act has largely shielded websites from liability for what third parties post — user reviews, restaurant recommendations, Facebook status updates and tweets.

  • The Gonzalez case marks the first time the law has been tested before the Supreme Court.

Background: The case stems from the 2015 ISIS terrorist attacks in Paris. The family of a victim sued Google-owned YouTube, wanting to hold the platform partly responsible for radicalizing ISIS members.

  • The plaintiffs argue that YouTube should not get Section 230 liability protection for its "recommendations" — the way it sorts and organizes videos to show users relevant results they may be interested in — saying that sorting inherently makes the platform part of creating the content.
  • Google's attorneys argued that sorting is what makes the internet useful and relevant to people and said there's no proof that YouTube was involved in any way in purposely pushing ISIS content to people. Websites "have to make choices" and results "have to be relevant," Google's attorney Lisa Blatt argued.

What they're saying: "This is a pre-algorithm statute," said Justice Elena Kagan, who added that "algorithms are endemic to the Internet."

  • Kagan also recognized the court may be ill-equipped to parse the complexities of technology law, saying: "These are not the nine greatest experts on the Internet," but also asking if Section 230's liability protection has gone too far.
  • Those supporting Google's side of the case argue that taking away Section 230 protections from algorithmically recommended content would create "economic dislocation," said Justice Brett Kavanaugh. "There are serious concerns ... we are not equipped to account for that."

Details: Over more than 2 1/2 hours of oral arguments, the justices appeared confused by the plaintiffs' arguments, suggested that the conflict might be better fixed by Congress, and debated how far liability protection should go.

  • Justice Clarence Thomas, who has expressed a desire to revisit Section 230, didn't hesitate to point out weaknesses in the plaintiffs' argument. He also floated scenarios of internet service providers becoming liable for defamatory content and asked how to distinguish between the "recommendation" of a piece of content and a full-blown "endorsement."

Between the lines: The justices repeatedly tried to draw a boundary beyond which Section 230 would no longer protect a site from sorting content in a certain way.

  • Both Blatt and the Justice Department lawyer arguing Tuesday said Section 230 does not protect illegal activity — for example, a site that violates discrimination law through the way it sorts content or asks questions of users.

The intrigue: Conservative Justice Neil Gorsuch brought up content generated by artificial intelligence in a "post-algorithm world," calling into question whether such material should have Section 230 protection.
