Smaller firms fear limiting net liability law will hobble them

- Ashley Gold, author of Axios Pro: Tech Policy

A looming U.S. Supreme Court decision on a key internet liability law could upend the business landscape — not just for giant platforms but for small and mid-size firms that fear a massive upheaval.
Driving the news: Later this month the Supreme Court will hear oral arguments in Gonzalez v. Google, a case with major implications for how tech platforms host and promote content posted by users.
- The case centers on Section 230 of the Communications Decency Act, the 1996 law that says companies aren't liable for content their users post. The court will decide whether the law covers choices firms make about selecting and ranking content by algorithm, including suggestions of related videos, pictures, job listings and recipes.
Why it matters: Google's name is on this case, but other companies are worried an adverse ruling for Big Tech could be ruinous for smaller and medium-sized firms.
Background: In Gonzalez v. Google, relatives of victims of an ISIS attack are suing Google-owned YouTube for allegedly helping turn viewers into terrorists by recommending pro-ISIS content. Justice Clarence Thomas has suggested the court place new limits on the reach of Section 230.
- The plaintiffs argue that Section 230's protection does not extend to algorithmically created recommendations, since YouTube plays a role in deciding which videos to recommend to users.
- Google contends Section 230 protects YouTube's methods of organizing users' posts, and weakening the law would only make it harder to filter out terrorism content.
What they're saying: "Any site that hosts user-generated content should be scared," Josh Ackil, executive director of Internet Works, a coalition of small- and medium-sized tech companies, told Axios.
- Weakening Section 230, Ackil said, would push companies to take down more content and unleash a flurry of lawsuits. "Any increased activity in that space... is an undue burden on a smaller company, and something bigger companies will be better equipped to manage," he said.
What's at stake, according to tech firms that aren't as big or rich as Google, Meta and their peers:
- Pinterest: The platform is able to serve up relevant posts to users thanks to Section 230 protection, Braden Cox, head of U.S. public policy, told Axios. "If algorithmically presenting content to users could expose services to potential liability for unlawful content, they naturally would refrain from using those systems to present relevant content, depriving users of the positive experiences we all want and need."
- Indeed: "What you'll see from a lot of companies is an overextension of moderation, likely, because it turns into a proposition of risk tolerance," Matt Jensen, senior director of government relations at the job-listing site, told Axios.
- Patreon: The site for creators to monetize their online followings doesn't rely on algorithmic curation, but algorithms that make creators popular elsewhere on the internet, such as those on YouTube or TikTok, lead them to Patreon, Colin Sullivan, the company's vice president of legal, told Axios. "It would be an incredibly bad result if Section 230 was rolled back... especially for creators," he said.
- Wikipedia: The online encyclopedia argues Section 230 protection of algorithms lets it organize and arrange pages, link out to related articles, protect itself from bad-faith lawsuits, and root out spam, per a blog post from the Wikimedia Foundation.
- Yelp: "Recommending content is a core form of publication under Section 230, and necessary to provide consumers with useful, trustworthy reviews," the company's lawyers write in a brief. "Without immunity, deceptive reviews would flourish and consumers would be harmed."
- Reddit: Per the company's brief: "A sweeping ruling narrowing Section 230 protections risks devastating the internet. It is smaller and startup platforms especially that depend on Section 230 to foster diverse approaches to content moderation and to challenge the dominant industry leaders."
The other side: Proponents of Section 230 revision argue that the law has either let tech firms get away with hosting too much dangerous content or given them too much latitude to take down material they deem objectionable.
- Section 230 has historically been applied far too broadly, argues a brief filed with the court by University of Miami law professor Mary Anne Franks. While Google should not be held liable for the claims made by the plaintiffs in this case, Franks says, "the text, history, and structure of Section 230 do not support the broad unconditional-immunity approach taken by many lower courts."
What's next: Tech firms are making the rounds on Capitol Hill to urge lawmakers to clarify the intent of Section 230 if the Supreme Court weakens it, a number of companies told Axios.
- Oral arguments at the Supreme Court for Gonzalez v. Google are scheduled for February 21.