Lawmakers target law protecting Reddit, Google from content liability

Reddit co-founder and CEO Steve Huffman. Photo: Zach Gibson/Getty Images
Lawmakers mulling changes to the law that shields Facebook, Reddit and other online platforms from liability over user-generated content found some bipartisan common ground during a House Energy & Commerce joint subcommittee hearing Wednesday.
Why it matters: Technology companies say changing the law that protects online platforms — Section 230 of the Communications Decency Act — is an existential threat to their business models and the internet itself. But both Republicans and Democrats agreed the platforms are not doing enough to police themselves when it comes to removing harmful content from their sites.
Here are two main areas where the parties seemed to find some level of consensus on Wednesday.
1. Trade agreements: Republicans and Democrats both voiced concerns that the U.S. appears to be including language similar to Section 230 in trade agreements, with Energy & Commerce Committee ranking member Greg Walden (R-Ore.) offering particularly pointed criticism.
- “We’re getting blown off on this, and I’m tired of it,” Walden said after referencing a letter he and committee chairman Frank Pallone (D-N.J.) sent to U.S. Trade Representative Robert Lighthizer. “Clearly they’re not listening to our committee or us.”
2. Repeal is unlikely: Republicans and Democrats also agreed they don’t want to gut or repeal Section 230, but they warned Reddit CEO Steve Huffman and Google global head of intellectual property policy Katherine Oyama that companies need to step up their online content moderation efforts.
- “We all recognize that content moderation online is lacking in a number of ways and that we all need to address this issue better,” said Rep. Mike Doyle (D-Pa.), chairman of the communications subcommittee. “If not you, who are the platforms and experts in this technology, and you put that on our shoulders, you may see a law that you don’t like very much and that has a lot of unintended consequences for the internet.”
Reality check: Despite some agreement among lawmakers that there are problems with online content moderation, there is a wide range of approaches to addressing the issue, and immediate action is unlikely.
- One idea, from Boston University law professor Danielle Citron, would revise Section 230 so that its protection is conditioned on companies maintaining reasonable content moderation practices.
- But Corynne McSherry, legal director of the Electronic Frontier Foundation, called that reasonableness standard “terrifying” because it would have to be determined by courts. “That means, as a practical matter, especially for small businesses, a lot of litigation risk as courts try to figure out what’s reasonable,” she said.