Axios Pro Exclusive Content

Big Tech CEOs to testify on child exploitation

Jan 29, 2024

Illustration: Shoshana Gordon/Axios

Big Tech CEOs will testify about online child sexual exploitation this week as artificial intelligence intensifies the problem.

Driving the news: The Senate Judiciary Committee meets Wednesday at 10am ET for a hearing featuring Meta's Mark Zuckerberg, X's Linda Yaccarino and Snap's Evan Spiegel.

  • TikTok's Shou Chew and Discord's Jason Citron will also testify.

Why it matters: Bringing prominent CEOs to Capitol Hill draws major media attention and protesters. It also helps keep the momentum going on a tech issue that could easily fade away in a Congress primarily focused on the upcoming election and funding the government.

  • But without Senate Majority Leader Chuck Schumer bringing bills to the floor and the House getting in line, a tech CEO spectacle usually delivers little substance.
  • Wednesday will mark the eighth time Zuckerberg has testified before lawmakers since his first appearance in 2018.

AI is injecting a new level of urgency as the technology has made it easier to generate and spread child sexual abuse material (CSAM).

  • Discord, TikTok and Snap told Maria they are developing and building better detection models to address AI-generated CSAM.
  • The companies are part of Lantern, an effort launched in November to establish a procedure for cross-platform collaboration to stop CSAM.
  • Stopping the generation of CSAM on other platforms is key, companies said. OpenAI is part of Lantern.
  • The bigger issue, Snap said, is distinguishing AI-generated images from real ones that may involve harm to a child who might be in immediate need.

The committee late last year sent subpoenas to some social media platforms to get them to come to the Hill. But companies tell a different story.

  • According to some of the companies, the issue wasn't whether to participate in the hearing but who would testify — they argued that having trust and safety experts present could better inform the conversation.

State of play: The Kids Online Safety Act, which would require platforms to enable the strongest safety settings by default, will be a prominent talking point during the hearing.

  • Snap made waves last week when it came out as the first tech company to support it.
  • On the STOP CSAM Act, which would expand protections for child victims in federal court, Snap said certain language still needs to be tightened and is being workshopped by Judiciary Chair Dick Durbin's office.
  • Other bills at play include the EARN IT Act, which would remove immunity for online CSAM violations, and the Clean Slate for Kids Online Act, which would give every person the right to demand the deletion of personal information collected when they were 13 or younger.
  • COPPA 2.0 and the Cooper-Davis Act are also in the mix.
  • There's long been talk in the Senate of packaging various proposals together, but in the House, the focus is still on comprehensive privacy.

What they're saying: Snap's endorsement of KOSA is increasing pressure on the other platforms to sign on, but the timing is being met with skepticism from advocacy groups.

  • "It's no coincidence that Snap chose to endorse KOSA now, days before their CEO testifies in front of the Senate Judiciary Committee about their consistent failures to protect kids online," Issue One's Alix Fraser said.
  • Discord said it is not ready yet to engage on specific legislation.

Meanwhile, Meta late last week announced it would make messaging settings stricter for teens.

  • The Tech Oversight Project's Sacha Haworth said the Snap and Meta announcements are part of "a broader context — in advance of a congressional oversight hearing after years of harm to children and teens. Parents, young people and advocates are within their right to be more than skeptical of these companies."
  • Common Sense Media's Jim Steyer: "Through this announcement, Meta is yet again sending a message that the burden is on parents to monitor their kids' settings and messages, rather than making substantive changes to their product to protect children en masse."

The other side: Companies said they're making large investments in their trust and safety teams, implementing new tools to help teens navigate risky situations, and collaborating with the National Center for Missing & Exploited Children.
