Senators target AI-generated child sexual abuse content

Jul 26, 2023
Ossoff speaks with Blackburn on May 11. Photo: Drew Angerer/Getty Images

A bipartisan Senate duo is pushing the Justice Department to do more to combat child sexual abuse material online that is generated by artificial intelligence.

What's happening: Sens. Jon Ossoff and Marsha Blackburn are urging Attorney General Merrick Garland to boost resources for prosecuting cases involving such content and have the department work with Congress to develop strategies that protect children from exploitation.

Why it matters: Predators' ability to create and spread fake content using AI is making it even more challenging for law enforcement officials already struggling to combat the abuse minors face online.

What they're saying: In a letter shared with Axios, the senators pinpointed diffusion AI models, which allow users to create realistic images based on a brief description of what they want to see.

  • That tool enables "pedophiles to create AI-generated CSAM in seconds" and raises "serious ethical and legal questions about the exploitation of minors through the production and dissemination of such material," the senators wrote.
  • "While these images may not depict real children, they impede law enforcement efforts to identify real-life child victims."
  • The senators said a "comprehensive approach" involving law enforcement, tech companies and legislative action is needed to address the proliferation of the content.

What's next: The senators gave the DOJ until Aug. 7 to answer a series of questions, including how many AI-generated CSAM investigations have been launched in the last two years and, of those, how many have resulted in arrests or prosecutions.

Of note: Blackburn and Ossoff earlier this year introduced the REPORT Act, which would strengthen reporting to the CyberTipline related to online sexual exploitation of children.
