Attorneys general: Congress must bar AI-generated child sexual abuse content
There's a growing bipartisan push to bar the use of artificial intelligence to generate child sexual abuse content.
Driving the news: Attorneys general in all 50 states and four U.S. territories sent a letter to Congress this week calling for an explicit prohibition against AI-generated child sexual abuse material (CSAM).
- The prosecutors are calling on Congress to create a working group "devoted specifically to the protection of children from AI" and expand restrictions already in place to account for AI-generated CSAM.
- The technology can easily create deepfake videos by "overlaying the face of one person on the body of another," the letter stated.
- Deepfakes could harm previously unharmed children by superimposing their faces onto the bodies of children who were abused, the letter stated.
What they're saying: "We need to make sure children aren't harmed as this technology becomes more widespread, and when Congress comes back from recess, we want this request to be one of the first things they see on their desks," said South Carolina Attorney General Alan Wilson, who spearheaded the letter.
Zoom out: Congress has weighed AI regulation amid growing concerns as the technology has surged in popularity and prevalence.
- OpenAI CEO Sam Altman testified before Congress in May, calling for new rules to limit the dangers the technology could pose.
- The company does not allow CSAM or "any content that exploits or harms children," according to its website, and says it will report any such content to the National Center for Missing and Exploited Children.
- OpenAI has "made significant effort to minimize the potential for our models to generate content that harms children," per a blog post earlier this year.
Our thought bubble, via Axios' chief technology correspondent Ina Fried: The challenge with these types of laws lies in wording them narrowly enough to address the concerns without sweeping up other types of content, especially content from marginalized groups. Past bills aimed at kids' online safety have swept up material related to consenting adults, especially LGBTQ-related content.