
Illustration: Tiffany Herring/Axios
Advocates are pressing Congress to pass the DEFIANCE Act to protect people from image-based sexual abuse as the limits of voluntary industry commitments become clear.
Why it matters: Child sexual abuse material (CSAM) and non-consensual intimate images (NCII) of adults are skyrocketing with the proliferation of generative AI, and some observers say that voluntary commitments to combat the abuse only go so far.
- The DEFIANCE Act would hold the perpetrators accountable by creating a federal civil right of action for people who are victims of intimate digital forgeries.
- While it offers one solution to this crisis, some advocates want a more targeted way to hold platforms accountable.
Our thought bubble: The DEFIANCE Act could have a better chance of becoming law this year than other tech measures for two big reasons.
- One, it's bipartisan. Two, it doesn't go after tech companies' Section 230 liability shield and instead focuses on holding the perpetrators accountable.
State of play: Adobe, Anthropic, Cohere, Common Crawl, Microsoft and OpenAI last week signed on to voluntary commitments spearheaded by the White House to curb NCII and CSAM.
- The commitments include responsibly sourcing datasets, stress testing to guard against abusive output and removing nude images from AI training datasets.
"We absolutely still think that there's space for congressional action," Center for Democracy & Technology CEO Alexandra Reeve Givens told Axios, pointing to the DEFIANCE Act as a good solution.
The DEFIANCE Act would also more than double the statute of limitations to 10 years.
- Sens. Dick Durbin and Lindsey Graham and Reps. Alexandria Ocasio-Cortez and Laurel Lee are behind the bill.
- The bill passed the Senate by unanimous consent this summer. In the House, Ocasio-Cortez recently hosted a roundtable with actor Sophia Bush to press for action.
Yes, but: Some advocates believe the platforms themselves should bear greater responsibility.
- "It's amazing to me that the companies who produced [image-based sexual abuse] and are hosting it appear to have faced no legal consequences," David Evan Harris, a former Meta employee and senior policy adviser at UC Berkeley, told Axios.
During a Senate Judiciary subcommittee hearing this week, Harris said companies like Hugging Face have not taken down versions of Stable Diffusion 1.5, a model that has been trained on thousands of CSAM images.
- Hugging Face's Margaret Mitchell responded during the hearing that she was not aware the company was still hosting derivative models and said it's an example of why companies need government help.
Between the lines: Companies are struggling to comply with laws already on the books while also trying to keep new promises to curb image-based sexual abuse. Some are looking to Congress for more clarity.
