September 19, 2024
It's Thursday! And we're staring down another possible government shutdown.
📍 If you're in D.C. next week: Join us Thursday at 8am ET for an event on AI featuring Sen. John Hickenlooper, White House OSTP chief of staff and deputy director for strategy Asad Ramzanali and Center for Democracy & Technology CEO Alexandra Reeve Givens.
- RSVP here.
1 big thing: Why the DEFIANCE Act could pass this Congress
Advocates are pressing Congress to pass the DEFIANCE Act to protect people from image-based sexual abuse as the limits of voluntary commitments become clear, Maria reports.
Why it matters: Child sexual abuse material (CSAM) and non-consensual intimate images (NCII) of adults are skyrocketing with the proliferation of generative AI, and some observers say that voluntary commitments to combat the abuse only go so far.
- The DEFIANCE Act would hold the perpetrators accountable by creating a federal civil right of action for people who are victims of intimate digital forgeries.
- While it offers one solution to this crisis, others want a more targeted way to hold platforms accountable.
Our thought bubble: The DEFIANCE Act could have a better chance of becoming law this year than other tech measures for two big reasons.
- One, it's bipartisan. Two, it doesn't go after tech companies' Section 230 liability shield and instead focuses on holding the perpetrators accountable.
State of play: Adobe, Anthropic, Cohere, Common Crawl, Microsoft and OpenAI last week signed on to voluntary commitments spearheaded by the White House to curb NCII and CSAM.
- The commitments include responsibly sourcing datasets, stress testing to guard against abusive output and removing nude images from AI training datasets.
"We absolutely still think that there's space for congressional action," Center For Democracy & Technology CEO Alexandra Reeve Givens told Axios, pointing to the DEFIANCE Act as a good solution.
The DEFIANCE Act would also more than double the statute of limitations to 10 years.
- Sen. Dick Durbin, Sen. Lindsey Graham, Rep. Alexandria Ocasio-Cortez and Rep. Laurel Lee are behind the bill.
- The bill passed the Senate by unanimous consent this summer. In the House, AOC recently hosted a roundtable with actor Sophia Bush to press for action.
Yes, but: Some advocates believe the platforms themselves should bear greater responsibility.
- "It's amazing to me that the companies who produced [image-based sexual abuse] and are hosting it appear to have faced no legal consequences," former Meta employee and senior policy advisor at UC Berkeley David Evan Harris told Axios.
During a Senate Judiciary subcommittee hearing this week, Harris said companies like HuggingFace have not taken down versions of Stable Diffusion 1.5, a model that has been trained on thousands of CSAM images.
- HuggingFace's Margaret Mitchell during the hearing responded that she was not aware that the company was still hosting derivative models and said it's an example of why companies need government help.
Between the lines: Companies are struggling to adhere to laws already on the books while also trying to keep new promises to curb image-based sexual abuse. Some are looking for more clarity from Congress.
2. Exclusive: TechNet endorses seven AI bills
TechNet is endorsing seven AI bills it hopes to see pass Congress, the tech lobbying group tells Ashley.
Why it matters: Advancing bipartisan AI legislation is proving difficult, but having tech industry support could give lawmakers an extra boost to get some bills done by the end of this Congress.
The seven bills are:
- The CREATE AI Act, which would authorize the National Artificial Intelligence Research Resource.
- The AI Grand Challenges Act, which would establish a program to award prizes for AI research.
- The AI Public Awareness and Education Campaign Act, which would direct the Commerce secretary to stand up an AI education campaign for Americans.
- The Technology Workforce Framework Act, to require NIST to develop an AI workforce framework.
- The EPIC Act, to establish a foundation to support NIST by providing access to private and non-profit funding.
- The TAKE IT DOWN Act, which would require social media sites to remove nonconsensual intimate imagery — including deepfakes — within 48 hours of receiving notice.
- The NSF AI Education Act, to support the agency's education and professional development programs.
What they're saying: "TechNet is endorsing these bipartisan proposals to support the next generation of AI entrepreneurs, bolster our workforce, strengthen AI research, support NIST, and help prevent misuse and deepfakes," said CEO Linda Moore.
Between the lines: These bills are largely non-controversial, focusing on things like research, combating deepfakes, prepping the workforce for AI and educating the public.
3. More Q&A from our webinar
Thanks to everyone who joined for our webinar last week on the tech policy implications of the 2024 election.
- We thought we'd answer a couple of questions that were left hanging when time ran out.
What AI governance and safeguards are being adopted and how will government work partner with private sector and civil society?
The NIST AI risk management framework is being adopted by many businesses and organizations in lieu of more far-reaching federal legislation.
- Frameworks like NIST's and others from places like the Better Business Bureau are helping the private sector adhere to safety and best practice frameworks they can feel good about for now.
- The AI executive order has also begun reshaping the landscape, and we'll be assessing its efforts as it approaches its anniversary next month.
- The EO's mix of leveraging initiatives like NIST's framework and the White House's Blueprint for an AI Bill of Rights alongside voluntary commitments from industry players is a useful way to think of how a Harris presidency might continue to partner with the private sector and civil society.
Do you believe the CREATE AI Act could be included in the NDAA?
The Senate Commerce Committee approved the CREATE AI Act on July 31. And last week, House Science advanced the bill with $2.6 billion authorized over six years.
- Observers say that's encouraging given the original CREATE AI Act text did not authorize any funding.
- With the broad, bipartisan support for the legislation, it has a better chance than other efforts to make it into the NDAA. We'll be tracking this closely in the lame duck.
- "The fact that Republicans and Democrats endorsed a multi-billion-dollar investment in AI research is notable and shows a continued bipartisan motivation to win the AI race," Tony Samp, founding director of the Senate AI Caucus and head of AI policy at DLA Piper, told Axios.
4. Catch me up: Election threats, AI and more
🗳 Election threats talk: After yesterday's Senate Intel Committee hearing with Google, Meta and Microsoft, Sen. Mark Warner said that he's concerned about the immediate 48 hours following the election.
- "If the [presidential election is close], in a number of states, it will take a few days to get the results. ... You could have entities which want to spark more discord online," he told Ashley and other reporters.
📜 Four years later, an FTC study: A 2020 FTC 6(b) study looking into the data collection and retention policies of social media companies for children and teens found some damning results.
✔️ Bills passed: The House passed two bipartisan communications and tech bills this week, the FUTURE Networks Act and the Launch Communications Act.
🌍 AI office: The UN's AI advisory group has a new report out with seven recommendations to address gaps in governance, including standing up a global AI office.
💰 CHIPS cash: The Commerce Department today announced nearly $5 million in grants to 17 small businesses across nine states under the Small Business Innovation Research Program.
📆 Mark your calendars: The U.S. will convene a global AI safety summit in San Francisco on Nov. 20-21.
- This inaugural meeting of the International Network of AI Safety Institutes is an effort to kickstart collaboration ahead of the AI Action Summit in Paris in February, per the press release.
✅ Thank you for reading Axios Pro Policy, and thanks to editors Mackenzie Weinger and David Nather and copy editor Bryan McBournie.
- Do you know someone who needs this newsletter? Have them sign up here.