July 30, 2024
It's Tuesday. And we're back in your inbox after the Senate passed the KOSA/COPPA 2.0 package.
- Now the legislation heads over to the House, where backers need to get key players on board if they want the legislation to become law.
ICYMI: The House is banning staff use of ByteDance apps, Axios has learned.
1 big thing: NTIA head weighs in on the open source debate
The National Telecommunications and Information Administration has decided open source AI models should be embraced, in a highly anticipated report released today.
- Maria chatted with NTIA head Alan Davidson as the debate between open and closed source backers heats up.
What will the long-term impact of this report be on the debate and the way AI itself evolves?
If you think about the open source software conversation from years ago, making software open source was actually viewed as making it safer, because lots of people could look at it, understand it and help find bugs.
- There's an analog here that I think will be an important point as we think about the evolution of these systems.
There are huge business interests on either side of this debate. How did you try to disentangle those from the security argument and then ultimately decide what's best for national interests?
We got a huge number of comments back from across the spectrum of interests. And we concluded, based on that very broad look, that there's value in focusing on allowing openness but putting in place a framework to monitor future risks.
Are you expecting pushback?
There are strongly held views on a lot of aspects of these issues.
- We know that some feel the risks are very high, but we don't have the capability today to measure the marginal risk of open systems very precisely.
- That's why we have adopted this "monitor, but not mandate" approach.
It seems like you're going to require a lot of different types of information in order to have an effective monitoring system. Am I right in understanding that this is the reporting requirement in the AI executive order?
These are not necessarily requirements or mandates in our report. It is right to note that many of these sources of data should exist soon through the work of the AI Safety Institute here in the U.S., but we're really talking at a high level about the information that would be needed.
- It may ultimately be that we need to have requirements in place.
- But the hope right now is that there'll be more information sharing and more active research, and this is an area where a government can help.
What are the next steps now that the report is out?
Helping people in the stakeholder community understand it because it's a fairly nuanced report. We think that this path we've laid out is one where we really need industry to start engaging on soon.
- We also want to talk to our international partners about this.
- And, of course, we're briefing Congress, and I know there's a lot of interest on Capitol Hill.
2. QOTD: Kids online safety package passes
"Young people will take back control over their online lives. Parents will have tools to safeguard those young people. We are on the cusp of a new era. It is an era of accountability for Big Tech."ā KOSA co-sponsor Sen. Richard Blumenthal, speaking at a press conference to celebrate the kids online safety package passing today.
Thank you for reading Axios Pro Policy, and thanks to editors Mackenzie Weinger and David Nather and copy editor Bryan McBournie.