
In a highly anticipated report released Tuesday, the National Telecommunications and Information Administration concluded that open source AI models should be embraced.
- Axios chatted with NTIA head Alan Davidson as the debate between open and closed source backers heats up.
The following conversation has been edited and condensed for clarity.
What will the long-term impact of this report be on the debate and on the way AI itself evolves?
If you think about the open source software conversation from years ago, making software open source was actually viewed as making it safer, because lots of people could look at it, understand it, and help find bugs in the software.
- There's an analog here that I think will be an important point as we think about the evolution of these systems.
There are huge business interests on either side of this debate. How did you try to disentangle those from the security argument and then ultimately decide what's best for national interests?
We got a huge number of comments back from across the spectrum of interests. We also did a number of listening sessions, and that gave us a chance to hear from a pretty broad cross section of industry and civil society and other governments that we spoke with.
- And we concluded, based on that very broad look, that there's value in focusing on allowing openness but putting in place a framework to monitor future risks.
Are you expecting pushback?
There are strongly held views on a lot of aspects of these issues.
- We know that some feel the risks are very high, but we don't think we have the capability today to measure the marginal risk of open systems very precisely.
- That's why we have adopted this "monitor, but not mandate" approach.
It seems like you're going to require a lot of different types of information in order to have an effective monitoring system. Am I right in understanding that this is the reporting requirement in the AI executive order?
These are not necessarily requirements in our report, nor mandates we are issuing. It is right to note that many of these sources of data should exist soon through the work of the AI Safety Institute here in the U.S., but we're really talking at a high level about the information that would be needed.
- It may ultimately be that we need to have requirements in place.
- But the hope right now is that there'll be more information sharing and more active research, and this is an area where a government can help.
Do you see the executive order's reporting requirement surviving a potential Trump administration or does transparency legislation need to be passed?
There are people on both sides of the aisle that want innovation to be supported here in the U.S. on AI, but also care about protecting privacy and security and the safety of these systems.
- And so we're very hopeful that the approaches we've outlined here will be sustainable over time, because they represent an approach that has bipartisan support.
How long will it take to get the evaluation system in place, and who are the key players?
You can imagine building up government capabilities to do a lot of this, but we also believe that a fair amount of this can happen within the research community itself, within industry itself.
- We think these capabilities could be stood up fairly quickly.
- The evaluation part is the critical place for government to be involved, so we do need to build out more capacity. We do need to have those tools to do this work.
What are the next steps now that the report is out?
Helping people in the stakeholder community understand it, because it's a fairly nuanced report. We think the path we've laid out is one where we really need industry to start engaging soon.
- We also want to talk to our international partners about this. A number of other countries are looking at these same issues, and we're keen to find ways to harmonize our work with them.
- And, of course, we're briefing Congress, and I know there's a lot of interest on Capitol Hill.
