
The federal government is taking what could be the first steps toward requiring safer, more transparent AI systems, as a Commerce Department agency invites public comment to help shape specific policy recommendations.
Why it matters: The move falls far short of the comprehensive AI legislation critics have advocated. But with the frenzy over generative AI continuing to grow, the Biden administration is trying to get a head start on a government response to the fast-moving industry.
Driving the news: The Commerce Department's National Telecommunications and Information Administration (NTIA) is asking the public to weigh in on what role the federal government can play to ensure AI algorithms are acting as claimed and not causing harm.
- "We really believe in the promise of AI," Assistant Commerce Secretary Alan Davidson, who runs NTIA, tells Axios. "We do believe it needs to be implanted safely and we’re concerned that’s not happening right now."
Davidson said that the government could take a range of actions to shape AI that don't require new legislation — including mandating audits as part of its procurement standards or offering prizes or bounties to those who find bias within algorithms.
- "We need to start the hard work of actually putting in place processes that are going to make people feel like the (AI) tools are doing what they say they are going to do, that models are behaving," Davidson said.
- The move follows last October's release by the White House of a blueprint for what an AI bill of rights might look like.
AI is largely unregulated around the world today, including in the U.S., although Europe has been developing a wide-ranging AI Act.
Between the lines: Much of the industry supports the development of at least some rules to govern AI, but there's less agreement on what the rules should cover or what might constitute "responsible" AI.
- Initially there was a notion that AI systems need to be explainable — that is, able to report why they generated a particular response. But the current generation of tools isn't built that way, and some say explainability may no longer be achievable.
- "A lot of knowledgable people feel that explainability would help us a huge amount in promoting responsibility if we could achieve it," Davidson said. "It is really an open question about whether we will be able to do that."
Of note: As part of its inquiry, the agency is also asking whether it should have different rules for especially sensitive areas, such as when AI is involved with decisions related to health care and employment.
- The EU, for example, is looking at evaluating the use of AI in "high risk" areas using stricter standards.
What we're watching: While the federal government is trying to move speedily, its version of "quick" is far slower than the tech industry's normal pace.
- A recent open letter from some industry leaders called for a six-month moratorium on advanced AI development. The proposal itself was controversial, but even if the AI world could agree on it, it's hard to see how six months would be enough time for meaningful regulation to emerge.
- "There's a lot of conversation about, 'Let's pull the plug,' but I'm not sure there is a single plug," Arati Prabhakar, director of the White House Office of Science and Technology Policy, told me in an interview at Axios' recent What's Next Summit.
The big picture: The federal government is also trying to strike a balance — encouraging responsibility without slowing the U.S. down relative to other countries, especially China.
- That said, those goals need not be positioned as opposites. "We think that AI has the power to advance American leadership and our economic opportunities," Davidson said. "I think we also believe with good guardrails innovation flourishes better, flourishes more."
What's next: Public comments will be open for 60 days following publication of the request for comment in the Federal Register. After that, the NTIA will produce a report with specific policy recommendations, Davidson said.