Biden's "aggressive" AI order will make firms share some test data
The Biden administration's long-awaited executive order on artificial intelligence will require developers of the most powerful AI systems to share critical testing information with the government.
Why it matters: A White House fact sheet on the order outlines the steps companies and the government will be directed to take to foster responsible AI, partly by requiring developers to share the kind of internal testing information that's usually kept private.
- It's a significant transparency-boosting step that may not be welcomed by the secretive industry.
- The order comes as Congress holds AI Insight Forums with experts to help shape legislation, and as White House officials prepare to take part in a U.K. AI summit this week.
White House deputy chief of staff Bruce Reed said: "Biden is rolling out the strongest set of actions any government in the world has ever taken on AI safety, security and trust. It's the next step in an aggressive strategy to do everything on all fronts to harness the benefits of AI and mitigate the risks."
Details: Companies developing models that pose serious risks to public health and safety, the economy or national security will have to notify the federal government when training the model and share results of red-team safety tests before making models public.
- The provision would apply to future models that go beyond a specific compute power threshold and would not lead to any restrictions or removal of existing AI tools in the marketplace, a senior administration official said.
- The provision goes beyond voluntary commitments that the White House garnered from AI companies and requires notification in accordance with the Defense Production Act.
- A senior administration official said the DPA enforcement mechanism was not previewed for industry and it remains to be seen how companies will react.
Ahead of the 2024 elections, the administration is also tackling deepfakes by instructing the Commerce Department to develop guidance for content authentication and watermarking.
- Federal agencies will use the content authentication tools and watermarking to make it easy for Americans to know that government communications are authentic.
- Companies like Adobe have been advocating for federal standards around watermarking AI-generated content and lawmakers have introduced bills to protect artists from AI-generated copies of their work.
Here's a breakdown of other areas included in the order beyond safety and security, according to the White House fact sheet:
1. Privacy: The government will prioritize supporting the development of privacy tools using cutting-edge AI systems.
- A Research Coordination Network will be funded to strengthen privacy tools, like cryptography tools.
- Privacy guidance for federal agencies using information from data brokers, particularly personally identifiable data, will be strengthened.
2. Equity and civil rights: Landlords, federal benefits program managers and federal contractors will be given guidance to keep algorithms from being used to exacerbate discrimination.
- The Justice Department and federal civil rights offices will be trained on best practices for investigating and prosecuting AI civil rights violations.
- Best practices on the use of AI in sentencing, parole and probation, pretrial release and detention, risk assessments, surveillance, crime forecasting and predictive policing, and forensic analysis will be developed.
3. Innovation and competition: Small developers and entrepreneurs will be provided technical assistance and resources to commercialize AI, and the Federal Trade Commission will be encouraged to exercise its authorities to promote competition.
- Visa criteria, interviews and reviews will be modernized and streamlined to encourage high-skilled immigrants and nonimmigrants with expertise in critical areas to study, stay and work in the U.S.
What they're saying: "I think the breadth of the executive order is a recognition of the fact that AI policy is like running a decathlon and there's 10 different events here and we don't have the luxury of just picking, of saying, 'We're just going to do safety' or 'We're just going to do equity' or 'We're just going to do privacy.' We have to do all of these things," a senior administration official said.
Yes, but: The Biden administration recognizes executive orders can't replace legislation and continues to call on Congress to pass a law governing AI safety.