
AI startups and academics get louder in D.C.

Feb 6, 2024

Illustration: Aïda Amer/Axios

Smaller AI companies and researchers are amping up their presence in Washington to make sure their voices are heard in regulatory debates.

Why it matters: AI regulation will have a long-lasting effect on both the industry and the country's global competitiveness, and industry players want to make sure government officials get it right.

Driving the news: Ben Horowitz, co-founder of Andreessen Horowitz, a top Silicon Valley venture capital firm, was in D.C. last week to represent the startup community in meetings with lawmakers and White House officials.

  • Late last month, Stanford University's Hoover Institution launched its Emerging Technology Review in D.C. as an education resource for policymakers. Academics met with offices on Capitol Hill and administration officials to present their report.
  • And Cohere just tapped Melika Carroll, who has more than 20 years of experience in startups, as its first head of government relations, per an announcement shared first with Axios.
  • Carroll's priorities include combating bias and disinformation and supporting open competition.

In an interview at Axios' headquarters, Horowitz said there are parts of the Biden administration's AI executive order that are "stunningly obviously regulatory capture," including the approach to open-source software.

  • Government officials fear open-source models pose national security threats because the code can land in the hands of bad actors.
  • The EO instructs the National Telecommunications and Information Administration (NTIA) to study the risks and opportunities of open-source AI models and recommend actions.
  • But there's concern the EO is the first step toward curbing open-source models, which could shut out smaller players and academia.

What they're saying: "Shame on us for not being in the conversation enough in the first round; that's definitely our fault," Horowitz said.

  • At a dinner hosted by the Hoover Institution last month, Stanford senior researcher Amy Zegart called for more investment in R&D:
  • "It's hard to look long term with so much today having to do with national security. So the lament that we have is about the erosion of the foundation of innovation and basic R&D. That's a bipartisan blind spot."
  • Rep. Don Beyer, speaking Tuesday about the CREATE AI Act, which would authorize the National AI Research Resource, said that "what we want to do is create our own huge database."
  • "If we can do the CREATE AI Act, we can then give universities, companies, small companies the democratization of AI by giving them a database resource to use for all these searches," he said during a Washington Post event.

Catch up fast: Closed-source supporters have played an outsized role in shaping the regulatory conversation so far, open-source backers say.

  • That includes OpenAI CEO Sam Altman, who made his presence known early on, testifying on the Hill last May and meeting with dozens of lawmakers.
  • Altman's advocacy for a licensing regime is a red flag for the open-source community, which thinks it will shut out many developers.
  • Anthropic, another closed-source backer, was also recently making the rounds in D.C.

Of note: Horowitz is also pushing his view that the AI executive order's FLOP threshold (floating-point operations, a measure of the amount of compute used to train an AI model), which determines whether a company must comply with its reporting requirements, is too difficult to enforce. (A rough sketch of how such compute estimates are made appears after the list below.)

  • He says content should be authenticated using a digital key with thousands of validators instead of having to trust a specific company or administration.
  • Stanford is pushing for passage of the CREATE AI Act.
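For context on why that threshold is contested, here's a minimal back-of-the-envelope sketch, not drawn from the article: the 10^26-operation reporting trigger comes from the executive order itself, while the cluster size, per-chip speed, utilization and run length below are hypothetical. Because those inputs are largely self-reported by the developer, the resulting figure is an estimate rather than something regulators can measure directly.

```python
# Hedged illustration: estimate a training run's total compute and compare it
# to the executive order's reporting trigger. The 1e26 figure is the EO's
# threshold for general-purpose models; all hardware numbers are hypothetical.

THRESHOLD_OPS = 1e26  # EO reporting trigger (total training operations)

def estimate_training_flop(num_chips: int, peak_flop_per_sec: float,
                           utilization: float, days: float) -> float:
    """Total compute = chips x per-chip speed x achieved utilization x wall-clock seconds."""
    seconds = days * 24 * 3600
    return num_chips * peak_flop_per_sec * utilization * seconds

# Hypothetical frontier-scale run: 25,000 accelerators at ~1e15 FLOP/s peak,
# 40% utilization, 90 days of training.
total = estimate_training_flop(25_000, 1e15, 0.40, 90)
print(f"Estimated training compute: {total:.2e} operations")   # ~7.8e25
print("Over the reporting threshold?", total > THRESHOLD_OPS)  # False, but close
```

In this illustrative run the estimate lands just under the trigger, the kind of boundary case that critics say makes a compute-based rule hard to verify from outside the company.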

What's next: As another way to shape policymaking, Horowitz said, his firm is gearing up to get more involved in the elections by making donations, but he declined to elaborate.
