Illustration: Sarah Grillo/Axios

We're seeing the beginnings of a tug-of-war at the highest levels of government over how much access people should have to AI systems that make critical decisions about them.

What's happening: Life-changing determinations, like the length of a criminal's sentence or the terms of a loan, are increasingly informed by AI programs. These can churn through oodles of data to detect patterns invisible to the human eye, potentially making more accurate predictions than before.

Why it matters: The systems are so complex that it can be hard to know how they arrive at answers — and so valuable that their creators often try to restrict access to their inner workings, making it potentially impossible to challenge their consequential results.

Driving the news: Two recent proposals are pulling in opposite directions.

  • A bill from Rep. Mark Takano, a California Democrat, would block companies that design AI systems for criminal justice from withholding details about their algorithms by claiming they’re trade secrets.
  • A proposal from the Department of Housing and Urban Development (HUD) would protect landlords, lenders and insurers that want to use algorithms for important determinations, shielding them from claims that the algorithms unintentionally have a more negative impact on certain groups of people.

These are among the earliest attempts to set down rules and definitions for algorithmic transparency. How they shake out could set rough precedents for how the government will approach the many future questions that will emerge.

Proponents of more access say it's vital to test whether walled-off systems are making serious mistakes or unfair determinations — and argue that the potential for harm should outweigh companies' interest in protecting their secrets.

  • Developers regularly invoke trade-secret rights to keep their algorithms — used for key evidence like DNA matches or bullet traces — away from the accused, says Rebecca Wexler, a UC Berkeley law professor who consulted on Takano's bill.
  • "We need to give defendants the rights to get the source code and [not] allow intellectual property rights to be able to trump due process rights," Takano tells Axios. His bill also asks the government to set standards for forensic algorithms and test every program before it is used.

The HUD proposal would require someone to show that an algorithmic decision was based on a protected characteristic like race or gender, or a close proxy for one, in order to succeed in a lawsuit. But critics say that can be impossible to determine without understanding the system.

  • "By creating a safe harbor around algorithms that do not use protected class variables or close proxies, the rule would set a precedent that both permits the proliferation of biased algorithms and hampers efforts to correct for algorithmic bias," says Alice Xiang, a researcher at the Partnership on AI.
  • HUD is soliciting comments on the proposal until later this month.

The other side: "The goal here is to bring more certainty into this area of the law," said HUD General Counsel Paul Compton in an August press conference. He said the proposal "frees up parties to innovate, take risks and meet the needs of their customers without the fear that their efforts will be second-guessed through statistics years down the line."
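For readers unfamiliar with what "second-guessed through statistics" means in practice, disparate-impact claims often lean on a simple selection-rate comparison. The sketch below is purely illustrative and is not drawn from the HUD proposal: it computes the adverse impact ratio between two hypothetical groups and flags it under the EEOC's informal "four-fifths" guideline, a common (and contested) rule of thumb.

```python
def adverse_impact_ratio(approved_a, total_a, approved_b, total_b):
    """Selection rate of group A divided by the selection rate of group B."""
    rate_a = approved_a / total_a
    rate_b = approved_b / total_b
    return rate_a / rate_b

# Hypothetical loan-approval counts for two demographic groups.
ratio = adverse_impact_ratio(approved_a=300, total_a=1000,
                             approved_b=450, total_b=1000)

# Under the "four-fifths" guideline, a ratio below 0.8 is often
# treated as statistical evidence of disparate impact.
flagged = ratio < 0.8
```

A statistic like this can be computed without any access to the algorithm's internals, which is exactly why the HUD proposal's requirement to identify the offending input variable raises the bar for plaintiffs.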
