Rumman Chowdhury leads Accenture's responsible AI efforts. Photo: Frances Denny

With computer algorithms being called on to make more and bigger decisions, a growing field has emerged to help ensure the models are fair and free of bias. Among the latest efforts is a new "fairness tool" that consulting giant Accenture is detailing at an AI conference next week.

Why it matters: AI is being used to make an increasing array of decisions, from who gets parole to whether someone is offered a loan or a job. But without rooting out bias in both training data and models, these algorithms risk simply codifying existing human misperceptions.

Accenture is far from alone in trying to develop tools to remove bias from AI.

  • At F8, Facebook talked about Fairness Flow, a tool it says it's using to seek out biases for or against a particular group of people.
  • Recruiting startup Pymetrics developed Audit-AI to root out bias in its own algorithms for determining if a candidate is a good fit for a job. Now the company is releasing it as open source in hopes others may benefit:
"We believe that all creators of technology are responsible for creating the future that we want to live in. For us, that future is one that is bias-free."

How it works: Accenture's tool examines both the data used to train a model and the algorithm itself, looking for places where any particular group is treated unfairly.
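The article doesn't detail Accenture's method, but one common check of this kind compares positive-outcome rates across groups, first in the training labels (data bias) and then in the model's predictions (model bias). A minimal sketch, with made-up data and a hypothetical "disparate impact" ratio (1.0 means parity):

```python
# Illustrative fairness check -- NOT Accenture's actual tool.

def positive_rate(outcomes, groups, target_group):
    """Share of target_group's members with a positive (1) outcome."""
    members = [o for o, g in zip(outcomes, groups) if g == target_group]
    return sum(members) / len(members)

def disparate_impact(outcomes, groups, group_a, group_b):
    """Ratio of the two groups' positive rates; 1.0 means parity."""
    return positive_rate(outcomes, groups, group_a) / positive_rate(outcomes, groups, group_b)

# Toy training labels and a protected attribute, invented for illustration.
labels = [1, 1, 0, 1, 0, 0, 0, 1]
groups = ["a", "a", "a", "a", "b", "b", "b", "b"]

# Group b receives positives at one third the rate of group a.
print(disparate_impact(labels, groups, "b", "a"))  # ~0.33
```

The same ratio can then be recomputed on the model's predictions, which is how a check like this covers both the data and the algorithm.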

Origin story: Rumman Chowdhury, who leads responsible AI at Accenture Applied Intelligence, developed what became the fairness tool with the assistance of a study group of researchers at the Alan Turing Institute. The tool is being formally announced next week at CogX in London.

More people in the room: One of the benefits, Chowdhury said, is that you don't have to be an experienced coder to make use of the tool. That helps promote another important means of combating AI bias: making sure more people are part of the discussion.

"It’s a really good way to start incorporating different people into the AI development process, people who aren't necessarily data scientists."
— Rumman Chowdhury to Axios.

Yes, but: Chowdhury notes the fairness tool isn't a silver bullet. It works best on certain types of models, known as classification models, and needs fixed, rather than continuous, variables.

"I don't want people to think you can push a button and fix for fairness because you can’t. While this is one tool that certainly does help, it doesn’t solve for everything."
— Chowdhury
  • Also, correcting for bias can make an algorithm more fair, but sometimes at the expense of accuracy.
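That trade-off can be seen in a toy example (invented data, not from the article): lowering one group's decision threshold to equalize positive rates costs some overall accuracy.

```python
# Hypothetical illustration of the fairness/accuracy trade-off.

def predict(scores, threshold):
    """Binary decisions from model scores at a given threshold."""
    return [1 if s >= threshold else 0 for s in scores]

def accuracy(preds, labels):
    return sum(p == l for p, l in zip(preds, labels)) / len(labels)

# Made-up scores and true labels for two groups.
scores_a, labels_a = [0.9, 0.8, 0.6, 0.4], [1, 1, 1, 0]
scores_b, labels_b = [0.7, 0.45, 0.4, 0.3], [1, 0, 0, 0]

# One shared threshold: perfectly accurate, but group b's positive
# rate is 1/4 versus group a's 3/4.
preds_a, preds_b = predict(scores_a, 0.5), predict(scores_b, 0.5)
print(accuracy(preds_a + preds_b, labels_a + labels_b))  # 1.0

# Lowering group b's threshold equalizes positive rates (3/4 each)...
preds_b = predict(scores_b, 0.35)
# ...but overall accuracy drops.
print(accuracy(preds_a + preds_b, labels_a + labels_b))  # 0.75
```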

Go deeper: Another key component of ethical AI is transparency. Check out this article for more on the push to create AI that can show its work.
