
Why it matters: Facial recognition's racial bias problem

Illustration: Sarah Grillo/Axios

Amazon’s facial recognition system, Rekognition, falsely matched 28 members of Congress with criminal mugshots, the American Civil Liberties Union wrote in a blog post Thursday.

Why it matters: The result of the test, which compared photos of all 535 members of Congress against 25,000 publicly available mugshots, fuels the debate over whether biased algorithms lead law enforcement to disproportionately target minorities.

An Amazon Web Services spokesperson told Axios the service is meant to help humans "review and consider options using their judgement and not to make fully autonomous decisions."

  • Still, there's nothing preventing law enforcement agencies from using Amazon's facial recognition system however they see fit.

The details:

  • About 40% of Rekognition’s false matches in the ACLU’s test were people of color, even though people of color make up only about 20% of the current Congress — meaning false matches hit that group at a disproportionately high rate.
  • Six incorrect matches were members of the Congressional Black Caucus, including civil rights leader Rep. John Lewis.
  • The technology could further bias an officer, as individuals start “being questioned or having their home searched, based on a false identification,” ACLU said.
  • Amazon has not released data on bias testing for Rekognition, per The Verge.
  • The New York Times tested Rekognition's API in June, showing how facial recognition could help identify politicians while reporting on Capitol Hill. In one instance, though, Rekognition misidentified Florida Sen. Bill Nelson as actor Bill Paxton.

What they're saying:

  • The Congressional Black Caucus sent a letter to Amazon CEO Jeff Bezos in May about the risks of Rekognition’s use by law enforcement.
  • In June, a group of Amazon employees sent a letter to Bezos raising concerns about the ethics of the technology when used by Immigration and Customs Enforcement, per The Hill.
  • Police officers in Hillsboro, Ore., worried in email exchanges about how "Big Brother" a partnership with Amazon might appear, per Gizmodo.

The other side: In a scenario like the ACLU's test, Amazon Rekognition is not meant to be the sole basis for a decision, an Amazon Web Services spokesperson tells Axios.

“While 80% confidence is an acceptable threshold for photos of hot dogs, chairs, animals, or other social media use cases, it wouldn’t be appropriate for identifying individuals with a reasonable level of certainty. When using facial recognition for law enforcement activities, we guide customers to set a threshold of at least 95% or higher.”
— AWS spokesperson
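
For context on the threshold AWS is describing: Rekognition's face-search API only returns matches whose similarity score clears a configurable threshold, which defaults to 80% when callers don't set one. Below is a minimal illustrative sketch — not the ACLU's actual test code, and the collection ID and file name are hypothetical — of how a boto3 caller would raise that threshold to the 95% AWS says it recommends for law enforcement use.

```python
# Illustrative sketch only: searching a face collection with AWS Rekognition
# via boto3, raising the match threshold from the 80% default to 95%.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

# "mugshot-collection" and "probe_photo.jpg" are hypothetical names.
with open("probe_photo.jpg", "rb") as f:
    response = rekognition.search_faces_by_image(
        CollectionId="mugshot-collection",
        Image={"Bytes": f.read()},
        FaceMatchThreshold=95,  # defaults to 80 if omitted
        MaxFaces=5,
    )

# Only candidates at or above the threshold come back as matches.
for match in response["FaceMatches"]:
    print(match["Face"]["FaceId"], round(match["Similarity"], 1))
```

The threshold is the key design choice in dispute: a lower setting surfaces more candidate matches (and more false positives), while a higher one returns fewer, more confident matches.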

The bottom line: Facial recognition systems have long struggled with high error rates for women and people of color, raising questions about their broader use, especially in a public safety context.
