Illustration: Aïda Amer/Axios

Advanced surveillance technology is being deployed despite flaws that risk perpetuating racial biases in the criminal justice system.

The big picture: Even with recent improvements in the tech, people of color are more likely to be misidentified by facial recognition software — an error that can have life-changing results. And predictive systems can reinforce biased over-policing of neighborhoods of color.

What's new: People of color — especially black Americans — are more likely to distrust facial recognition technology used for policing, according to a new Pew Research poll.

  • Pew found that 61% of white Americans trust police to use the technology responsibly, versus 56% of Hispanic Americans and 43% of black Americans.
  • Several surveys have shown people of color are concerned about facial recognition. That's because they're most likely to come into contact with it due to over-policing, says Clare Garvie, a privacy expert at Georgetown Law.

Details: Accuracy problems arise when a facial recognition system is trained on a dataset of mostly white and male faces. And they're amplified by mugshot databases where people of color are disproportionately represented, making for more possible mismatches.

  • IBM, Face++ and Microsoft misidentified darker-skinned women more than any other demographic in a 2018 MIT Media Lab study.
  • All 3 companies have since reduced those accuracy issues, but in a 2019 follow-up, researchers found that Amazon's facial analysis and recognition system, Rekognition, had the worst error rate for identifying darker-skinned women. Amazon has disputed the results and methods of the MIT testing.
  • In 9 U.S. cities, police data generated during unlawful and biased police practices was available — and likely used, researchers believe — to train or inform predictive policing systems, according to research by the AI Now Institute at New York University.

What's happening: Citizen-driven surveillance is reinforcing biases in other ways. Security-oriented apps and devices that allow people to alert neighbors to sketchy activity can turn prejudices into police action.

  • The majority of people described as "suspicious" in over 100 user-submitted posts to the Neighbors by Ring app were people of color, a Motherboard analysis found.
  • At least 405 law enforcement agencies use Neighbors, in which "descriptions often use racist language or make racist assumptions about the people shown," per Motherboard.
  • These apps are some of the most downloaded social and news apps in the country. Nextdoor is ranked #4 in News on Apple's App Store, Citizen is #9 in News, and Neighbors by Ring is ranked #21 in Social Networking.

What to watch: Body cameras equipped with facial recognition have not been used in the field, but vendors like COBAN Technologies have started to market the tech to police, says Dave Maass, senior investigative researcher at the Electronic Frontier Foundation.

  • The ACLU is supporting legislation in California to ban police from incorporating facial recognition into body cameras.
  • Axon, the biggest vendor of police body cameras, said in June that it won't sell facial recognition technology. Whether other vendors will follow suit remains to be seen.
  • The National Institute of Standards and Technology's report on demographic dependencies in facial recognition will be out this fall, NIST's Patrick Grother tells Axios.
