
Illustration: Aïda Amer/Axios
Advanced surveillance technology is being deployed despite flaws that risk perpetuating racial biases in the criminal justice system.
The big picture: Even with recent improvements in the tech, people of color are more likely to be misidentified by facial recognition software — an error that can have life-changing results. And predictive systems can reinforce biased over-policing of neighborhoods of color.
What's new: People of color — especially black Americans — are more likely to distrust facial recognition technology used for policing, according to a new Pew Research poll.
- Pew found that 61% of white Americans trust police to use the technology responsibly, versus 56% of Hispanic Americans and 43% of black Americans.
- Several surveys have shown people of color are concerned about facial recognition. That's partly because over-policing makes them more likely to come into contact with it, says Clare Garvie, a privacy expert at Georgetown Law.
Details: Accuracy problems arise when a facial recognition system is trained on a dataset of mostly white and male faces. Those problems are amplified by mugshot databases in which people of color are disproportionately represented, creating more opportunities for mismatches (a rough sketch of that compounding follows the list below).
- Systems from IBM, Face++ and Microsoft misidentified darker-skinned women more often than any other demographic in a 2018 MIT Media Lab study.
- All 3 companies have since reduced those error rates, but researchers found in 2019 that Amazon's facial analysis and recognition system, Rekognition, had the highest error rate of the systems tested for identifying darker-skinned women. Amazon has disputed the results and methods of the MIT testing.
- In 9 U.S. cities, police data generated during unlawful and biased police practices was available — and likely used, researchers believe — to train or inform predictive policing systems, according to research by the AI Now Institute at New York University.
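To make the compounding concrete, here is a minimal back-of-the-envelope sketch in Python. The gallery sizes and per-comparison false-match rates are hypothetical placeholders chosen only to illustrate the arithmetic; they are not figures from any vendor, agency or study cited above.

```python
# Illustrative sketch (hypothetical numbers, not any vendor's actual system):
# how a higher per-comparison error rate and overrepresentation in the search
# gallery compound into a greater chance of being wrongly matched.

# Hypothetical mugshot gallery: group -> number of enrolled faces.
# group_b is assumed to be overrepresented relative to its population share.
gallery = {"group_a": 40_000, "group_b": 60_000}

# Hypothetical per-comparison false-match rates; group_b's rate is higher to
# stand in for a model trained mostly on faces unlike those in group_b.
false_match_rate = {"group_a": 1e-5, "group_b": 5e-5}

def expected_false_matches(group: str) -> float:
    """Expected number of wrong candidates returned for one probe face from
    `group`, assuming it is compared against every same-group gallery face."""
    return gallery[group] * false_match_rate[group]

for group in gallery:
    print(f"{group}: ~{expected_false_matches(group):.2f} expected false matches per search")
# group_a: ~0.40, group_b: ~3.00; the two factors multiply, which is the
# amplification described above.
```

Under these toy assumptions, a search against the overrepresented group returns several times as many false candidates, even before any downstream human review.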
What's happening: Citizen-driven surveillance is reinforcing biases in other ways. Security-oriented apps and devices that allow people to alert neighbors to sketchy activity can turn prejudices into police action.
- The majority of people described as "suspicious" in over 100 user-submitted posts to the Neighbors by Ring app were people of color, a Motherboard analysis found.
- At least 405 law enforcement agencies use Neighbors, an app in which "descriptions often use racist language or make racist assumptions about the people shown," per Motherboard.
- These apps are among the most downloaded social and news apps in the country: Nextdoor is ranked #4 in News on Apple's App Store, Citizen is #9 in News, and Neighbors by Ring is #21 in Social Networking.
What to watch: Body cameras equipped with facial recognition have not been used in the field, but vendors like COBAN Technologies have started to market the tech to police, says Dave Maass, senior investigative researcher at the Electronic Frontier Foundation.
- The ACLU is supporting legislation in California to ban police from incorporating facial recognition into body cameras.
- Axon, the biggest vendor of police body cameras, said in June that it won't sell facial recognition technology. Whether other vendors will follow suit remains to be seen.
- The National Institute of Standards and Technology's report on demographic dependencies in facial recognition will be out this fall, NIST's Patrick Grother tells Axios.