
Illustration: Sarah Grillo/Axios
The latest and greatest tool for law enforcement has an existential problem.
Driving the news: A major federal study found "Asian and African American people were up to 100 times more likely to be misidentified than white men," per the Washington Post. It also found "high error rates for 'one-to-one' searches of Asians, African Americans, Native Americans and Pacific Islanders."
Why it matters: The study confirms what independent researchers have long known — but the messenger matters, Axios' Kaveh Waddell says.
- The Trump administration has put the National Institute of Standards and Technology in charge of new federal standards for AI, and some lawmakers have proposed having it audit controversial algorithms before they can be put to use.
- The latest findings could be used to slow or block the rollout of AI systems found to be biased.
The big picture: Law enforcement is rapidly adopting facial recognition technology.
- "The FBI alone has logged more than 390,000 facial-recognition searches of state driver's license records and other federal and local databases since 2011," the Post notes.
The bottom line: People of color are more likely to distrust facial recognition technology used for policing, Axios' Orion Rummler reported this fall.
- Pew found that 61% of white Americans trust the police to use the technology responsibly versus 56% of Hispanic Americans and 43% of black Americans.