Illustration: Sarah Grillo/Axios
China touted emotion recognition systems as a means of crime prevention at its 2019 Public Security Expo, the Financial Times reports (subscription), although experts say the tech doesn't work as advertised.
Reality check: "The science on emotion recognition is pretty bogus," ACLU senior policy analyst Jay Stanley tells Axios. A July study found that it is not possible to confidently assign emotional states to facial expressions "regardless of context, person, and culture" — "as much of current technology tries to do."
What's happening: China says it's rolling out the tech in Xinjiang, where Uighur Muslims are kept in mass detention camps, and in subway stations and airports to "identify criminal suspects," per the FT.
"At present only a few schools and public security bureaus have products that include this type of technology," Zhen Wenzhuang told the FT, adding that emotion recognition has "not been fully developed for commercial use" in China.
Between the lines: Even if the tech doesn't track emotions as advertised, being watched, or even thinking you're being watched, can still have a psychological effect and encourage people to change their behavior, as seen in workplace polling.
In the U.S., Microsoft claims that its Face API can identify emotions like contempt, happiness and disgust. Amazon's Rekognition points out that when its API identifies someone's facial expression, it "is not a determination of the person’s internal emotional state."
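To illustrate that distinction, here is a minimal Python sketch of calling Amazon Rekognition's DetectFaces API via boto3. The output is a list of expression labels with confidence scores, not a verdict on what the person actually feels; the image file name is hypothetical, and the sketch assumes AWS credentials are already configured.

```python
# Minimal sketch: detect apparent facial expressions with Amazon Rekognition.
# Per Amazon's own documentation, the "Emotions" field reflects the face's
# apparent expression, not the person's internal emotional state.
import boto3

rekognition = boto3.client("rekognition")  # assumes AWS credentials are configured

with open("face.jpg", "rb") as f:  # "face.jpg" is a hypothetical local image
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # "ALL" includes the Emotions attribute
)

for face in response["FaceDetails"]:
    # Each detected face carries expression labels with confidence scores,
    # e.g. HAPPY, SAD, ANGRY, CONFUSED, DISGUSTED, SURPRISED, CALM, FEAR.
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"]}: {emotion["Confidence"]:.1f}%')
```

Note that the API returns confidence scores across several labels rather than a single answer, which underscores researchers' point that an expression alone does not pin down an emotional state.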
Go deeper: AI is "awakening" surveillance cameras