
Illustration: Sarah Grillo/Axios
China touted emotion recognition systems as a means of crime prevention at its 2019 Public Security Expo, the Financial Times reports (subscription), though experts say the tech doesn't work as advertised.
Reality check: "The science on emotion recognition is pretty bogus," ACLU senior policy analyst Jay Stanley tells Axios. A July study found that it is not possible to confidently assign emotional states to facial expressions "regardless of context, person, and culture," even though that is what "much of current technology tries to do."
What's happening: China says it's rolling out the tech in Xinjiang, where Uighur Muslims are kept in mass detention camps, and in subway stations and airports to "identify criminal suspects," per FT.
"At present only a few schools and public security bureaus have products that include this type of technology," Zhen Wenzhuang told the FT, adding that emotion recognition has "not been fully developed for commercial use" in China.
Between the lines: Even if the tech doesn't actually read emotions, being watched, or merely believing you're being watched, can have a psychological effect and push people to change their behavior, as seen with workplace monitoring.
In the U.S., Microsoft claims its Face API can identify emotions like contempt, happiness and disgust. Amazon's Rekognition points out that when its API identifies someone's facial expression, it "is not a determination of the person's internal emotional state."
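For a sense of what these services actually return, here is a minimal sketch of querying Amazon Rekognition's face detection from Python. It assumes AWS credentials are already configured and a local photo named face.jpg (a hypothetical file used here for illustration). The output is a list of expression labels with confidence scores, not, as Amazon itself stresses, a reading of anyone's internal state.

```python
# Minimal sketch: asking Amazon Rekognition for facial "emotion" labels.
# Assumes AWS credentials are configured and face.jpg exists locally.
import boto3

client = boto3.client("rekognition")

with open("face.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # "ALL" includes the Emotions attribute
)

for face in response["FaceDetails"]:
    # Each entry is a guess about the facial *expression*, with a
    # confidence score; per Amazon, it is "not a determination of
    # the person's internal emotional state."
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"]}: {emotion["Confidence"]:.1f}%')
```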
Go deeper: AI is "awakening" surveillance cameras