Photo illustration: Lorenzo Di Cola/NurPhoto via Getty Images
Digital civil rights group Access Now is sending a letter to Spotify CEO Daniel Ek imploring the company to abandon a technology it has patented to detect emotion, gender and age using speech recognition, Axios has learned.
Why it matters: While many of us in theory want our computers to understand who we are and what we want, the industry too often doesn't think through how its innovations will affect different kinds of people or what harm its collection of data can cause.
In its letter, Access Now says technology that aims to determine a person's mood and demographics based on their speech could be used to manipulate human emotion and is likely to lead to discrimination.
"This technology is dangerous, a violation of privacy and other human rights, and should be abandoned," Access Now says in its letter to Spotify, which was obtained by Axios.
Between the lines: Access Now highlights four areas of particular concern.
- Emotion manipulation: "Serious doubts have been raised about the scientific basis of emotion recognition technology and whether it works. While the majority of criticism has focused on inferring emotion using facial recognition systems, many of these criticisms apply equally to speech-based approaches."
- Gender discrimination: "You cannot infer gender without discriminating against trans and non-binary people. If you infer gender, according to a male-female binary from voice data, you will likely misgender trans people, and place non-binary people into a gender binary that undermines their identity."
- Privacy violations: "Based on reporting, the device would always be on, which means that it would be constantly monitoring, processing voice data, and likely ingesting sensitive information. ... No one wants a machine listening in on their most intimate conversations."
- Data security: "Harvesting this kind of data could make Spotify a target for third parties seeking information, from snooping government authorities to malicious hackers."
Our thought bubble: Information we give to companies for the sake of convenience becomes tough to claw back once they start using it in ways that make us unhappy.
- "Mood detection" and "emotional state" are particularly fuzzy categories fraught with both ethical and practical pitfalls.
Of note: Just because Spotify has received a patent doesn't mean the company intends to build or deploy the feature.