Report: Amazon's AI recruiter favored men
An algorithmic recruiter meant to help Amazon find top talent was systematically biased against women, a Reuters investigation found.
Why it matters: This is a textbook example of algorithmic bias. By learning from and emulating human behavior, a machine ended up as prejudiced as the people it replaced.
The details: Amazon's experimental system, which dates back to 2014, was trained on 10 years of job applications submitted to the company, most of which came from men, reports Reuters' Jeffrey Dastin.
- The system concluded that men were better candidates for technical jobs.
- In 2015, Amazon began to realize that the system was penalizing resumes that included the word "women’s" (as in a women’s sports team or an all-women’s college). The company removed the negative weights on those words, but couldn’t be certain that other, similar problems wouldn’t crop up.
- Reuters reported that Amazon recruiters looked at the system’s recommendations but didn’t rely on them entirely.
- An Amazon spokesperson disputed this, saying, “This was never used by Amazon recruiters to evaluate candidates.”
- The company dissolved the team in charge of the system early last year, in part because the system was not surfacing useful candidates.
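The core failure is easy to illustrate. A model trained on historically male-skewed hiring outcomes can assign a negative weight to any word that correlates with rejected applicants, even when that word says nothing about skill. Below is a minimal sketch with entirely hypothetical toy data, using naive smoothed log-odds as the per-word weight (real resume screeners are far more complex, and this is not Amazon's actual method):

```python
# Illustrative sketch only: how training on skewed hiring data can
# produce a negative weight for a word like "women's".
import math
from collections import Counter

# Hypothetical training set: (resume tokens, hired?) pairs in which
# past hires skew male, so "women's" co-occurs only with rejections.
training = [
    (["chess", "club", "java"], True),
    (["java", "aws", "soccer"], True),
    (["women's", "chess", "club", "java"], False),
    (["aws", "python"], True),
    (["women's", "soccer", "python"], False),
]

hired, rejected = Counter(), Counter()
for tokens, was_hired in training:
    (hired if was_hired else rejected).update(set(tokens))

def token_weight(token, smoothing=1.0):
    """Smoothed log-odds of being hired given the token appears."""
    return math.log((hired[token] + smoothing) / (rejected[token] + smoothing))

# "women's" appears only on rejected resumes, so its weight is negative --
# the model has learned the historical bias, not candidate quality.
print(token_weight("java") > 0)      # a neutral skill word scores positive
print(token_weight("women's") < 0)   # the penalized word scores negative
```

Manually zeroing one such weight, as Amazon did, doesn't help much: any other token correlated with gender (a college name, a sport) can carry the same signal.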
What’s next: Many large companies — including Goldman Sachs and Hilton — already use AI in their recruiting process, and the list will only grow.
- Still, companies hope that properly trained AI can not only avoid algorithmic bias but also correct for human recruiters’ prejudices.