Illustration: Lazaro Gamio / Axios
Artificial intelligence algorithms can distribute resources more efficiently and, in theory, offer more for everyone.
Yes, but: If we aren't careful, these same algorithms could actually lead to greater discrimination by codifying the biases that exist both overtly and unconsciously in human society. What's more, the power to make these decisions lies in the hands of Silicon Valley, which has a decidedly mixed record on spotting and addressing diversity issues in its midst.
Airbnb's Mike Curtis put it well when I interviewed him this week at VentureBeat's MobileBeat conference:
"One of the best ways to combat bias is to be aware of it. When you are aware of the biases then you can be proactive about getting in front of them. Well, computers don't have that advantage. They can't be aware of the biases that may have come into them from the data patterns they have seen."
Dig deeper: It also matters what the algorithms are optimizing for. Airbnb, broadly, is training its algorithms to learn which factors are most likely to lead to a positive experience for guests when they book. However, a customer with a racial bias, for example, may be more satisfied when shown only white hosts. To further Airbnb's goal of an open, non-discriminatory platform, the company has to recognize this issue, choose to prioritize non-discrimination, and program accordingly.
Concern is growing:
- The ACLU has raised concerns that age, sex, and race biases are already being codified into the algorithms that power AI.
- ProPublica found that a computer program used in various jurisdictions to inform parole decisions tended to go easy on white offenders while being unduly harsh on black ones.
- It's an issue that Weapons of Math Destruction author Cathy O'Neil raised in a popular talk at the TED conference this year. "Algorithms don't make things fair," she said. "They automate the status quo."