A new study suggests black people are more likely to get hit by an autonomous vehicle than white people, Vox writes.
Why it matters: The findings are the latest example of how human bias seeps into artificial intelligence. If AVs are trained with data that includes only light-skinned people as examples of what constitutes a "human," they won't recognize dark-skinned people as also "human" in the real world.
Details: The study, by researchers at the Georgia Institute of Technology, tried to determine how accurately state-of-the-art object-detection models, like those used by self-driving cars, detect people from different demographic groups, Vox explains.
- Researchers divided a large dataset of images that contain pedestrians by skin tone.
- Then they compared how often the AI models correctly detected people in the light-skinned group versus people in the dark-skinned group (a comparison sketched in code after this list).
- Detection of dark-skinned people was 5 percentage points less accurate.
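The comparison boils down to measuring detection recall separately for each skin-tone group and looking at the gap. The sketch below is a minimal, hypothetical illustration of that idea, not the study's actual code: it assumes ground-truth pedestrian boxes annotated with a "light"/"dark" group label (loosely mirroring the Fitzpatrick-scale grouping the researchers used) and a set of predicted boxes from some detector.

```python
# Hypothetical sketch: comparing pedestrian-detection recall across skin-tone groups.
# Assumes ground-truth boxes carry a skin-tone group label ("light"/"dark").

from collections import defaultdict

def iou(box_a, box_b):
    """Intersection-over-union of two (x1, y1, x2, y2) boxes."""
    x1 = max(box_a[0], box_b[0])
    y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2])
    y2 = min(box_a[3], box_b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

def recall_by_group(ground_truth, predictions, iou_threshold=0.5):
    """ground_truth: list of (box, group) pairs; predictions: list of boxes.
    A labeled pedestrian counts as detected if any predicted box overlaps it
    with IoU >= iou_threshold. Returns per-group detection recall."""
    detected = defaultdict(int)
    total = defaultdict(int)
    for gt_box, group in ground_truth:
        total[group] += 1
        if any(iou(gt_box, pred) >= iou_threshold for pred in predictions):
            detected[group] += 1
    return {g: detected[g] / total[g] for g in total}

# Toy example: two light-skinned and two dark-skinned pedestrians, three detections.
ground_truth = [
    ((10, 10, 50, 100), "light"),
    ((60, 10, 100, 100), "light"),
    ((120, 10, 160, 100), "dark"),
    ((170, 10, 210, 100), "dark"),
]
predictions = [(12, 12, 52, 98), (58, 8, 102, 102), (122, 12, 158, 95)]
print(recall_by_group(ground_truth, predictions))
# {'light': 1.0, 'dark': 0.5} -- a gap like this, aggregated over a large dataset,
# is the kind of accuracy disparity the study reports.
```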
The bottom line: AI systems, including those in AVs, can be just as biased as their creators, and that bias needs to be addressed.
- Samantha Huang, a senior associate at BMW iVentures, wrote about the problem last fall after an AV test vehicle she was riding in failed to detect 2 pedestrians who were black.
- Had the engineers behind the system come from more racially diverse backgrounds, she wrote, they probably would have been less likely to train their algorithms only on images of light-skinned people.
Go deeper: Humans cause most self-driving car accidents