Illustration: Sarah Grillo/Axios
Self-driving cars can be programmed to stay in their lane and obey speed limits. What they lack is human intuition — the ability to know what's going on inside someone else's head.
Why it matters: Autonomous vehicles are getting closer to reality, but if they're ever going to drive better than people, they need to learn how to share the road with other cars, pedestrians and cyclists. Turning them into social creatures with human instincts is arguably the biggest roadblock to autonomy.
Human drivers react to social cues all the time — and not just a wave of the hand or a nod from a pedestrian signaling for them to go ahead. People give off hundreds of signals that others use to understand their state of mind and respond accordingly.
- A person standing in a crosswalk looking at their phone is probably distracted, for example, so the driver knows to yield.
- A group of kids playing around on the sidewalk could tumble into the street, so caution is advised.
- A motorcyclist at an intersection puts their feet on the ground, a sign they're staying put, so it's probably safe to proceed.
Be smart: AVs need the same instincts so they can recognize, understand, and predict human behavior. But that ability, which comes so naturally to humans, is hard for machines to learn.
Driving the news: Perceptive Automata, a startup whose backers include Toyota and Hyundai, has a unique approach — it's applying behavioral science techniques to machine learning.
- Instead of training AI models by labeling objects (this is a tree; this is not a tree, for example), Perceptive Automata characterizes the way people understand others' state of mind and then trains its algorithms to recognize and predict human behavior.
How it works:
- Researchers collect sensor data from vehicles interacting with pedestrians, bicyclists, and other motorists and then chop up the images into smaller slices.
- Then they blur or cover a portion of each slice and ask groups of people what the depicted person is about to do.
- They repeat the process hundreds of thousands of times, with a variety of interactions, and use the results to train models that interpret the world the way people do.
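To make that crowd-labeling step concrete, here is a minimal sketch of how a masked image slice and a batch of rater answers might be turned into training data. The masking strategy, the answer options and the function names are illustrative assumptions, not Perceptive Automata's actual pipeline.

```python
import numpy as np

def mask_slice(image_slice: np.ndarray, frac: float = 0.4, seed: int = 0) -> np.ndarray:
    """Black out a random horizontal band of the crop so raters must judge
    intent from partial information (the masking scheme is an assumption)."""
    rng = np.random.default_rng(seed)
    h = image_slice.shape[0]
    band = max(1, int(h * frac))
    top = int(rng.integers(0, h - band + 1))
    masked = image_slice.copy()
    masked[top:top + band] = 0
    return masked

def aggregate_votes(votes: list, options=("will_cross", "wont_cross", "unsure")) -> np.ndarray:
    """Turn raw rater answers into a probability distribution over intents,
    preserving disagreement instead of collapsing it to one hard label."""
    counts = np.array([votes.count(o) for o in options], dtype=float)
    return counts / counts.sum()

# Example: 10 raters look at the same masked crop of a pedestrian.
votes = ["will_cross"] * 6 + ["wont_cross"] * 3 + ["unsure"]
print(aggregate_votes(votes))  # -> [0.6 0.3 0.1]
```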
Humans don't always agree, of course — predicting human behavior is not as easy as labeling a tree — but capturing that ambiguity is important for programming how an AV should behave, CEO Sid Misra says.
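One common way to preserve that ambiguity is to treat the distribution of human answers itself as the training target, rather than a single hard label. The PyTorch sketch below shows the general soft-label technique under that assumption; the tiny model and the intent categories are placeholders, not the company's implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Tiny stand-in network: a real system would use a CNN over the image crop.
model = nn.Sequential(
    nn.Flatten(),
    nn.Linear(3 * 64 * 64, 128),
    nn.ReLU(),
    nn.Linear(128, 3),  # 3 intent categories, e.g. will_cross / wont_cross / unsure
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

def soft_label_loss(logits: torch.Tensor, human_dist: torch.Tensor) -> torch.Tensor:
    """KL divergence between the model's predicted intent distribution and
    the distribution of human raters' answers, so disagreement among people
    becomes a soft target rather than noise to be discarded."""
    log_probs = F.log_softmax(logits, dim=-1)
    return F.kl_div(log_probs, human_dist, reduction="batchmean")

# One training step on a batch of masked crops and their rater distributions.
crops = torch.rand(8, 3, 64, 64)                  # placeholder image crops
human_dist = torch.tensor([[0.6, 0.3, 0.1]] * 8)  # e.g. 60% of raters said "will cross"
loss = soft_label_loss(model(crops), human_dist)
optimizer.zero_grad()
loss.backward()
optimizer.step()
```

A model trained this way outputs a graded estimate of intent ("60% likely to step into the road") instead of a binary call, which is closer to how the article describes interpreting the world the way people do.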
Yes, but: That also means self-driving cars could be prone to mistakes, notes Elizabeth Walshe, a cognitive neuroscientist who studies driving behavior at the Children's Hospital of Philadelphia and the University of Pennsylvania.
- "Humans are not perfect. Are we happy with machines that make mistakes and are not perfect?"
The bottom line: It will take time for people to learn to trust self-driving cars, but teaching those cars to understand what humans might be thinking is a good place to start.
Go deeper: Cars of the future need to be able to heal themselves