Helping autonomous vehicles develop a "sixth sense"
- Joann Muller, author of Axios What's Next

Illustration: Aïda Amer/Axios
Today's autonomous vehicles use a combination of cameras and other sensors to try to replicate human perception, but they still struggle, especially in bad driving conditions. New technologies under development could help fill the gaps by letting vehicles "see" and "feel" things from different perspectives.
Why it matters: The race to develop fully self-driving cars has hit a series of hurdles, with many industry experts now predicting it could be a decade or longer before their full-scale deployment.
- That has inspired dozens of startups working on lidar and other sensing technologies, each pursuing a breakthrough they hope will finally make AVs viable.
Where it stands: Most AVs currently in development rely on a combination of 3 or 4 key sensor technologies, plus high-definition maps, to try to replicate the way human drivers perceive the world. Each has its own strengths and weaknesses (a simplified fusion sketch follows the list):
- High-def cameras are good at identifying objects, but struggle in poor visibility like rain or fog.
- Radar can detect objects far away, even in bad weather, but at low resolution, so it can't always tell what those objects are.
- Lidar uses laser beams to create a precise 3D image of the car's surroundings, but its performance can degrade in heavy rain, snow or fog, as well as under certain lighting conditions.
- Ultrasonic sensors can detect nearby objects and gauge their distance, but only within a few meters.
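To make the tradeoff concrete, here is a minimal, hypothetical Python sketch of condition-aware sensor fusion. The sensor names, penalty weights and averaging rule are illustrative assumptions, not any AV developer's actual stack, which would use probabilistic fusion per tracked object.

```python
# Hypothetical sketch: weighting sensor confidence by driving conditions.
# Sensor names and penalty values are illustrative only, not a vendor's stack.

CONDITION_PENALTY = {
    # How much each sensor's confidence degrades per condition (assumed values).
    "camera":     {"clear": 1.0, "rain": 0.6, "fog": 0.4, "snow": 0.5},
    "radar":      {"clear": 0.8, "rain": 0.8, "fog": 0.8, "snow": 0.7},
    "lidar":      {"clear": 1.0, "rain": 0.7, "fog": 0.5, "snow": 0.5},
    "ultrasonic": {"clear": 0.9, "rain": 0.9, "fog": 0.9, "snow": 0.8},
}

def fused_confidence(detections: dict[str, float], condition: str) -> float:
    """Combine per-sensor detection confidences (0..1) into one score,
    down-weighting sensors known to struggle in the current condition."""
    weighted = [
        conf * CONDITION_PENALTY[sensor][condition]
        for sensor, conf in detections.items()
    ]
    # Simple average of the weighted confidences; a real stack would use
    # probabilistic fusion (e.g. a Bayesian or Kalman filter) per object.
    return sum(weighted) / len(weighted)

# Example: the camera barely sees a pedestrian in fog, but radar is confident.
print(fused_confidence({"camera": 0.3, "radar": 0.9, "lidar": 0.5}, "fog"))
```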
What’s happening now: Startups are working to add new layers of sensing that help fill those gaps.
WaveSense specializes in ground-penetrating radar that creates subterranean maps to help vehicles navigate through snow, rain and fog.
- The MIT spinoff essentially creates a fingerprint of the roadways by mapping and tracking unique geologic patterns underground.
- Above ground, the landscape is constantly changing. Below the surface, the geology is essentially static, so the maps stay reliable and help AVs find the road even in a driving snowstorm (a toy localization sketch follows these bullets).
- Originally targeted for military purposes, WaveSense's technology is now being tested by several AV developers and suppliers, and CEO Tarik Bolat tells Axios he expects to announce production partners later this year.
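The general idea can be pictured as a fingerprint lookup: compare the live subsurface scan against the stored map and pick the position that matches best. The 1-D correlation sketch below is an illustrative assumption, not WaveSense's actual algorithm.

```python
# Hypothetical sketch of fingerprint-style localization: match the current
# ground-penetrating radar scan against a pre-built subsurface map.
# Toy 1-D illustration only; array sizes and noise levels are assumptions.
import numpy as np

rng = np.random.default_rng(0)

# "Map": one subsurface reflection profile stored for each position along a lane.
road_map = rng.normal(size=(500, 64))          # 500 positions x 64 radar bins

def localize(scan: np.ndarray, subsurface_map: np.ndarray) -> int:
    """Return the map index whose stored profile best matches the live scan
    (normalized correlation), i.e. the vehicle's estimated position."""
    m = subsurface_map - subsurface_map.mean(axis=1, keepdims=True)
    s = scan - scan.mean()
    scores = (m @ s) / (np.linalg.norm(m, axis=1) * np.linalg.norm(s) + 1e-9)
    return int(np.argmax(scores))

# Simulate driving over position 217 in a snowstorm: the live scan is the
# stored profile plus noise, because the geology below hasn't changed.
live_scan = road_map[217] + rng.normal(scale=0.3, size=64)
print(localize(live_scan, road_map))           # -> 217 (or very close)
```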
Tactile Mobility's software helps cars "feel" the road by generating real-time data about physical factors like traction and pressure that affect the ride, then feeding that data into proprietary algorithms to optimize the car's performance (a simplified sketch follows below).
- Human drivers feel how a car handles through their seat and steering wheel. It's what enables us to steer out of a spin on icy pavement, for example, or recognize when we're hydroplaning.
- The Israeli startup wants AVs to "feel" the bumps, curves and potholes under their tires, just as humans do.
- Ford is among the automakers currently testing the software.
- The cloud-based data can also be valuable to municipalities and fleet managers by identifying up-to-the-minute road conditions, CEO Amit Nisenbaum tells Axios.
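A rough way to picture that "feel" is a grip score inferred from signals the car already produces, such as wheel speeds and lateral acceleration. The formula and thresholds below are illustrative assumptions, not Tactile Mobility's proprietary model.

```python
# Hypothetical sketch: inferring road "feel" from existing vehicle signals.
# The expected-response factor and weights are assumed for illustration.
from dataclasses import dataclass

@dataclass
class VehicleSignals:
    wheel_speed_mps: float    # average driven-wheel speed
    body_speed_mps: float     # vehicle speed over ground (e.g. GPS/IMU)
    lateral_accel_g: float    # how hard the car is actually turning
    steering_angle_deg: float # how hard the driver (or AV) is asking it to turn

def grip_estimate(sig: VehicleSignals) -> float:
    """Return a rough 0..1 grip score: ~1.0 = dry pavement, near 0 = ice.
    High wheel slip, or less lateral response than the steering asks for,
    both suggest the tires are losing traction."""
    slip = abs(sig.wheel_speed_mps - sig.body_speed_mps) / max(sig.body_speed_mps, 0.1)
    expected_lat = abs(sig.steering_angle_deg) * 0.02   # toy expected response
    lateral_deficit = max(0.0, expected_lat - abs(sig.lateral_accel_g))
    score = 1.0 - min(1.0, 2.0 * slip + 3.0 * lateral_deficit)
    return max(0.0, score)

# Hydroplaning-like case: wheels spin faster than the car moves, and the car
# turns less than the steering input would suggest -> low grip score.
print(grip_estimate(VehicleSignals(22.0, 20.0, 0.05, 10.0)))
```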
But, but, but: As more sensors and software are added to an already complex machine, the number of potential failure modes increases exponentially, making AV systems more challenging to validate, says Sam Abuelsamid, a senior research analyst at Navigant Research.
The bottom line: Redundant sensing could ultimately make AVs safer, but meticulous work remains to validate and perfect these systems.
Go deeper: The true cost of autonomous vehicles