Lidar, sensing what's in its path. Screenshot: VayaVision
Most driverless vehicles rely on a clutch of sensors — radar, cameras and especially a 3D, 360-degree viewing technology called lidar, which is that big spinning thing you see atop test vehicles.
Why it matters: A big problem, in addition to making the sensors better and cheaper, has been unifying all their feeds into a single stream of information and acting reliably on what they indicate. If they could be fused in a sensible way, that could compensate for flaws in the individual sensors.
Ronny Cohen, CEO of Israel-based VayaVision, is fusing the feeds of all the sensors and determining — using machine learning — what obstacles surround the vehicle or lie in its path.
- "We combine all of them into a unified image. We want to know where each pixel is in its space and its velocity," Cohen tells Axios.
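The idea Cohen describes — a unified image in which every pixel carries a position and a velocity — can be illustrated with a toy sketch. The code below is an assumption-laden simplification, not VayaVision's actual algorithm: it upsamples a handful of sparse lidar returns onto a dense camera pixel grid by nearest-neighbor assignment, so each pixel ends up with a depth and velocity estimate.

```python
import numpy as np

def fuse_to_unified_image(height, width, lidar_px, lidar_depth, lidar_vel):
    """Toy sensor fusion: give every camera pixel the depth and velocity
    of its nearest lidar return (nearest-neighbor upsampling).

    lidar_px    -- (N, 2) array of (row, col) pixel coordinates of lidar returns
    lidar_depth -- (N,) depths in meters for each return
    lidar_vel   -- (N,) radial velocities in m/s for each return
    """
    ys, xs = np.mgrid[0:height, 0:width]
    pixels = np.stack([ys.ravel(), xs.ravel()], axis=1)        # (H*W, 2)
    # Euclidean distance from every pixel to every lidar return
    dists = np.linalg.norm(pixels[:, None, :] - lidar_px[None, :, :], axis=2)
    nearest = dists.argmin(axis=1)                             # closest return per pixel
    depth = lidar_depth[nearest].reshape(height, width)
    vel = lidar_vel[nearest].reshape(height, width)
    return depth, vel

# Three sparse lidar returns on a 4x6 camera image (illustrative values)
lidar_px = np.array([[0, 0], [3, 5], [2, 2]])
lidar_depth = np.array([10.0, 42.0, 25.0])
lidar_vel = np.array([0.0, -1.5, 3.0])

depth, vel = fuse_to_unified_image(4, 6, lidar_px, lidar_depth, lidar_vel)
print(depth[0, 0], depth[3, 5], vel[2, 2])  # 10.0 42.0 3.0
```

In a real system the projection from 3D lidar points into the camera frame, sensor calibration, and the learned fusion model would replace the nearest-neighbor step, but the output shape is the same: a dense per-pixel map of position and velocity.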