
Simulation has been critical to speeding up AV development, especially for motion planning and control algorithms. But in these simulations, vehicles either do not perceive the simulated world at all or perceive only scenarios that were previously encountered on the road.
Why it matters: With these approaches, engineers would need to gather vast amounts of on-road data to reach the edges of a perception module's capabilities. An additional layer of simulation for AV sensors and perception systems could instead use synthetic data to accelerate the development of AVs.
Background: AV simulations test different modules of the self-driving software using map data and information about obstacles on the road.
- In re-simulations, saved data from on-road drives can be fed back through the software. AV engineers can develop test scenarios to benchmark software performance and then compare the results after making updates (a minimal replay loop is sketched below).
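To make the idea concrete, here is a minimal sketch of such a replay loop in Python. The one-JSON-object-per-line log format and the `Stack` interface are hypothetical placeholders for illustration, not any real AV stack's API.

```python
# Hypothetical re-simulation loop: replay logged frames through the software
# under test and diff its outputs against a previously recorded baseline.
# The frame schema and the Stack interface are illustrative, not a real API.
import json

class Stack:
    """Stand-in for the AV software module being benchmarked."""
    def step(self, frame: dict) -> dict:
        # A real stack would run perception/planning here; this one just
        # summarizes the frame so the example is runnable end to end.
        return {"num_obstacles": len(frame.get("obstacles", []))}

def resimulate(log_path: str, baseline_path: str, stack: Stack) -> int:
    """Replay a drive log; return the count of frames whose output changed."""
    regressions = 0
    with open(log_path) as log, open(baseline_path) as base:
        for frame_line, base_line in zip(log, base):  # one JSON object per line
            output = stack.step(json.loads(frame_line))
            if output != json.loads(base_line):       # compare to saved result
                regressions += 1
    return regressions
```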
What's needed: Simulating sensor models as well could help in several areas.
- Optimizing sensor placement on vehicles. Positioning sensors and visualizing their output is easier and faster in software than with real hardware.
- Creating data sets to train for rare situations. Simulations can model any imaginable scenario, however unlikely.
- Feeding perception algorithms with synthetic sensor data. This allows engineers to continuously create new scenarios and test their algorithms, rather than waiting for new on-road data.
The catch: Sensor simulation for perception algorithms is difficult because each sensor type (lidar, radar, camera, infrared) comes with its own physics and level of modeling complexity.
- In the ideal case, a real hardware sensor's output could be reproduced purely in software. This requires modeling the underlying physics of each sensor and carefully constructing the 3D environment in which the synthetic sensors “operate.”
- Synthetic data could then be passed to perception algorithms in a form that they can’t differentiate from real data.
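As a toy illustration of what synthetic lidar data looks like, the sketch below casts one revolution of rays from a simulated spinning lidar against a single sphere and keeps the hit points as a point cloud. The beam pattern, scene, and numbers are invented for illustration; a production simulator would also model reflectance, noise, beam divergence, and a full 3D environment.

```python
# Toy synthetic lidar: cast one revolution of rays against a sphere obstacle
# and collect the intersections into a point cloud. Purely illustrative.
import numpy as np

def ray_sphere_hits(dirs, center, radius):
    """Distance along each unit ray (origin at 0) to a sphere; inf on miss."""
    b = dirs @ center                        # dot(d, c) per ray
    disc = b**2 - (center @ center - radius**2)
    t = b - np.sqrt(np.maximum(disc, 0.0))   # nearest intersection distance
    t[(disc < 0) | (t <= 0)] = np.inf        # misses, or hits behind the sensor
    return t

# Spinning-lidar beam pattern: 32 vertical beams x 1800 azimuth steps.
elev = np.deg2rad(np.linspace(-15, 15, 32))
azim = np.deg2rad(np.arange(1800) * 0.2)     # 0.2-degree horizontal resolution
az, el = np.meshgrid(azim, elev)
dirs = np.stack([np.cos(el) * np.cos(az),
                 np.cos(el) * np.sin(az),
                 np.sin(el)], axis=-1).reshape(-1, 3)  # unit direction vectors

t = ray_sphere_hits(dirs, center=np.array([10.0, 0.0, 0.0]), radius=1.0)
hit = np.isfinite(t)
cloud = dirs[hit] * t[hit, None]             # synthetic point cloud (N x 3)
print(f"{hit.sum()} returns from {len(dirs)} rays")
```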
These simulation platforms also have to create and render large, photorealistic environments in real time.
- Sensor tasks add to the load: Properly simulating radar or lidar can require casting millions of synthetic rays per second.
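A back-of-envelope budget for a single hypothetical spinning lidar shows how quickly that adds up (the figures below are illustrative, not any specific product's spec):

```python
# Illustrative ray budget for one hypothetical spinning lidar.
beams = 64           # vertical channels
azimuth_bins = 1800  # 0.2-degree horizontal resolution
spin_hz = 10         # revolutions per second
print(f"{beams * azimuth_bins * spin_hz:,} rays/s")  # 1,152,000
```

And an AV typically carries more than one such sensor, so the simulator's total load multiplies.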
The bottom line: Sensor simulation for perception is still a nascent practice, but it could fill an important gap in the AV software development cycle.
Chris Gundling is a senior software engineer at Applied Intuition, an AV simulation software company.