Aurora offers an open book test for self-driving cars
If a person can pass a driver's test, they can get a license to operate a vehicle. But there is no corresponding test for autonomous vehicles.
Why it matters: Unless Congress acts, it'll be up to tech companies and carmakers — not the government — to determine when self-driving cars are safe for public roads. "Just trust us" isn't a viable answer to earn public acceptance.
What's happening: One self-driving tech company, Aurora, argues that publicly sharing its work — through a series of layered safety claims along with detailed evidence to back up each one — is the best way to determine when the technology is safe.
- This "safety case framework" is a structured argument that gives engineers a roadmap for developing the tech while also offering much-needed transparency to the public.
- "It's like saying you're going to climb a mountain, but you don't know how high the mountain is or how many steps it will take to get there," explains Nat Beuse, Aurora's vice president of safety. "The safety case tells us how high it is and how many steps it will take to make the ascent."
Between the lines: The approach is also more meaningful, Beuse says, than other proxies for AV safety, such as counting how many times a backup safety driver had to take control during testing (California's so-called "disengagement reports") or how many millions of road miles an AV developer logs (the basis for Waymo's leadership claim).
Of note: Beuse, a former official at the U.S. Department of Transportation, was instrumental in establishing a new approach toward safety at Uber's autonomous vehicle unit after one of its self-driving cars killed a pedestrian in 2018.
- Aurora acquired the Uber unit in January.
- Other safety-critical industries, including aviation, nuclear power and medicine, also use safety case-based approaches to assess their performance.