
It's finally happened: a self-driving car has hit and killed a pedestrian, a scenario everyone in and out of the industry knew was coming. Uber, which operated the car in question, has suspended its self-driving car program in three cities, and the NTSB is investigating.
So now what? The crash raises a host of questions that need answers, and it highlights two big mistakes we could make as a society in dealing with self-driving cars:
- The first would be failing to recognize that handing the keys to a machine is a big deal, and assuming we can let the industry proceed without any guidelines or regulations.
- The second, and in my opinion more dangerous, would be to dramatically slow the development of self-driving technology just because it isn't yet 100 percent perfect. Human-driven cars kill lots of people.
The real challenge, then, is to set some sort of safety standard for determining when the technology is "good enough." How much more should we demand of autonomous vehicles — should they be twice as good as human drivers, 10x better, 100x?
First, let's ask how safe self-driving cars actually are. Right now it's hard to say, since so few are on the road. Zendrive CEO Jonathan Matus told Axios:
"If human drivers tend to see a fatal crash every 100 million miles driven, then total of miles driven by all AV training efforts, combined to date (estimated less than 10-20 million) is still pretty low to even begin to get a sense for how safe they are relative to humans."
"A fatality so early certainly is a sad and a bad sign, but the total miles driven are so low and these are such rare events that tough to get a sense of how safe or unsafe the tech is compared to humans."
Yellow light: There were lots of calls on Monday to slow down, including from the head of the well-regarded self-driving car lab at Carnegie Mellon. The opposition could kill an already stalled bill in Congress that would allow self-driving car testing to proceed with minimal oversight from the NHTSA.
- "This crash should be a clear wake-up call for Congress to halt this flawed legislation and add desperately-needed minimum performance requirements and safety standards," said Cathy Chase, president of Advocates for Highway and Auto Safety.
- "The death of this pedestrian simply is a red flag, a sign that we should slow down and make sure that safety is given priority," Sen. Richard Blumenthal (D-Conn.) told reporters on Monday. Asked whether it would change the likelihood of a floor vote "It should change the calculus, it should change the dynamic to make people more aware of how risky these vehicles potentially are. They are extraordinarily dangerous at this point. Moving them to the roads hastily or recklessly could be the consequence of this bill."
Worth remembering: Human-driven cars kill more than 30,000 people in the U.S. each year, and humans are responsible for most crashes that involve self-driving cars.
Who pays? Then, of course, in our litigious society there's the question of who is liable.
- There are a few options, but it seems like the wisest might be to make the creators of the self-driving car technology liable (barring some type of operator negligence) — that way they have a financial incentive to make the safest possible machines.
- It's another variant of the classic sci-fi question: Who do you hold responsible when well-intentioned AI goes bad?