It has finally happened: a self-driving car has hit and killed a pedestrian, the scenario everyone in and out of the industry knew was coming. Uber, which operated the car in question, has put its self-driving car effort on hold in three cities, and the NTSB is investigating.
So now what? There are two big mistakes that we could make as a society in dealing with self-driving cars.
- The first would be not to recognize that giving the keys to a machine is a big deal and to think we can let the industry proceed without any guidelines or regulations.
- The second, and in my opinion the more dangerous, would be to dramatically slow the development of self-driving car technology just because it isn't yet 100 percent perfect. Human drivers, after all, kill tens of thousands of people every year in the U.S. alone.
The real challenge, then, is to set some sort of safety standard for determining when the technology is "good enough." How much more should we demand of autonomous vehicles — should they be twice as good as human drivers, 10x better, 100x?
Yellow light: There were lots of calls on Monday to slow down, including from the head of the well-regarded self-driving car lab at Carnegie Mellon. Opposition could kill an already stalled bill in Congress that would allow testing of self-driving cars to proceed with minimal oversight from the NHTSA.
- "This crash should be a clear wake-up call for Congress to halt this flawed legislation and add desperately-needed minimum performance requirements and safety standards," said Cathy Chase, president of Advocates for Highway and Auto Safety.
- Sen. Richard Blumenthal (D-Conn.) told reporters on Monday that "[i]t should change the calculus, it should change the dynamic to make people more aware of how risky these vehicles potentially are. They are extraordinarily dangerous at this point. Moving them to the roads hastily or recklessly could be the consequence of this bill."