Tesla's safeguards 'lacking' in self-driving car crash

A Tesla Model S that crashed while operating in self-driving mode in Florida on May 7, 2016. Photo: Florida Highway Patrol via AP

Tesla allowed a driver to use automated controls outside of the conditions for which they were intended, leading to a crash that "never should have happened," the National Transportation Safety Board concluded in its investigation of a 2016 crash involving a partially automated Tesla vehicle. The board also found that the driver was overly reliant on the automated features.

Why this matters: The crash caused the first known highway fatality in a vehicle operated with automated controls, and the investigation led to new recommendations for how automated-vehicle makers are held accountable in crashes. NTSB Chairman Robert Sumwalt stressed that the Level 2 automated systems on the market today are intended to augment human drivers, not replace them, but that automakers need to do a better job of educating drivers.

"The safety potential of self-driving cars is staggering," Sumwalt said. "But it's a long road from partially automated vehicles to self-driving cars. And until we get there, someone has to drive."