Sep 12, 2017 - Energy & Environment

Tesla's safeguards 'lacking' in self-driving car crash

A Tesla Model S that crashed while operating in self-driving mode in Florida on May 7, 2016. Photo: Florida Highway Patrol via AP

Tesla allowed a driver to use automated controls outside of the conditions for which they were intended, leading to a crash that "never should have happened," the National Transportation Safety Board concluded from its investigation of a 2016 crash involving a partially automated Tesla. The board also found that the driver was overly reliant on the automated features.

Why this matters: The crash caused the first known highway fatality in a vehicle operated with automated controls, and the investigation produced new recommendations for how automated-vehicle makers are held accountable in crashes. NTSB Chairman Robert Sumwalt stressed that the Level 2 automated systems on the market today are intended to augment human drivers, not replace them, and that automakers need to do a better job of educating drivers.

"The safety potential of self-driving cars is staggering," Sumwalt said. "But it's a long road from partially automated vehicles to self-driving cars. And until we get there, someone has to drive."

Details: Last year, a 2015 Tesla Model S was traveling 74 miles per hour (above the 65 mph speed limit) just before crashing into a semitrailer truck. The Tesla was operating with its automated vehicle control features engaged. Although there was sufficient time for either driver to act to avoid the crash, there was no indication that the Tesla driver interacted with the vehicle in the two minutes before impact.

Investigation results:

  • NTSB said the probable cause of the crash was the truck driver's failure to yield to the Tesla, and the Tesla driver's over-reliance on automated features.
  • The Tesla Autopilot system functioned as designed, but it allowed the car to operate outside of the environment for which it was intended. For example, the Autosteer function is intended only for use on highways and limited-access roads, yet the crash happened on a four-lane road that was not limited-access.
  • Autopilot did not detect the truck that turned in front of the vehicle. NTSB said the collision-mitigation systems are not designed to detect crossing or turning traffic, like the truck's turn across the Tesla's path.
  • Although the truck driver was found to have used marijuana before the crash, NTSB was unable to determine whether, or to what extent, he was impaired at the time of the crash.

Among the recommendations:

  • Automated vehicles should automatically detect the types of roads they were designed for, and allow automated features to operate only under those conditions (a minimal sketch of this gating idea follows the list).
  • Automakers need to closely monitor and share data collected about incidents that occur in their automated vehicles, and they need to capture more details about the vehicle control systems during crashes.
  • Tesla Autopilot uses torque sensors that detect whether a driver is holding the steering wheel as a proxy for attention. NTSB recommends a more effective method for monitoring driver engagement, such as eye-tracking technology (see the sketch after this list).
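
To make the first and third recommendations concrete, here is a minimal Python sketch of how a vehicle might gate an automated feature by road type and combine a torque sensor with eye tracking to judge driver engagement. All names, thresholds, and signals below are illustrative assumptions, not Tesla's or NTSB's actual designs.

```python
# Hypothetical sketch of two NTSB ideas: gating automation by road type and
# monitoring driver engagement. Names and values are illustrative only.
from dataclasses import dataclass
from enum import Enum, auto


class RoadType(Enum):
    LIMITED_ACCESS_HIGHWAY = auto()  # e.g., an interstate with on/off ramps
    DIVIDED_HIGHWAY = auto()         # e.g., the four-lane road in this crash
    SURFACE_STREET = auto()


@dataclass
class VehicleState:
    road_type: RoadType          # assumed to come from map data / perception
    steering_torque_nm: float    # torque-sensor reading on the wheel
    eyes_on_road: bool           # assumed output of a camera-based eye tracker


# Roads the driver-assist feature was designed for (its operational domain).
ALLOWED_ROADS = {RoadType.LIMITED_ACCESS_HIGHWAY}

TORQUE_THRESHOLD_NM = 0.5  # illustrative value, not a real calibration


def autosteer_permitted(state: VehicleState) -> bool:
    """Allow the automated feature only on road types it was designed for."""
    return state.road_type in ALLOWED_ROADS


def driver_engaged(state: VehicleState) -> bool:
    """Torque alone (hands on wheel) is a weak proxy for attention; NTSB
    suggests stronger signals such as eye tracking, combined here."""
    hands_on = state.steering_torque_nm >= TORQUE_THRESHOLD_NM
    return hands_on and state.eyes_on_road


def control_loop_step(state: VehicleState) -> str:
    if not autosteer_permitted(state):
        return "refuse engagement: outside designed road conditions"
    if not driver_engaged(state):
        return "warn driver, then disengage if no response"
    return "autosteer active"


if __name__ == "__main__":
    # The crash scenario: a divided four-lane road that is not limited-access.
    crash_road = VehicleState(RoadType.DIVIDED_HIGHWAY, 0.0, False)
    print(control_loop_step(crash_road))  # -> refuse engagement: ...
```

Under this kind of check, the system would have declined to engage on the road where the crash occurred, and a torque reading alone would not have counted as driver attention.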

Bottom line: There are still significant limitations to self-driving technology, so drivers have to stay attentive when driving partially autonomous vehicles. And carmakers have work to do to ensure that automated vehicles of any level account for all the hazards posed by real-world driving.
