Apr 26, 2024 - Business

Drivers using Tesla Autopilot crashed hundreds of times, federal investigators say


The dashboard of the software-updated Tesla Model S P90D shows the icons enabling Tesla's Autopilot system. Photo: Chris Walker/Chicago Tribune/Tribune News Service via Getty Images

Federal auto safety investigators say in a new report that they've identified hundreds of crashes — 13 of them deadly — in which Tesla's Autopilot system failed to protect its drivers and passengers.

Between the lines: Tesla has faced scrutiny for years over how it describes the system's capability, with critics saying Autopilot promises more than it can deliver.

Driving the news: The National Highway Traffic Safety Administration's Office of Defects Investigations (ODI) flagged 467 crashes over a roughly 15-month period spanning 2022 and 2023.

  • The incidents resulted in 54 injuries and 14 deaths, ODI said.

Zoom in: The investigators said they identified three types of crashes:

  • 211 "collisions in which the frontal plane of the Tesla struck another vehicle or obstacle with adequate time for an attentive driver to respond to avoid or mitigate the crash."
  • 145 "roadway departures in low traction conditions such as wet roadways."
  • 111 "roadway departures where Autosteer" — Autopilot's automatic steering system — "was inadvertently disengaged by the driver's inputs."

Many drivers put too much faith in the system, ODI concluded, as "Autopilot controls did not sufficiently ensure driver attention and appropriate use."

  • "This mismatch of weak usage controls and high control authority was evident in these crash categories, which included indications of driver disengagement from the driving task," the investigators said in their report.

Our thought bubble, via Axios editor Alex Fitzpatrick: NHTSA's findings provide fuel for watchdogs who say semi-autonomy is less safe than driving by hand, because people don't understand the limitations or don't adequately monitor the system.

The other side: Tesla and CEO Elon Musk did not immediately respond to requests for comment, but Musk has repeatedly defended Autopilot, saying it's safer than human drivers.

What we're watching: Tesla issued a recall in December, delivering a software update in response to NHTSA's probe — but agency regulators are now investigating the effectiveness of that fix.

  • They have "concerns due to post-remedy crash events and results from preliminary NHTSA tests of remedied vehicles," according to documents released Friday.

The bottom line: There was "a critical safety gap between drivers' expectations" of Autopilot's capability "and the system's true capabilities," ODI said.
