Illustration: Sarah Grillo/Axios
There's mounting evidence that people put too much trust in driver-assistance features like Tesla Autopilot, but federal regulators aren't doing enough to ensure the systems are deployed safely, experts say.
Why it matters: Nearly 37,000 Americans die each year in highway accidents. As automated features become more common, the roads could get more dangerous — not safer — if drivers use the technology in unintended ways.
Driving the news: The National Transportation Safety Board this week slammed Tesla and the federal government for failing to prevent "foreseeable abuse" of Tesla's Autopilot technology, which the board found contributed to a fatal crash in California in 2018.
- The driver, an Apple engineer, was playing a video game on his phone when his Tesla Model X, operating on Autopilot, steered itself into a highway barrier at 71 miles per hour, the NTSB investigation concluded.
- It was another example of a distracted Tesla driver being killed while using Autopilot, although NTSB members emphasized that drivers in any car equipped with similar technology could become complacent or distracted.
The agency issued nine safety recommendations, including the installation of driver monitoring systems and lock-out devices to prevent cell phone use while driving. It also said companies like Apple should adopt policies to prevent distracted driving by employees.
But its harshest criticism was reserved for Tesla and the National Highway Traffic Safety Administration.
- NTSB criticized Tesla for not restricting where Autopilot can be used, despite its known limitations, and said the company should evaluate the system to determine if it poses an "unreasonable risk to safety."
- The agency also called NHTSA's hands-off regulatory approach to driver-assistance technology "misguided" because the government is waiting for problems to occur rather than addressing safety issues proactively.
Yes, but: NTSB is an independent federal agency with no enforcement powers. Tesla and NHTSA have ignored its recommendations in the past.
What they're saying: NHTSA issued a statement saying it would "carefully review" the NTSB findings, but noted that states hold drivers responsible for vehicle operations.
- "If NHTSA doesn’t want to get its hands around this, we’re only going to see more problems," said Jason Levine, executive director of the Center for Auto Safety, who called NHTSA's lack of leadership "the real villain" in this story.
The bottom line, from NTSB Chairman Robert Sumwalt:
"There is not a vehicle currently available to US consumers that is self-driving. Period. Every vehicle sold to US consumers still requires the driver to be actively engaged in the driving task, even when advanced driver assistance systems are activated."
Go deeper: Tesla safety probes bring scrutiny for regulators too