Sep 12, 2017

Tesla's safeguards 'lacking' in self-driving car crash

Kim Hart, author of Cities

A Tesla Model S crashed while driving in self-driving mode in Florida on May 7, 2016. Photo: Florida Highway Patrol via AP

Tesla allowed a driver to use automated controls outside of the conditions for which they were intended, leading to a crash that "never should have happened," the National Transportation Safety Board concluded in its investigation of a 2016 crash involving a partially automated Tesla vehicle. The board also found that the driver was overly reliant on the automated features.

Why this matters: The crash caused the first known highway fatality in a vehicle operated with automated controls, and the investigation led to new recommendations for how automated-vehicle makers are held accountable in crashes. NTSB Chairman Robert Sumwalt stressed that the Level 2 automated systems available on the market today are intended to augment human drivers, not replace them, but automakers need to better educate drivers.

"The safety potential of self-driving cars is staggering," Sumwalt said. "But it's a long road from partially automated vehicles to self-driving cars. And until we get there, someone has to drive."

Details: Last year, a 2015 Tesla Model S was traveling 74 miles per hour (above the 65 mph speed limit) just before crashing into a semitractor-trailer. The Tesla was operating with its automated vehicle control features engaged. Although there was sufficient time for either driver to act to avoid the crash, there is no indication that the Tesla driver interacted with the vehicle in the two minutes before the collision.

Investigation results:

  • NTSB said the probable cause of the crash was the truck driver's failure to yield to the Tesla, and the Tesla driver's over-reliance on automated features.
  • The Tesla Autopilot system functioned as designed, but it allowed the car to operate outside of the environment for which it was intended. For example, the Autosteer function is intended only for use on highways and limited-access roads. The crash, however, happened on a four-lane road that was not limited-access.
  • Autopilot did not detect the truck turning in front of the vehicle. NTSB said the collision-mitigation systems are not designed to detect crossing or turning traffic, such as the truck's maneuver.
  • Although the truck driver was found to have used marijuana prior to the crash, NTSB was unable to determine what impairment, if any, existed at the time of the crash.

Among the recommendations:

  • Automated vehicles should automatically detect the types of roads they were designed for, and allow automated features to operate only under those conditions.
  • Automakers need to closely monitor and share data collected about incidents that occur in their automated vehicles, and they need to capture more details about the vehicle control systems during crashes.
  • Tesla Autopilot uses torque sensors to detect whether a driver is holding the steering wheel, as an indication that the driver is paying attention. NTSB recommends a more effective method of monitoring driver engagement, such as eye-tracking technology.

Bottom line: There are still significant limitations to self-driving technology, so drivers still have to be attentive when operating partially autonomous vehicles. And carmakers have work to do to ensure that automated vehicles of any level account for all the hazards posed by real-world driving.
