Drivers have "poor understanding" of limits of self-driving car technology, IIHS finds

- Nathan Bomey, author of Axios Closer

Drivers are putting too much faith in systems designed only to help them navigate the road, according to an automotive safety group.
Driving the news: The Insurance Institute for Highway Safety reports that 53% of users of Cadillac's hands-free Super Cruise are comfortable treating it like a full self-driving system, while 42% of users of Tesla's semi-autonomous Autopilot system do the same.
- Both systems are designed to drive the vehicle on their own in certain conditions — but the driver must be ready to take control of the wheel at any time if the vehicle can't handle road conditions.
Threat level: "The big-picture message here is that the early adopters of these systems still have a poor understanding of the technology's limits," IIHS President David Harkey said in a statement.
- He added that "system design and marketing" may be contributing to the "misconceptions."
The other side: GM has noted that Super Cruise's eye-tracking technology requires drivers to keep their eyes on the road while the system is engaged.
- "GM believes driver engagement is critical and required to operate any advanced driver assistance system in any vehicle we sell," GM spokesman Philip Lienert said in an email, adding that "when the system detects the driver isn’t paying attention, a series of escalations will prompt the driver to reengage."
- Tesla did not respond to a request for comment, but CEO Elon Musk has repeatedly argued that Autopilot is safer than human drivers.
Be smart: "None of the current systems is designed to replace a human driver or to make it safe for a driver to perform other activities that take their focus away from the road," IIHS reports.
- "Track tests and real-world crashes have provided ample evidence that today’s partial automation systems struggle to recognize and react to many common driving situations and road features."