1 big thing: There are no self-driving cars (yet)
Thanks to advances in robotics and artificial intelligence, humans are on the cusp of being removed from the driver's seat. But as drivers are asked to do less, they are becoming more complacent — and complacency breeds danger.
Why it matters: Autonomous vehicles promise safer roads and more freedom for the poor, the elderly and the disabled. But they're not ready to drive themselves yet. Some people are relying too heavily on their car's automated features, resulting in avoidable crashes and dangerous incidents that threaten to undermine public confidence in self-driving cars.
What's happening: Safety features like blind-spot monitoring and backup cameras are giving way to more advanced driver-assist systems that can keep pace with the traffic flow and keep your car centered in its lane.
- The danger occurs as drivers become more comfortable with these convenience features and mistakenly believe their car can drive itself.
- The risk is compounded by misleading marketing names attached to the technologies — Tesla's Autopilot, Nissan's ProPilot Assist and Volvo's Pilot Assist, for example.
- The media also often describes the technology irresponsibly.
Even Tesla CEO Elon Musk is guilty of spreading misinformation on national TV: While demonstrating Tesla's new Navigate on Autopilot feature to "60 Minutes," he clasped his hands over his belly and told Lesley Stahl, "I'm not doing anything."
That's just wrong. Tesla repeatedly warns drivers — through on-screen warnings, driver manuals and public statements — that Autopilot is not an autonomous system and the driver must remain in control. Other automakers issue similar warnings. But many people aren't heeding the message.
- Investigations into 2 fatal Tesla crashes found that the humans behind the wheel were not paying attention or failed to respond to warnings to take control.
- A sleeping Tesla driver was able to cruise at 70 mph for 7 miles apparently on Autopilot before police stopped the car.
To address the confusion, SAE International, formerly called the Society of Automotive Engineers, is revising its definitions of the 6 levels of driving automation.
- The goal is to make it easier for people to distinguish between driver-support features (like lane-centering and adaptive cruise control) and automated-driving features (like a traffic-jam chauffeur or driverless taxis).
- The tricky and potentially dangerous features are those in-between systems (known as Level 3), which can drive the vehicle under limited conditions, but still require the driver to take over when needed.
- The problem: It can take 15 to 25 seconds for a zoned-out driver to regain control of the vehicle both physically and mentally, simulation studies show.
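To put that takeover lag in perspective, here is a bit of illustrative arithmetic (not from the cited simulation studies): how far a car travels while a zoned-out driver regains control. The 70 mph highway speed is an assumption chosen for illustration.

```python
MPH_TO_MPS = 0.44704  # meters per second per mile per hour

def distance_traveled(speed_mph: float, seconds: float) -> float:
    """Distance in meters covered at a constant speed over `seconds`."""
    return speed_mph * MPH_TO_MPS * seconds

# The 15-25 second takeover window from the simulation studies:
for takeover_s in (15, 25):
    meters = distance_traveled(70, takeover_s)
    print(f"{takeover_s} s at 70 mph ≈ {meters:.0f} m ({meters / 1609.344:.2f} miles)")
```

In other words, a driver who needs 15 to 25 seconds to re-engage has already traveled roughly a quarter to a half mile before being fully back in control.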
"Increasing autonomy might make driving boring but we're asking people to stay hyper alert in case you have to take over." — Elizabeth Walshe of the University of Pennsylvania and Children's Hospital of Philadelphia
The bottom line, from Jake Fisher, director of automotive testing at Consumer Reports: "Either you're riding in the car or you're driving the car. There’s a big difference. Just like you can't be semi-pregnant."
2. Redefining those autonomy standards
As mentioned above, SAE is updating its 2016 technical standards today, after its initial effort proved confusing to many.
Why it matters: These standards allow engineers and automakers to speak the same language as they develop self-driving cars, but they also communicate to consumers what to expect from more advanced driving systems and inform lawmakers trying to regulate AVs.
- Defining and communicating the capabilities and limitations of each level of autonomy is crucial for future safety.
Yes, but: When it comes to advanced technologies in a cutthroat industry where every company has its own agenda, finding common ground can be difficult.
- SAE has 560 technical committees creating standards for everything from advanced materials to brake linings.
- The committee tasked with defining levels of autonomy (called J3016) first convened 5 years ago and has been particularly contentious, insiders close to the process say.
- It includes representatives from major automakers, suppliers and AV tech players, all of whom have tried to shape the standards to favor their respective strategies.
In the end, the committee's 63 engineers agreed upon a new infographic that offers more “consumer-friendly” terms and definitions for each level, from zero automation to full automation.
- The chart answers "what does the human in the driver's seat have to do?" and offers examples of features at each level of automation and what they do.
- It will be shared with industry and government leaders as well as with the media.
- SAE's new language coincides with a series of public demonstrations of automated driving technology in Florida this week.
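The chart's question — "what does the human in the driver's seat have to do?" — can be sketched as a simple lookup table. The level names below are the standard J3016 ones; the one-line driver-role summaries are my paraphrases, not SAE's official consumer-friendly wording from the new infographic.

```python
# SAE J3016 levels of driving automation, keyed by what the human must do.
# Role descriptions are paraphrased for illustration.
SAE_LEVELS = {
    0: ("No Driving Automation", "Human drives; features only warn or briefly assist."),
    1: ("Driver Assistance", "Human drives; system steers OR controls speed."),
    2: ("Partial Driving Automation", "Human drives; system steers AND controls speed."),
    3: ("Conditional Driving Automation", "System drives in limited conditions; human must take over on request."),
    4: ("High Driving Automation", "System drives in limited conditions; no takeover needed there."),
    5: ("Full Driving Automation", "System drives everywhere, in all conditions."),
}

def human_role(level: int) -> str:
    """Answer 'what does the human have to do?' for a given level."""
    name, role = SAE_LEVELS[level]
    return f"Level {level} ({name}): {role}"

print(human_role(3))  # the "tricky and potentially dangerous" in-between level
```

Note how the handoff burden flips between Levels 2 and 3 — exactly the boundary the article flags as tricky.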
Like peace negotiations, setting engineering standards is a process of give and take, says Jack Pokrzywa, director of SAE Global Ground Vehicle Standards. "It is an amazing process that is somehow very painful."
3. Full AV rollout will require big 5G investment
Coming 5G networks should expedite advances in AV technology, but the required infrastructure investment is estimated at $130 billion to $150 billion — a significant hurdle that is holding AVs short of their potential, Raphael Gindrat, co-founder and CEO of Bestmile, writes for Axios Expert Voices.
The big picture: For AVs to be deployed, they'll need both onboard sensors to navigate their environment and 5G to talk to one another. But until there's a greater commitment to the technology's rollout, AVs will be developed without the benefits 5G could offer.
Background: 5G networks are expected to reduce latency — the delay between when information is sent and received — to 1 millisecond, roughly 50 times faster than 4G.
- 5G could also help with 360-degree vision and provide precise information about other vehicles' location, speed and intent.
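Why that millisecond matters: a moving vehicle keeps moving while data is in flight. The quick calculation below is illustrative only; it assumes 4G latency of about 50 ms (implied by the "50 times faster" figure) and a 70 mph vehicle.

```python
MPH_TO_MPS = 0.44704  # meters per second per mile per hour

def meters_per_latency(speed_mph: float, latency_ms: float) -> float:
    """Distance a vehicle covers during one network latency interval."""
    return speed_mph * MPH_TO_MPS * latency_ms / 1000.0

for label, ms in (("4G (~50 ms)", 50), ("5G (1 ms)", 1)):
    cm = meters_per_latency(70, ms) * 100
    print(f"{label}: a 70 mph vehicle travels {cm:.0f} cm before the message lands")
```

At 4G latency the car has moved more than a meter and a half before a vehicle-to-vehicle message arrives; at 5G latency, about the width of a hand.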
Yes, but: 5G transmission runs out of steam faster than 4G, and in trials to date its range has been limited to 500–3,000 feet. As a result, 5G may require up to 400,000 antennas across the U.S., more than twice the number of cell towers needed to support 4G networks.
- AT&T, Verizon, Sprint and T-Mobile all have plans to offer 5G service for cellphones in the first half of 2019, but a rollout sufficient to support AVs is still further off.
- That rollout will likely happen in Europe first, the 5G Automotive Association predicts, since European standards bodies are ahead of other regions.
What to watch: With 5G poised to expedite autonomous transportation, strategic investments will need to come from governments, carriers and chipmakers well in advance of significant revenue.
4. Driving the conversation
More Uber fallout: A company manager raised concerns about the self-driving program just days before a fatal collision. (Jon Porter — The Verge)
- Details: In an email to Uber execs, Robbie Miller, an operations manager from the autonomous group, wrote that accidents were usually caused by “poor behavior of the operator” in control of the vehicle. He also noted that “several of the drivers appear to not have been properly vetted or trained.”
- The big picture: Adding to previous reporting from Business Insider about Uber's woes, Miller’s complaints suggest that Uber’s self-driving fleet had expanded far too quickly for the company to safely handle.
Electric surge: Daimler's $23 billion should buy a lot of EV batteries. (Sebastian Blanco — Forbes)
- Why it matters: Between Mercedes, BMW and Volkswagen, Germany's automakers are going all-in on electric and autonomous vehicles. Diesels? What diesels?
Scooter fade: Investor frenzy for scooter startups is cooling down. (Eliot Brown, Greg Bensinger and Katie Roof — The Wall Street Journal)
- My thought bubble: A better investment might be automated trikes for senior citizens.
5. Spreading liability risks across AV supply chain
AV crashes will present automakers with complicated new questions about liability. To prepare for them, they will need to rethink their warranties and indemnities with suppliers, Jeff Soble, a partner at Foley & Lardner, writes for Axios.
Why it matters: An AV accident could be caused by anything from a programming bug to a mechanical failure. Given this complexity, automakers will need contracts that protect them from assuming liability for integrated software and hardware from their suppliers.
Background: Most supplier contracts protect automakers with warranties against defects in materials or workmanship. However, those may not offer much protection in the case of AVs, where the product under contract is likely to be software, or a part so deeply integrated that it's hard to say where its role ends and another's begins.
- Warranties will have to be tailored to the output and performance of software, rather than merely hardware.
- Indemnification provisions will have to address situations in which automakers integrate a supplier’s product with other parts and unifying software. This will require broader indemnification for system failures and assessments of the impact of one software program or product on the whole system.
- Insurance can be obtained by both automakers and suppliers, who can also add one another as additional insureds.
The bottom line: By adapting to these legal challenges now, in both supplier contracts and insurance needs, companies can better allocate risks across the supply chain and protect consumers from being left empty-handed if their AV causes an accident.
6. 1 road rage thing
Some people in Chandler, Arizona, apparently don't like Waymo testing its self-driving minivans in their neighborhood. And they're taking out their anger by throwing rocks, slashing tires and trying to force the vans off the road.
There have been at least 21 reports to Chandler police during the past 2 years of people harassing Waymo's AVs and their human test drivers, Ryan Randazzo reports in the Arizona Republic.
- One man brandished a handgun at a Waymo safety driver as the vehicle passed his driveway.
- Someone slashed the tire of a Waymo van while it was stopped in traffic.
- A Jeep driver tried to run Waymo vans off the road on a half dozen occasions.
What's behind the road rage? Maybe it's frustration over the slow-poke vans clogging their streets, or perhaps residents are worried robots will one day take their jobs, the paper speculates.
In any case, Waymo appears to be taking it in stride. Test drivers rarely pursue charges, and arrests are rare. Instead, drivers contact Waymo's dispatchers, who advise them to avoid the neighborhood next time.