Good morning! Thanks for reading. Please share this newsletter and tell your friends they can subscribe here. If you have tips or feedback just reply to this email.
Today Expert Voices contributor Chris Blumenberg offers strategies to divert shared AV fleets away from congested areas.
D.C. readers: You're invited to Mind the Skills Gap tomorrow at 8am. Join Axios' Kim Hart for a look into the role that policy, business and education leaders play in offsetting the skills gap. RSVP
1 big thing: Tesla and regulators scrutinized
Federal authorities leave it up to automakers to assess the safety of their own automated driving systems, but mounting investigations into Tesla crashes suggest regulators need to get tougher.
Why it matters: Tesla cars cannot drive themselves, but some owners are too trusting of their car's Autopilot assisted-driving technology and fail to stay alert.
- If the government finds Teslas are more prone to crashes than other vehicles with similar systems, it could determine Autopilot has a defect that poses "an unreasonable risk to safety" and order the company to conduct a recall.
What's happening: Two federal agencies are investigating a March 1 crash in which a man died when his Tesla Model 3 drove beneath a semitrailer that was crossing a Florida roadway near Delray Beach.
- Investigators for the National Transportation Safety Board and the National Highway Traffic Safety Administration want to find out whether Autopilot was engaged at the time of the crash.
- The accident is eerily similar to one that occurred in May 2016 near Gainesville, Florida.
- A Tesla spokesperson had no comment on the most recent incident.
Plus, Tesla is under the microscope for at least three other crashes in Florida and California.
What they're saying:
- "It seems like Tesla is the one car company that’s having troubles like this, despite the fact that they are not the only company with similar technology on the road," says former acting NHTSA Administrator David Friedman, now VP of advocacy for Consumer Reports.
- A NHTSA spokesperson confirmed there are no active defect investigations regarding other manufacturers' vehicles with automated driving technology.
- Tesla says its vehicles "are engineered to be the safest on the road," and it posts quarterly updates on its safety record, per its website.
Between the lines: NHTSA's own guidance says an unreasonable risk to safety may occur if a manufacturer fails to account for "any foreseeable misuse" of their technology by a driver who is distracted or inattentive.
Yes, but: NHTSA found no defect when it investigated the 2016 fatal crash in Florida.
- NTSB, on the other hand, found that Autopilot's design had contributed to the crash by allowing the driver to activate the system even on roads where it wasn't designed to operate safely and by failing to have a driver monitoring system to ensure he was alert.
- The catch: NTSB is an independent agency with no regulatory power.
- The power rests with NHTSA, but sources tell us the agency, currently under acting Administrator Heidi King, is coping with management churn and has been accused by consumer advocates of being too cozy with the industry.
"The agency responsible for seeing that this technology is shepherded safely has totally abdicated its responsibility to industry."— Jason Levine, executive director, Center for Auto Safety
What to watch: The facts of Tesla's crashes are still under investigation, but with the cases piling up, NHTSA could be under pressure to act.
Go deeper: Read the full story here.
2. Tesla's image tumbles in new survey
Tesla's public reputation took a big hit in 2018, a tumultuous year for the electric automaker marked by a series of controversial comments by CEO Elon Musk, a new Axios-Harris Poll survey shows, my colleague Ben Geman writes.
Why it matters: The drop in Tesla's ranking from number 3 to number 42 in the annual survey — the second-biggest decline after Facebook — shows the risks for companies with identities so closely aligned with one person.
"This is sort of a cautionary tale for when your CEO is a celebrity."— John Gerzema, CEO, The Harris Poll
The big picture: Musk's wild 2018 included...
- Baseless allegations of pedophilia against a man who helped rescue the Thai boys' soccer team.
- His "funding secured" tweet about the now-aborted plan to take Tesla private, which drew a complaint from securities regulators that Musk settled.
- Taking a drag from a cigarette with marijuana on "The Joe Rogan Experience."
By the numbers: Beyond the overall slide in the annual survey of the country's 100 most visible companies, Tesla also saw erosion in several of its metrics...
- "Character" ranking slid from 7 to 57.
- "Trust" ranking fell from 14 to 46.
- "Ethics" ranking dropped from 5 to 56.
- "Vision" went from 1 to 39.
But, but, but: Despite the tumble, Tesla's overall score remains higher than Ford's and GM's (see chart above).
Read more of Ben's full story on our key findings and the methodology for the Axios Harris Poll 100.
3. Routing software could help ease congestion, even before AVs
Cities are souring on the controlled chaos of current ride-hailing systems, and fleet management software could work in concert with policy and infrastructure to ease the resulting congestion, Chris Blumenberg writes for Axios Expert Voices.
Why it matters: If AVs make ride-hailing cheaper and more efficient, city traffic might slow even further — but fleet management systems could help cities manage the challenge, ideally before AVs are widely deployed.
- Inefficiencies in routing and communication mean vehicles are idling in city centers and taking congested paths between destinations.
- Ride-hailing was responsible for half of the slowdown in San Francisco's traffic from 2010 to 2016, its transportation authority estimates. A new Transport Policy study found that the deployment of 2,000 AVs in downtown San Francisco could bring traffic speeds down to 2 miles per hour.
Fleet management software could route vehicles around dense, highly trafficked areas on alternative routes.
- AVs in particular will need to be routed around areas with heavy traffic from the get-go, because once in traffic, they will likely remain on their programmed route.
- Software like the Transportation Mobility Cloud, Fleet Planner, and Auro can direct vehicles, and can intentionally disperse them to alleviate traffic on popular routes.
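The dispersal idea can be sketched in a few lines. The snippet below is a minimal, illustrative routing example (not how Transportation Mobility Cloud, Fleet Planner, or Auro actually work): Dijkstra's algorithm over a toy road graph, where each edge's cost is its base travel time multiplied by a congestion factor. Raising the factor on central links steers the planned route onto the perimeter before the vehicle ever enters traffic.

```python
import heapq

def shortest_path(graph, start, goal, congestion=None):
    """Dijkstra's algorithm over a weighted road graph.

    Edge cost = base travel time * congestion multiplier (default 1.0).
    graph: {node: {neighbor: base_travel_time}}
    congestion: {(node, neighbor): multiplier}
    """
    congestion = congestion or {}
    dist = {start: 0.0}
    prev = {}
    pq = [(0.0, start)]
    while pq:
        d, node = heapq.heappop(pq)
        if node == goal:
            break
        if d > dist.get(node, float("inf")):
            continue  # stale queue entry
        for nbr, base_time in graph.get(node, {}).items():
            cost = base_time * congestion.get((node, nbr), 1.0)
            nd = d + cost
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                prev[nbr] = node
                heapq.heappush(pq, (nd, nbr))
    # Walk back from goal to reconstruct the chosen route.
    path, node = [], goal
    while node != start:
        path.append(node)
        node = prev[node]
    path.append(start)
    return list(reversed(path))

# Toy downtown: A -> D either through the congested center (B)
# or along a longer perimeter route (C).
graph = {
    "A": {"B": 4, "C": 6},
    "B": {"D": 4},
    "C": {"D": 6},
    "D": {},
}

# Free-flowing traffic: the route through the center wins (8 vs. 12 minutes).
print(shortest_path(graph, "A", "D"))  # ['A', 'B', 'D']

# Flag the central links as jammed (3x travel time) and the planner
# disperses the vehicle onto the perimeter instead (12 vs. 24 minutes).
jam = {("A", "B"): 3.0, ("B", "D"): 3.0}
print(shortest_path(graph, "A", "D", congestion=jam))  # ['A', 'C', 'D']
```

Real fleet systems layer live traffic feeds and citywide demand prediction on top of this kind of cost model, but the core lever is the same: make congested links expensive before dispatch, rather than rerouting after the vehicle is stuck.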
Yes, but: Fleet management technology can't be deployed at scale without more accurate maps, directions that factor in a vehicle's capabilities, and idle vehicle management.
Go deeper: Read the full post.
Blumenberg is the co-founder and CTO of rideOS, which is developing transportation planning and mapping technologies.
4. Driving the conversation
- Where it stands: Prosecutors said there was “no basis for criminal liability for the Uber corporation” but said investigators should continue looking at the safety driver's actions. The crash led to an overhaul of Uber's testing procedures for self-driving vehicles.
Imitation: Is Tesla following Old GM's playbook? (Nick Bunkley — Automotive News)
- My thought bubble: Bunkley builds a convincing case for why Tesla is looking more and more like the bumbling automakers it once sought to disrupt.
Speed kills: Volvo to limit its cars' top speed to 112 mph (Steven Ewing — CNET)
- Why it matters: The move is part of the company's "Vision 2020" safety initiative, which aims for no one to be killed or seriously injured in a new Volvo by 2020.
- The company is even considering using “smart speed control and geofencing” to force cars to slow down in school zones and near hospitals.
Backlash: Tesla's promise of 'full-self-driving' angers autonomous vehicle experts (Matt McFarland — CNN)
- My thought bubble: Tesla continues to oversell the capabilities of its assisted-driving technology, which only breeds confusion among consumers. Regulators need to hold all companies to the same standards to ensure the technology is safely introduced.
5. 1 big problem
A new study suggests black people are more likely to get hit by an autonomous vehicle than white people, Vox writes.
Why it matters: The findings are the latest example of how human bias seeps into artificial intelligence. If AVs are trained mostly on images of light-skinned people as examples of what constitutes a "human," they will be less reliable at recognizing dark-skinned people as "human" in the real world.
Details: The study, by researchers at the Georgia Institute of Technology, tried to determine how accurately state-of-the-art object-detection models, like those used by self-driving cars, detect people from different demographic groups, Vox explains.
- Researchers divided a large dataset of images that contain pedestrians by skin tone.
- Then they compared how often the AI models correctly detected the presence of people in the light-skinned group versus how often they got it right with people in the dark-skinned group.
- Detection of dark-skinned people was 5 percentage points less accurate.
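The comparison the researchers ran boils down to simple per-group arithmetic. The numbers below are illustrative, not the study's actual data; they just show how a 5-percentage-point gap in detection accuracy is computed.

```python
# Hypothetical per-image detection results (1 = pedestrian detected, 0 = missed).
# These rates are made up for illustration; the study reports only the gap.
light_skin = [1] * 92 + [0] * 8    # 92% detected
dark_skin = [1] * 87 + [0] * 13    # 87% detected

def accuracy(results):
    """Fraction of images in which the pedestrian was correctly detected."""
    return sum(results) / len(results)

gap = accuracy(light_skin) - accuracy(dark_skin)
print(f"light-skinned: {accuracy(light_skin):.0%}, "
      f"dark-skinned: {accuracy(dark_skin):.0%}, "
      f"gap: {gap * 100:.0f} percentage points")  # gap: 5 percentage points
```

A few points' difference sounds small, but for a safety-critical system a systematically higher miss rate for one demographic group is exactly the kind of defect regulators and developers are expected to find and fix.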
The bottom line: AI, including the systems inside AVs, can be just as biased as its creators, and that bias needs to be addressed.
- Samantha Huang, a senior associate at BMW iVentures, wrote about the problem last fall after an AV test vehicle she was riding in failed to detect two pedestrians who were black.
- Had the engineers come from more racially diverse backgrounds, she wrote, they probably would have been less likely to train their algorithms only on images of light-skinned people.