Seeking a common language for self-driving cars
- Joann Muller, author of Axios What's Next

Zoox is developing sound patterns that can be pinpointed toward individuals to indicate the intentions of its robotaxi. Photo: Zoox
Autonomous vehicle companies are exploring the use of a common language — standardized light patterns or sounds — that would help driverless cars communicate their intentions to humans.
Why it matters: Autonomous vehicles will share the road with human-driven vehicles, pedestrians and cyclists for a long time. The development of a standard communications method could build trust and reduce traffic accidents.
- Unlike today's drivers, AVs can't make eye contact with other road users or gesture to indicate it’s OK to cross the road.
What's happening: Argo.ai, the developer of a self-driving system, is urging fellow developers to adopt its newly released technical guidelines for safe interactions between robocars and bicyclists.
- The guidelines, created in collaboration with the League of American Bicyclists, urge AV companies to incorporate bike lanes into the maps their self-driving systems rely on, for example, and to model typical cyclist behavior — like lane-splitting or swerving around an open car door — into their algorithms.
- Self-driving cars should also be programmed to slow down and create extra space when it's unclear what a cyclist might do, per Argo's guidelines (a rough sketch of that kind of rule follows this list).
- "Roads have gotten significantly less safe for people outside of vehicles in the last decade," Ken McLeod, policy director at the League of American Bicyclists, said in a statement.
- "By addressing interactions with bicyclists now, Argo is demonstrating a commitment to the role of automated technology in reversing that deadly trend.”
Meanwhile, companies are also working toward a common signaling language for the vehicles themselves.
- Some companies are promoting what they see as best practices in the voluntary safety self-assessments of their self-driving technology they've filed with the National Highway Traffic Safety Administration.
- Ford Motor's safety report, for example, describes a white light bar mounted near the top of the windshield where a pedestrian or cyclist may look for cues from a human driver.
- Ford worked with researchers at the Virginia Tech Transportation Institute to develop signal patterns that indicate, for example, that an AV is making a pickup or drop-off, or that it recognizes another road user and is tracking their movement (a simplified sketch of such a signal set follows this list).
- Ford says it is leading an initiative to create a standard method for external visual communication.
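To make the idea concrete, here is an illustrative sketch of how a standardized set of external signals might be represented in software. The signal states and blink timings are assumptions for the example, not Ford's actual specification.

```python
# Illustrative mapping from high-level AV signals to light-bar patterns.
# States, pattern names and timings are hypothetical.

from enum import Enum, auto


class AVSignal(Enum):
    YIELDING = auto()            # AV is yielding to a pedestrian or cyclist
    PICKUP_DROPOFF = auto()      # AV is stopped for a pickup or drop-off
    TRACKING_ROAD_USER = auto()  # AV has detected and is tracking a road user
    DRIVING = auto()             # normal autonomous operation


# Each entry maps a signal to a light-bar behavior: a pattern name plus
# on/off timings in milliseconds (0 off-time means a steady light).
LIGHT_BAR_PATTERNS = {
    AVSignal.YIELDING: ("slow_pulse", 500, 500),
    AVSignal.PICKUP_DROPOFF: ("quick_blink", 200, 200),
    AVSignal.TRACKING_ROAD_USER: ("steady", 1000, 0),
    AVSignal.DRIVING: ("off", 0, 0),
}


def light_bar_command(signal: AVSignal) -> dict:
    """Translate a high-level signal into a light-bar command payload."""
    pattern, on_ms, off_ms = LIGHT_BAR_PATTERNS[signal]
    return {"pattern": pattern, "on_ms": on_ms, "off_ms": off_ms}


# Example: the AV pulls over for a rider pickup.
print(light_bar_command(AVSignal.PICKUP_DROPOFF))
```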
Zoox, which makes a custom-built robotaxi, is testing its own communication patterns using a variety of lights incorporated into its design.
- The vehicle, which has no front or back, also features 32 speakers that can aim sound in a specific direction to communicate with other road users (a minimal sketch of the underlying technique follows this list).
- "We can be more intentional in the way we communicate, so we can target specific users and make sure sounds are heard by the people necessary with the right tone," Riccardo Giraldi, Zoox's senior director of product experience, tells Axios.
- "Right now, there's just one sound — honking — which is annoying for cities," he said.
The big picture: America's roads are getting deadlier, with U.S. traffic deaths up 18% in the first half of 2021, due mostly to risky behavior like speeding, texting or driving while intoxicated.
- The U.S. is on track for more than 40,000 road fatalities in 2021 — "a crisis," according to Transportation Secretary Pete Buttigieg.
The bottom line: Just as everyone understands the meaning of red, yellow and green traffic lights, the AV industry will need to agree on standard ways for its vehicles to communicate.