What self-driving cars can learn from military drones

The U.S. used new LUCAS attack drones against Iran. Photo: Department of War
Self-driving car companies could learn a lot by studying the early mistakes of combat drones.
Why it matters: Autonomous vehicles, like military drones, need remote supervisors to support their operations. But managing robots from afar can sometimes be problematic, especially in a crisis.
- Communication delays, or a stressed remote operator, could risk human lives, says Missy Cummings, a former Navy fighter pilot who later became an expert in autonomous vehicles.
Inside the room: Cummings, now a George Mason University robotics professor, worked in flight operations for the Navy in the early 1990s during the Gulf War.
- Later, as a PhD student, she was one of the first researchers to examine how to improve remote supervision of unmanned aerial vehicles (UAVs).
- Sitting "shoulder-to-shoulder in tiny trailers" with drone operators, she looked for ways to improve remote supervision of UAVs, including those flying halfway around the world.
Threat level: Now, as robotaxis hit the road with remote agents monitoring them in the background, she's seeing many of the same mistakes the U.S. military made in the early days of UAVs, she writes in the engineering magazine IEEE Spectrum.
Cummings identifies five major challenges the military encountered in drone operations that hold lessons for self-driving cars.
Latency: Delayed communication makes remote vehicle control difficult, the military learned.
- Self-driving car companies rely on sometimes spotty cellphone networks to deliver commands, which could lead to time lags.
- In one incident, a Waymo operator instructed a car to turn left when a traffic light appeared yellow in the remote video feed, but had already turned red in the real world.
- Basing Waymo's remote assistants closer to its vehicles, rather than in the Philippines, would help, Cummings says.
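The danger of lag grows with speed. A back-of-the-envelope sketch makes the point — the speed and delay figures below are illustrative assumptions, not numbers from Cummings or Waymo:

```python
def distance_during_lag(speed_mph: float, round_trip_s: float) -> float:
    """Meters a car covers while a remote command is in transit.

    Illustrative only: real cellular round-trip times vary widely.
    """
    meters_per_second = speed_mph * 0.44704  # 1 mph = 0.44704 m/s
    return meters_per_second * round_trip_s

# A car at 30 mph with a 500 ms network round trip covers roughly
# 6.7 meters -- much of an intersection -- before any remote
# instruction can take effect.
print(round(distance_during_lag(30, 0.5), 1))
```

Even a half-second delay, in other words, is enough for a traffic light to change out from under a remote operator.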
Poor design: Confusing controls contributed to many human-error drone crashes in the military. In one case, remote operators inadvertently shut down an engine instead of launching a missile.
- Some companies use off-the-shelf video game controllers to operate autonomous shuttle buses. That can lead to mode confusion, which was a factor in the recent crash in Orlando, says Cummings.
Operator workload: Monitoring many military drones at once can be overwhelming. But when things are quiet, supervisors can get bored and become less alert.
- The same is true of robotaxi monitors, who can become overwhelmed in an emergency.
Training: The military changed its training program for drone operators once it better understood the knowledge and skills needed to remotely control UAVs.
- For self-driving cars, there are no regulations governing the qualifications for remote operators.
- Waymo says its remote assistance agents in the Philippines are required to have a driver's license, and they're retrained every six months.
Contingency planning: In the military, drones may fly themselves to safe areas or land autonomously if contact is lost.
- A recent power outage in San Francisco suggests robotaxis are not well prepared for emergencies.
The bottom line: Remote operations are more than a support feature for autonomous vehicles. They're a critical safety system.
