Nov 1, 2017

Researchers go after the biggest problem with self-driving cars

Illustration: Lazaro Gamio / Axios

The biggest difficulty with self-driving cars is not batteries, fearful drivers, or expensive sensors, but what's known as the "trolley problem": the debate over who should die and who should be saved if an autonomously driven vehicle ever faces such a horrible choice on the road. And short of that, how will robotic vehicles navigate the countless other ethical decisions, small and large, that drivers execute as a matter of course?

In a paper, researchers at Carnegie Mellon and MIT propose a model that uses artificial intelligence and crowdsourcing to automate ethical decisions in self-driving cars. "In an emergency, how do you prioritize?" Ariel Procaccia, a professor at Carnegie Mellon, tells Axios.

The bottom line: The CMU-MIT model is only a prototype at this stage. But it or something like it will have to be mastered if fully autonomous cars are to become a reality.

"We are not saying that the system is ready for deployment. But it is a proof of concept, showing that democracy can help address the grand challenge of ethical decision making in AI," Procaccia said.

How they created the system: Procaccia's team used data from an MIT platform called the Moral Machine, in which 1.3 million people each voted on around 13 difficult, either-or choices in trolley-like driving scenarios. In all, participants provided 18.2 million answers. The researchers used artificial intelligence to teach their system the preferences of each voter, then aggregated them, creating a "distribution of societal preferences," in effect the rules of ethical behavior in a car. The researchers could then ask the system any driving question that came to mind; it was as though they were asking the original 1.3 million participants to vote again.
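The paper's actual learning and aggregation machinery is more involved, but the pipeline can be illustrated with a simplified sketch: encode each scenario as a feature vector, fit a preference model per voter from that voter's pairwise answers, and summarize the fitted models into a societal distribution. The feature count, the logistic model form, and the fitting procedure below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

# Illustrative sketch only -- the scenario features, the logistic preference
# model, and the fitting procedure are assumptions, not the paper's method.

N_FEATURES = 6  # hypothetical scenario features (e.g., pedestrians, passengers, legality)

def fit_voter_model(answers, lr=0.1, epochs=100):
    """Fit one voter's preference weights from pairwise answers.

    answers: list of (features_a, features_b, choice), where choice is 1
    if the voter chose to spare outcome A over outcome B, else 0.
    """
    w = np.zeros(N_FEATURES)
    for _ in range(epochs):
        for fa, fb, choice in answers:
            diff = fa - fb
            p = 1.0 / (1.0 + np.exp(-w @ diff))  # modeled P(voter prefers A)
            w += lr * (choice - p) * diff        # logistic gradient step
    return w

def societal_distribution(voter_models):
    """Summarize all fitted voter models as a mean and spread over weights."""
    W = np.stack(voter_models)
    return W.mean(axis=0), W.std(axis=0)
```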

A robot election: "When the system encounters a dilemma, it essentially holds an election, by deducing the votes of the 1.3 million voters, and applying a voting rule," Procaccia said. He added: "This allows us to give the following strong guarantee: the decision the system takes is likely to be the same as if we could go to each of the 1.3 million voters, ask for their opinions, and then aggregate their opinions into a choice that satisfies mathematical notions of social justice."
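Continuing the sketch above, the election step can be illustrated as follows: given a new dilemma, predict each modeled voter's choice and apply a voting rule. Plain plurality is used here as a stand-in; the paper's actual voting rule and guarantees are more sophisticated, and the random stand-in voter models are purely for demonstration.

```python
import numpy as np

def hold_election(voter_models, features_a, features_b):
    """Deduce each modeled voter's vote on a new dilemma and apply
    plurality as the voting rule (a stand-in for the paper's rule)."""
    votes_for_a = sum(1 for w in voter_models
                      if w @ features_a > w @ features_b)
    return "A" if votes_for_a * 2 > len(voter_models) else "B"

# Example with random stand-in voter models (in practice these would be the
# weight vectors fitted from the Moral Machine answers, as sketched earlier).
rng = np.random.default_rng(0)
models = [rng.normal(size=6) for _ in range(1000)]
print(hold_election(models, np.ones(6), np.zeros(6)))
```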
