Illustration: Lazaro Gamio/Axios

The advancement of AI-fueled technologies like robotics and self-driving cars is creating a confusing legal landscape that leaves manufacturers, programmers and even robots themselves open to liability, according to legal scholars who study AI.

Why it matters: As autonomous vehicles take to the road and get into collisions, drivers, insurers and manufacturers want to know who — or what — is liable when a harmful mistake occurs. The degree of liability comes down to whether AI is treated as a product, service or a human decision-maker.

Case study: Some companies, including Volvo, Google and Mercedes, have already said they would accept full liability for their vehicles' actions when they are in autonomous mode.

Even without such a pledge, it's likely that manufacturers would end up paying if their autonomous cars caused harm. If the offending car were considered a defective product, its maker could be held liable under strict product-design standards, potentially leading to class-action lawsuits and expensive product recalls — like those Takata faced for its dangerous airbags.

  • But if a car's driving software were considered a service, it could be accused of negligence, the way a reckless human driver might be.
  • Treating human and AI drivers the same would "level the playing field" and prevent costly product-liability lawsuits, according to Nathan Greenblatt, an IP lawyer at Sidley Austin.
  • Yes, but: Exactly who counts as the manufacturer might not be immediately obvious, says John Kingston, a professor who studies AI and law at the University of Brighton. If a self-driving car kills a pedestrian, the car company might be considered the manufacturer, he says — but it could also be a subcontractor who wrote the software, or even a hardware supplier that produced a faulty camera.
  • Allianz, the insurance giant, predicted in a recent report that product liability insurance would someday become compulsory in order to protect drivers when they've put their cars in autonomous mode.

Another possibility: Going deeper into the system, the AI itself could be held responsible, according to Gabriel Hallevy, a law professor at Ono Academic College in Israel, who wrote a book about AI and criminal negligence. That still means its programmer or manufacturer could be found negligent as well, or even an accomplice to a crime.

  • But it's hard to punish AI if it's found guilty. The simplest sanction would be to decommission the offending robot or program. But there are more creative options: Hallevy suggested that AI found to have broken a law could be shut off for a period of time — the equivalent of a prison sentence — or even be required to perform community service, like cleaning the streets or helping out at the public library.

What's needed: New laws may be in order to deal with errant AI, says Kingston. Many laws, for example, hinge on whether a reasonable person would have acted a certain way. AI, clever as it may be in its own field, doesn't yet have the background knowledge or common sense it would need to emulate a reasonable person's decision-making.

What to expect: The first big AI liability case will likely cause a temporary chill in AI development, says Kingston, as company lawyers scramble to protect their employers. But in the long term, he says, clearer guidelines would be beneficial for AI research and development.
