Illustration: Lazaro Gamio/Axios

As the science of brain-computer interfaces (BCI) and other neurotechnologies progresses, researchers are calling for ethical guidelines to be established now — before the technology fully matures.

Why it matters: We’re still far away from technologies that fully access and even read the human brain, but the sheer power of such tools — and the highly personal data they could gather — means society needs to decide what they should be allowed to do before they actually can do it.

What’s happening: Columbia University’s NeuroRights Initiative held a symposium today in conjunction with IBM on the scientific, security and social issues raised by neurotech.

  • Today scientists are able to read and write information in the brains of animals, and they’ve developed interfaces that allow humans to move computer cursors and more with only their thoughts.

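For readers curious what that cursor control involves computationally: the core step is decoding patterns of neural activity into movement commands. The snippet below is a purely illustrative sketch, not Neuralink's, Columbia's, or any lab's actual method; it fits a simple linear decoder to synthetic firing rates, and every variable, number and data point in it is an assumption.

import numpy as np

rng = np.random.default_rng(0)

n_neurons, n_samples = 50, 2000
tuning = rng.normal(size=(n_neurons, 2))                    # each simulated neuron's sensitivity to (vx, vy)
velocity = rng.normal(size=(n_samples, 2))                  # intended 2D cursor velocities (training data)
rates = velocity @ tuning.T + 0.5 * rng.normal(size=(n_samples, n_neurons))  # noisy firing rates

# Fit a ridge-regularized linear decoder W so that velocity ≈ rates @ W.
lam = 1.0
W = np.linalg.solve(rates.T @ rates + lam * np.eye(n_neurons), rates.T @ velocity)

# Decode a fresh burst of activity into a cursor-velocity command.
intended = np.array([[1.0, -0.5]])                          # a hypothetical intended movement
new_rates = intended @ tuning.T + 0.5 * rng.normal(size=(1, n_neurons))
print("decoded cursor velocity:", new_rates @ W)            # should land near [1.0, -0.5]

Real systems face much harder problems (spike sorting, drifting signals, closed-loop calibration), which is part of why the clinical promise and the ethical stakes discussed below are both so large.
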
The big picture: In the future, BCIs could provide an unprecedented view of the human brain at work, which in turn could unlock new clinical insights into largely untreatable mental and neurological diseases, as well as change how humans interface with the world.

  • “In 10 to 15 years, we could have something in our heads [that's] like an iPhone in our pockets,” Rafael Yuste, the director of Columbia’s NeuroTechnology Center, said at the symposium.

What they’re saying: The ethical issues raised by that power were the focus of IBM director of research Darío Gil’s symposium remarks, which touched on first-generation ethical principles for neurotech developed by the company.

  • “As the power of technology continues to increase, the governance of technology needs to go along with it,” Gil told Axios before the symposium.
  • “Every player that develops and creates technology that is at the cutting edge has a responsibility because the purpose of technologies is as a tool to help society.”

Details: Many of the ethical issues created by BCI — questions of transparency and fairness — resemble those raised by AI or even social media, only intensified.

  • It’s one thing for tech companies to track what we click on and what we watch, but data generated by the nervous system can be produced unconsciously, which could fatally undermine principles of consent and privacy.
  • And neurotechnology could go beyond reading the brain to effectively coding it, feeding it data that could influence thoughts and behaviors, which brings into question core concepts around free will.

To that end, Gil says IBM is committed to respecting mental privacy and autonomy, being transparent in its neurotech work, and ensuring that people have an equal opportunity to choose whether or not they want to use the technology.

  • The role of government isn’t yet clear, but Gil foresees something for neurotech like the White House Council on Bioethics, which in the past debated policies on stem cells, genetic engineering and more.

The catch: Scientific codes of ethics may not mean much to notoriously independent players like Elon Musk, who has promised that the BCI technology developed by his company Neuralink could eventually allow “AI symbiosis,” as he said at an event in August.

The bottom line: BCI could be a “revolution for humanity,” as Yuste put it. But revolutions have a way of getting out of hand.
