
Illustration: Lazaro Gamio/Axios

As the science of brain-computer interfaces (BCI) and other neurotechnologies progresses, researchers are calling for ethical guidelines to be established now — before the technology fully matures.

Why it matters: We’re still far away from technologies that fully access and even read the human brain, but the sheer power of such tools — and the highly personal data they could gather — means society needs to determine what they should do before they actually can do it.

What’s happening: Columbia University’s NeuroRights Initiative held a symposium today in conjunction with IBM on the scientific, security and social issues raised by neurotech.

  • Today scientists are able to read and write information in the brains of animals, and they’ve developed interfaces that allow humans to move computer cursors and more with only their thoughts.

The big picture: In the future, BCIs could provide an unprecedented view of the human brain at work, which in turn could unlock new clinical insights into largely untreatable mental and neurological diseases, as well as change how humans interface with the world.

  • “In 10 to 15 years, we could have something in our heads [that's] like an iPhone in our pockets,” Rafael Yuste, the director of Columbia’s NeuroTechnology Center, said at the symposium.

What they’re saying: The ethical issues raised by that power were the focus of IBM director of research Darío Gil’s symposium remarks, which touched on first-generation ethical principles for neurotech developed by the company.

  • “As the power of technology continues to increase, the governance of technology needs to go along with it,” Gil told Axios before the symposium.
  • “Every player that develops and creates technology that is at the cutting edge has a responsibility because the purpose of technologies is as a tool to help society.”

Details: Many of the ethical issues created by BCI — questions of transparency and fairness — resemble those raised by AI or even social media, only intensified.

  • It’s one thing for tech companies to track what we click on and what we watch, but data generated by the nervous system can be unconscious, which could fatally undermine principles of consent and privacy.
  • And neurotechnology could go beyond reading the brain to effectively coding it, feeding it data that could influence thoughts and behaviors, which brings into question core concepts around free will.

To that end, Gil says IBM is committed to respecting mental privacy and autonomy, being transparent in its neurotech work, and ensuring that people have an equal opportunity to choose whether or not they want to use the technology.

  • The role of government isn’t yet clear, but Gil foresees something for neurotech like the White House Council on Bioethics, which in the past debated policies on stem cells, genetic engineering and more.

The catch: Scientific codes of ethics may not mean much to notoriously independent players like Elon Musk, who has promised that the BCI technology developed by his company Neuralink could eventually allow “AI symbiosis,” as he said at an event in August.

The bottom line: BCI could be a “revolution for humanity,” as Yuste put it. But revolutions have a way of getting out of hand.

Go deeper

Bryan Walsh, author of Future
Nov 18, 2020 - Technology

The robo-job apocalypse is being delayed

Illustration: Eniola Odetunde/Axios

A sprawling new report makes the case that automation and AI won't lead to widespread job destruction anytime soon.

Why it matters: Technological advances in AI and automation will have an enormous impact on the workforce, but it may take decades for those effects to be fully felt. That gives business leaders and politicians a last chance to change labor and education policies that have left too many workers locked in low-quality, low-paying jobs.

Biden plans to ask public to wear masks for first 100 days in office

Joe Biden. Photo: Mark Makela/Getty Images

President-elect Joe Biden told CNN on Thursday that he plans to ask the American public to wear face masks for the first 100 days of his presidency.

The big picture: Biden also said he has asked NIAID director Anthony Fauci to stay on in his current role, serve as a chief medical adviser and be part of his COVID-19 response team when he takes office early next year.

What COVID-19 vaccine trials still need to do

Illustration: Sarah Grillo/Axios

COVID-19 vaccines are being developed at record speed, but some experts fear the accelerated regulatory process could interfere with ongoing research about the vaccines.

Why it matters: Even after the first COVID-19 vaccines are deployed, scientific questions will remain about how they are working and how to improve them.