Illustration: Aïda Amer/Axios

Brain–computer interfaces, once used exclusively for clinical research, are now under development at several wealthy startups and a major tech company, and rudimentary versions are already popping up in online stores.

Why it matters: If users unlock the information inside their heads and give companies and governments access, they're inviting privacy risks far greater than today's worries over social media data, experts say — and raising the specter of discrimination based on what goes on inside a person's head.

What's happening: Machines that read brain activity from outside the head, or in some cases from inside the skull, are still relatively limited in how much data they can extract from wearers' brains and in how accurately they can interpret it.

  • But the tech is moving fast. Basic emotional states, unspoken words and imagined movements can now be recognized by analyzing neural data (a toy decoding sketch follows this list).
  • Researchers have found similarities in the way different people's brains process information, such that they can make rough guesses at what someone is thinking about or doing based on brain activity.
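
To make those bullets concrete, here is a minimal, hypothetical sketch of what this kind of decoding looks like in code: a linear classifier trained on simulated EEG band-power features to tell imagined left-hand from right-hand movement. The data is synthetic, and the feature and model choices (scikit-learn's LogisticRegression on 16 channels of mu-band power) are illustrative assumptions, not a description of any particular lab's or company's pipeline.

    # Hypothetical sketch: decoding an imagined movement from simulated EEG features.
    # All data here is synthetic; real systems train on recorded, labeled brain activity.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)

    # 200 trials x 16 channels of mu-band (8-12 Hz) power, with labels:
    # 0 = imagined left-hand movement, 1 = imagined right-hand movement.
    n_trials, n_channels = 200, 16
    labels = rng.integers(0, 2, size=n_trials)
    features = rng.normal(size=(n_trials, n_channels))
    # Inject a weak class-dependent shift to mimic a real motor-imagery contrast.
    features[:, :4] += 0.8 * labels[:, None]

    # A simple linear decoder; deployed systems range from this to deep networks.
    decoder = make_pipeline(StandardScaler(), LogisticRegression())
    accuracy = cross_val_score(decoder, features, labels, cv=5).mean()
    print(f"Cross-validated decoding accuracy: {accuracy:.2f}")

The point is not the specific model but how little code separates a stream of neural signals from a label like "imagined movement" once the data exists, which is exactly the kind of inference the ethicists quoted below are worried about.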

"These issues are fundamental to humanity because we're discussing what type of human being we want to be," says Rafael Yuste, a neuroscientist at Columbia.

The big picture: Clinical brain–computer interfaces can help people regain control of their limbs or operate prosthetics. Basic headsets are being sold as relaxation tools or entertainment gadgets — some built on flimsy claims — and market researchers are using the devices to fine-tune advertising pitches.

  • Facebook and startups like Elon Musk's Neuralink are pouring money into a new wave of neurotechnology with bold promises, like typing with your thoughts or, in Musk's words, merging with AI.
  • All of these devices generate huge amounts of neural data, potentially one of the most sensitive forms of personal information.

Driving the news: Neuroethicists are sounding the alarm.

  • Earlier this month the U.K.'s Royal Society published a landmark report on the promise and risk of neurotechnology, predicting a "neural revolution" in the coming decades.
  • And next month Chilean lawmakers will propose an amendment to the country's constitution enshrining protections for neural data as a fundamental human right, according to Yuste, who is advising on the process.

A major concern is that brain data could be commercialized, the way advertisers are already using less intimate information about people's preferences, habits and location. Adding neural data to the mix could supercharge the privacy threat.

  • "Accessing data directly from the brain would be a paradigm shift because of the level of intimacy and sensitivity of the information," says Anastasia Greenberg, a neuroscientist with a law degree.
  • If Facebook, for example, were to pair neural data with its vast trove of personal data, it could create "way more accurate and comprehensive psychographic profiles," says Marcello Ienca, a health ethics researcher at ETH Zurich.
  • There's little to prevent companies from selling and trading brain data in the U.S., Greenberg found in a recent peer-reviewed study.

Neural data, more than other kinds of personal information, has the potential to reveal insights about a brain that even its owner may not know.

  • This is the explicit promise of "neuromarketing," a branch of market research that uses brain scans to attempt to understand consumers better than they understand themselves.
  • Ethicists worry that information hidden inside a brain could be used to discriminate against people — for example, if they showed patterns of brain activity that were similar to patterns seen in people with propensities for addiction, depression or neurological disease.

"The sort of future we're looking ahead toward is a world where our neural data — which we don't even have access to — could be used" against us, says Tim Brown, a researcher at the University of Washington Center for Neurotechnology.

Go deeper: Advertisers want to mine your brain

Editor's note: This story has been updated to clarify Marcello Ienca's quote.
