Sep 21, 2019

Brains are the last frontier of privacy

Illustration: Aïda Amer/Axios

Brain–computer interfaces, once used exclusively for clinical research, are now under development at several wealthy startups and a major tech company, and rudimentary versions are already popping up in online stores.

Why it matters: If users unlock the information inside their heads and give companies and governments access, they're inviting privacy risks far greater than today's worries over social media data, experts say — and raising the specter of discrimination based on what goes on inside a person's head.

What's happening: Machines that read brain activity from outside the head, or in some cases from inside the skull, are still relatively limited in the data they can extract from wearers' brains and in how accurately they can interpret it.

  • But the tech is moving fast. Researchers can now recognize basic emotional states, unspoken words and imagined movements — all by analyzing neural data.
  • Researchers have found similarities in the way different people's brains process information, such that they can make rough guesses at what someone is thinking about or doing based on brain activity.

"These issues are fundamental to humanity because we're discussing what type of human being we want to be," says Rafael Yuste, a neuroscientist at Columbia.

The big picture: Clinical brain–computer interfaces can help people regain control of their limbs or operate prosthetics. Basic headsets are being sold as relaxation tools or entertainment gadgets — some built on flimsy claims — and market researchers are using the devices to fine-tune advertising pitches.

  • Facebook and startups like Elon Musk's Neuralink are pouring money into a new wave of neurotechnology with bold promises, like typing with your thoughts or, in Musk's words, merging with AI.
  • All of these devices generate huge amounts of neural data, potentially one of the most sensitive forms of personal information.

Driving the news: Neuroethicists are sounding the alarm.

  • Earlier this month the U.K.'s Royal Society published a landmark report on the promise and risk of neurotechnology, predicting a "neural revolution" in the coming decades.
  • And next month Chilean lawmakers will propose an amendment to the country's constitution enshrining protections for neural data as a fundamental human right, according to Yuste, who is advising on the process.

A major concern is that brain data could be commercialized, the way advertisers are already using less intimate information about people's preferences, habits and location. Adding neural data to the mix could supercharge the privacy threat.

  • "Accessing data directly from the brain would be a paradigm shift because of the level of intimacy and sensitivity of the information," says Anastasia Greenberg, a neuroscientist with a law degree.
  • If Facebook, for example, were to pair neural data with its vast trove of personal data, it could create “way more accurate and comprehensive psychographic profiles,” says Marcello Ienca, a health ethics researcher at ETH Zurich.
  • There's little to prevent companies from selling and trading brain data in the U.S., Greenberg found in a recent peer-reviewed study.

Neural data, more than other personal information, has the potential to reveal insights about a brain that even that brain's owner may not be aware of.

  • This is the explicit promise of "neuromarketing," a branch of market research that uses brain scans to attempt to understand consumers better than they understand themselves.
  • Ethicists worry that information hidden inside a brain could be used to discriminate against people — for example, if they showed patterns of brain activity that were similar to patterns seen in people with propensities for addiction, depression or neurological disease.

"The sort of future we're looking ahead toward is a world where our neural data — which we don't even have access to — could be used" against us, says Tim Brown, a researcher at the University of Washington Center for Neurotechnology.

Go deeper: Advertisers want to mine your brain

Editor's note: This story has been updated to clarify Marcello Ienca's quote.

Go deeper

Facebook to buy a leading brain interface startup

Facebook's Oculus VR headset. Photo: Amy Osborne / AFP / Getty

Facebook announced Monday that it will buy CTRL-Labs, a startup developing an arm-worn device that reads neural signals. The companies did not disclose the deal's price tag, but reports put it between $500 million and $1 billion.

Why it matters: Facebook has been developing its own brain–machine interface for several years, but this is a major acquisition that could propel its technology quickly forward — and in a way that's potentially less invasive of users' privacy.

Sep 24, 2019

The future of privacy starts in California

Illustration: Sarah Grillo/Axios

A landmark privacy law in California, which kicks in Jan. 1, will give Golden State residents the right to find out what a company knows about them and get it deleted — and to stop the company from selling it.

Why it matters: It could effectively become a national privacy law, since companies that are racing to comply with it may give these privileges to non-Californians, too.

Sep 30, 2019

What College Board knows about you

Illustration: Rebecca Zisser/Axios

If you've taken a college admissions test in the last few years, your personal information may have been used to decide which colleges can recruit you.

Why it matters: Universities and other educational organizations are buying high schoolers' personal data from SAT administrator College Board to target and recruit future students. More than 3 million students in 2018 gave up their personal information in the process of taking the SAT, ACT and PSAT, the New York Times reports.

Oct 5, 2019