
MRI scans pose a privacy threat to research participants

Illustration: Sarah Grillo/Axios

Technology has advanced to the point where research study participants can be identified by their MRI scans even after all other identifying information has been stripped, according to an experiment detailed yesterday in the New England Journal of Medicine and reported by the New York Times.

Why it matters: If stored medical data were leaked, it could potentially be used to identify study participants for marketing, scams or even stalking.

The big picture, per the WSJ: These "results are the latest to find technology has outflanked privacy protections in health care, where an aggressive push is under way to amass and mine medical data from patient medical records, research, medical devices and consumer technology such as smartwatches."

Details: An MRI includes a person's entire head, and imaging technology is advanced enough to create a reconstruction of the face from the scan.

  • That facial reconstruction can then be matched, in some circumstances, via facial recognition software to a photo of the person who received the scan, NYT reports.

Yes, but: The experiment, performed by researchers at the Mayo Clinic, included only 84 subjects, and some privacy experts question whether the process could be replicated in a larger population with current technology.

Our thought bubble: With ever more advanced AI, details about our bodies and behaviors — even data we’ve long forgotten we’ve shared — can come back to identify us.

Go deeper: Medical AI has a big data problem
