AI advances at detecting cancer — but it can't see you now
- Alison Snyder, author of Axios Science

Illustration: Rebecca Zisser / Axios
Medicine is poised to be one place where AI makes a mark. In a study published this week, researchers report that a machine-learning algorithm was as good as, or better than, pathologists at detecting the spread of a type of breast cancer.
For all the talk about the promise of AI radically changing medicine, this is one of the first peer-reviewed studies to back claims that algorithms can detect abnormalities in pathology slides, says Eric Topol from the Scripps Research Institute.
The bottom line: Radiologists and pathologists are likely to be the first in medicine affected by AI. But researchers working on the technologies see them aiding doctors rather than replacing them. And even that role will require more data about the technology's impact on the medical profession and about whether AIs are accurate enough to diagnose patients.
“It is the early days,” Aidoc CEO Elad Walach says. “There’s not enough research at this point. Deep learning has been commoditized generally but it hasn’t been commoditized for the medical domain. The algorithms out there aren’t good enough as is. We need a lot of R&D to make AI work in this space. It is not just plug and play.”
What’s new: Babak Ehteshami Bejnordi and his colleagues from Radboud University Medical Center in the Netherlands evaluated algorithms submitted in a competition to analyze tissue samples from the lymph nodes of breast cancer patients. (Cancer cells are most likely to spread to these nearby areas first, so the samples are key to determining a patient's prognosis.) They then compared the accuracy of the AI diagnoses with those of pathologists in two situations, checking both against a gold-standard test:
- A panel of 11 pathologists had two hours to review 129 digitized images of samples from patients who had already received a diagnosis from a pathologist.
- One pathologist was given unlimited time to review all the cases. (The expert took 30 hours.)
The result: The top seven algorithms (all deep learning methods, a class that has lately driven progress in image and pattern recognition) performed better than the panel of pathologists at identifying the metastases, but were on par with the pathologist whose time wasn’t restricted.
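For context on how such a head-to-head comparison is typically scored: each method assigns every slide a probability that it contains metastases, and the ranking of those scores against the gold-standard labels is summarized as the area under the ROC curve (AUC), which captures accuracy across all decision thresholds. The labels and scores below are invented for illustration, not the study's data; a minimal sketch:

```python
# Illustrative sketch: scoring a slide-level metastasis classifier
# against gold-standard labels with ROC AUC, the kind of metric used
# to compare algorithms and pathologists across all thresholds.
# Labels and scores are made up for demonstration.
from sklearn.metrics import roc_auc_score

# Gold-standard labels: 1 = metastases present, 0 = absent
y_true = [1, 0, 1, 1, 0, 0, 1, 0]

# One algorithm's predicted probability of metastases per slide
y_score = [0.92, 0.10, 0.75, 0.64, 0.70, 0.22, 0.88, 0.05]

auc = roc_auc_score(y_true, y_score)
print(f"AUC: {auc:.3f}")  # 1.0 = perfect ranking, 0.5 = chance
```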
Keep in mind: The time limits imposed on (or lifted from) the pathologists in the study don't reflect the reality in which they practice. And the AIs detected just one type of breast cancer. "We need to see it borne out across lots of other pathologies not just lymph nodes for breast cancers," says Topol. "This is the most impressive paper yet. But there are limitations. This is done in silico and is not a real world validation."
More opinions
PathAI: Andy Beck, whose team won the AI competition in the new study and who is now CEO of PathAI, says AI’s arrival to pathology will be a transformation rather than a disruption.
Seeing it as the latter “betrays a lack of understanding of how these fields operate. There are so many things physicians do. Typically an AI does one specific thing very well. We aren’t even close to doing the whole breadth of what a physician does.”
Aidoc: This Israeli startup is developing technology that can detect visual abnormalities (a cancer, a stroke, bleeding or an edema) in head and neck CT scans. Their focus right now, says Walach, is on optimizing radiologists’ workflow: instead of the current practice of reviewing cases in the order they're received, the AI flags urgent ones to be read first, as sketched below.
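In data-structure terms, the change Walach describes amounts to swapping a first-in, first-out queue for a priority queue keyed on the model's urgency score. The sketch below is a hypothetical illustration of that workflow change, not Aidoc's system; all case IDs and scores are invented:

```python
# Hypothetical sketch of the workflow change Walach describes:
# instead of reading scans in arrival order (FIFO), an AI urgency
# score moves suspected-abnormal cases to the front of the queue.
# Case IDs and scores are invented for illustration.
import heapq

# (arrival_order, case_id, ai_urgency_score); higher score = more urgent
cases = [
    (1, "CT-1001", 0.12),
    (2, "CT-1002", 0.97),  # AI flags a likely bleed
    (3, "CT-1003", 0.40),
    (4, "CT-1004", 0.88),
]

# Current practice: review in the order received
fifo = [case_id for _, case_id, _ in sorted(cases)]

# AI-assisted practice: max-heap on urgency (scores negated for
# Python's min-heap), so flagged cases are read first
heap = [(-score, order, case_id) for order, case_id, score in cases]
heapq.heapify(heap)
prioritized = [heapq.heappop(heap)[2] for _ in range(len(cases))]

print("FIFO:       ", fifo)         # CT-1001 read first
print("Prioritized:", prioritized)  # CT-1002 (flagged) read first
```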
They’re currently testing the technology at five U.S. sites. Early, unpublished results at one hospital found that the AI could spot an abnormal scan with 98% sensitivity compared to what clinicians call the "ground truth" (in this study, the diagnosis by three radiologists working without AI), says Walach. The company aims to publish its findings soon.
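Sensitivity here means the fraction of truly abnormal scans (per the three-radiologist ground truth) that the AI flagged. The counts below are invented to show the arithmetic; they are not Aidoc's data:

```python
# Illustrative arithmetic for sensitivity against a "ground truth":
# sensitivity = true positives / (true positives + false negatives),
# i.e. the share of truly abnormal scans the AI actually flagged.
# Counts are invented; they are not from Aidoc's study.
true_positives = 98   # abnormal scans the AI flagged
false_negatives = 2   # abnormal scans the AI missed

sensitivity = true_positives / (true_positives + false_negatives)
print(f"Sensitivity: {sensitivity:.0%}")  # 98%
```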
Walach says, “There is a need for peer reviewed publications about the outcomes not just the accuracy of these systems, and leading companies should invest time and resources in publishing clinical evidence.”
A big challenge: As with other cancer tests, there is a risk of detecting, and then treating, a cancer that isn't there. That isn't unique to machines, but "algorithms are tuned to perform at maximum sensitivity, meaning there may be false positives," says Stanford University's Daniel Rubin, who develops imaging tools for radiology. “As we introduce these technologies, if people don’t improve accuracy and there are more false positives, it will increase the cost of health care.”
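The tuning Rubin describes comes down to a threshold choice: lowering the score at which a scan is called abnormal catches more true cancers but also flags more healthy patients. A toy sketch of that tradeoff, with all labels and scores invented:

```python
# Toy illustration of Rubin's tradeoff: tuning a model's decision
# threshold toward maximum sensitivity also raises false positives.
# All labels and scores are invented for demonstration.
def confusion(y_true, y_score, threshold):
    tp = sum(1 for t, s in zip(y_true, y_score) if t == 1 and s >= threshold)
    fn = sum(1 for t, s in zip(y_true, y_score) if t == 1 and s < threshold)
    fp = sum(1 for t, s in zip(y_true, y_score) if t == 0 and s >= threshold)
    return tp / (tp + fn), fp  # (sensitivity, false positives)

y_true  = [1, 1, 1, 0, 0, 0, 0, 0]          # 1 = cancer present
y_score = [0.9, 0.7, 0.3, 0.6, 0.4, 0.2, 0.1, 0.05]

for thr in (0.5, 0.25):
    sens, fp = confusion(y_true, y_score, thr)
    print(f"threshold={thr}: sensitivity={sens:.0%}, false positives={fp}")

# Lowering the threshold from 0.5 to 0.25 catches the missed cancer
# (sensitivity 67% -> 100%) but doubles the false positives (1 -> 2).
```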
Go deeper: We asked four medical experts whether AI might help their profession