AI segments a 3D eye scan into sections representing different types of tissue. Animation: DeepMind

Doctors at a U.K. eye hospital are getting algorithmic help interpreting the results of 3D eye scans, using a system developed at Google's DeepMind that can identify more than 50 eye problems and recommend a course of action with human expert-level accuracy.

Why it matters: DeepMind's system shows an intermediate step in its work, and tells doctors how confident it is in its assessment. This is crucial, because AI systems are often too opaque to be able to explain their reasoning, making them risky to deploy in high-stakes environments like hospitals.

The big picture: Black-box algorithms make it difficult to check results for accuracy, and their ambiguous reasoning can decrease trust in their recommendations.

  • DeepMind’s two-step approach, published Monday in the journal Nature Medicine, tries to mitigate those problems.
  • The system has been tested for two years at Moorfields Eye Hospital in London, and still needs to undergo clinical trials before it can be implemented more broadly.

How it works: The system reads optical coherence tomography (OCT) scans, which are 3D representations of the back of a patient's eye.

  • In the first step, a neural network segments the scan, which is difficult for humans to read in its raw form, into colored areas that represent different types of tissue.
  • Then, a second, separate classification network analyzes the segmentation map and identifies signs of disease.
  • This second step offers clinicians the most likely diagnosis and recommends a course of action.
  • The results are paired with a percentage that shows the system’s confidence in a diagnosis or recommendation.
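The two-step data flow described above can be sketched in a few lines. This is a hypothetical toy stand-in, not DeepMind's implementation: the real system runs deep neural networks on 3D OCT volumes, while the functions below use made-up threshold rules purely to illustrate the pipeline shape (raw scan → tissue map → recommendation with confidence).

```python
def segment(raw_scan):
    # Step 1: label each voxel with a tissue type, producing the
    # color-coded segmentation map that clinicians can inspect.
    # Toy rule: bucket intensity into three invented tissue classes.
    return [["fluid" if v < 0.3 else "retina" if v < 0.7 else "vitreous"
             for v in row] for row in raw_scan]

def classify(seg_map):
    # Step 2: read only the segmentation map (not the raw scan) and
    # return (recommendation, confidence). Toy rule: a high fraction
    # of fluid voxels suggests an urgent referral.
    voxels = [v for row in seg_map for v in row]
    fluid = voxels.count("fluid") / len(voxels)
    return ("urgent referral", fluid) if fluid > 0.2 else ("observation", 1 - fluid)

scan = [[0.1, 0.2, 0.8],
        [0.25, 0.5, 0.9]]
recommendation, confidence = classify(segment(scan))
```

The key structural point, which matters again later for retraining, is that the classifier never sees the raw scan, only the intermediate segmentation map.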
"One of the reasons we're putting so much effort into explainability and interpretation is that we desperately want to build trust with nurses and doctors."
— Mustafa Suleyman, DeepMind’s co-founder

The system’s 5.5% error rate matches or exceeds the accuracy of human eye experts, the DeepMind and University College London researchers wrote in the paper.

Since OCT scans can be ambiguous — different eye doctors will often interpret them differently — the DeepMind system’s recommendation is the result of not one analysis but a combination of 25 of them.

  • DeepMind uses five slightly different segmentation networks to create five eye diagrams. Then, it runs five slightly different classification networks on each of the five segmentation maps, resulting in 25 interpretations.
  • The confidence percentages displayed to clinicians show the results of these iterations. If nearly all the analyses indicated that a patient has choroidal neovascularization, that diagnosis would be highlighted with high confidence in the final result.
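The 5 × 5 ensemble and its agreement-based confidence can be sketched as follows. The network functions here are hypothetical placeholders; only the counting logic mirrors the scheme described above (every classifier is run on every segmentation map, and confidence is the fraction of the 25 analyses that agree).

```python
from itertools import product

def ensemble_diagnose(scan, seg_nets, cls_nets):
    # Run every classification network on the output of every
    # segmentation network: len(seg_nets) * len(cls_nets) analyses
    # (5 x 5 = 25 in the paper's setup).
    votes = [cls(seg(scan)) for seg, cls in product(seg_nets, cls_nets)]
    # Confidence for each candidate diagnosis = fraction of analyses
    # that agree on it; report the diagnosis with the most votes.
    tally = {d: votes.count(d) / len(votes) for d in set(votes)}
    return max(tally.items(), key=lambda kv: kv[1])

# Toy demo: 5 identical segmenters; 4 of the 5 classifiers vote one way,
# so 20 of the 25 analyses agree and the confidence comes out at 80%.
seg_nets = [lambda s: s] * 5
cls_nets = [lambda m: "choroidal neovascularization"] * 4 + [lambda m: "normal"]
diagnosis, confidence = ensemble_diagnose("scan", seg_nets, cls_nets)
```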

The two-step process also makes it easier to retrain the system for different scanning equipment, allowing it to work on new, state-of-the-art scanners soon after they come out, as well as on older scanners that might not be as accurate.

  • Without the intermediate step, the system would need to see tens or even hundreds of thousands of scans from a new piece of equipment in order to learn to interpret them correctly, DeepMind Health research lead Trevor Back told Axios.
  • But DeepMind’s system needs fewer than 200 scans to train the segmentation network, which provides the color-coded map used for diagnoses and recommendations, for new equipment.
  • The DeepMind team focused on the system’s generalizability, Back said, in an effort to create a tool that will actually be useful in eye clinics and hospitals.
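The device-transfer property the bullets above describe falls directly out of the two-stage structure: only the device-specific segmenter needs retraining, while the classifier, trained on segmentation maps rather than raw scans, is reused unchanged. A minimal hypothetical sketch of that design choice:

```python
class TwoStagePipeline:
    """Toy illustration of why the intermediate step aids transfer:
    the segmenter is retrained per scanner (fewer than 200 scans,
    per the article), while the classifier is shared across devices."""

    def __init__(self, segmenter, classifier):
        self.segmenter = segmenter    # device-specific front end
        self.classifier = classifier  # shared across all scanners

    def adapt_to_device(self, new_segmenter):
        # Swap only the front end; keep the classifier as-is.
        return TwoStagePipeline(new_segmenter, self.classifier)

    def diagnose(self, scan):
        return self.classifier(self.segmenter(scan))

# Toy demo with stand-in functions for the two networks.
old = TwoStagePipeline(lambda s: s.upper(), lambda m: f"dx({m})")
new = old.adapt_to_device(lambda s: s.lower())
```

Without the shared intermediate representation, adapting to a new scanner would mean retraining the whole end-to-end model, which is where the tens or hundreds of thousands of scans would come in.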

What’s next: Since OCT scans are 3D, the technology DeepMind developed to analyze them could be useful for other types of 3D medical imaging, like CT scans, Suleyman said.

  • "3D imaging is one of the harder modalities to work on," he told Axios. "We want to learn as much as possible about the way our algorithms work in order to use them in other areas of radiology."
  • Suleyman said this work could also help advance fundamental AI research into image and video understanding, beyond hospital uses.

Go deeper: Read a DeepMind blog post about the new research, or the Nature Medicine paper for more technical details.
