Illustration: Sarah Grillo/Axios

Many health-related AI technologies today are biased because they're built on datasets composed largely of data from men and people of European descent.

Why it matters: An AI system trained to identify diseases, conditions and symptoms in people in these datasets could fail when presented with data from people with different characteristics.

Background: AI-powered disease detection technology is part of the health care AI market expected to exceed $34 billion by 2025.

  • Researchers recently demonstrated that AI used in breast cancer screenings correctly identified more cancers, reduced false positives and improved reading times.

What's happening: Most medical research tends to focus on men, and most publicly available genetic data comes from individuals of European descent. As AI is increasingly used in medicine, these gaps could lead to misdiagnoses based on a patient's gender, race and/or ethnicity.

  • While heart attacks strike men and women at roughly equal rates, they are more likely to be fatal in women, in part because gender-based differences in symptoms can delay care.
  • Similarly, AI medical technologies may misdiagnose a person who is not of European descent, since that person's symptoms and disease manifestations can differ.
  • Recent studies and mishaps have shown that our current data and programs that rely on AI, like search engines and image recognition software, are biased in ways that can cause harm.

What we're watching: Some steps are being taken to ensure that AI is evaluated for bias, including proposed legislation.

  • The National Institutes of Health launched a new program last year to expand diversity in medical research and data by soliciting volunteers from populations that are currently underrepresented.

Go deeper: Scientists call for rules on evaluating predictive AI in medicine

Miriam Vogel is the executive director of Equal AI, a professor at Georgetown Law and a former associate deputy attorney general at the Department of Justice.


Transcripts show George Floyd told police "I can't breathe" over 20 times

Photo: Gary Coronado/Los Angeles Times via Getty Images

Newly released transcripts of bodycam footage from the Minneapolis Police Department show that George Floyd told officers he could not breathe more than 20 times in the moments leading up to his death.

Why it matters: Floyd's killing sparked a national wave of Black Lives Matter protests and an ongoing reckoning over systemic racism in the United States. The transcripts "offer one of the most thorough and dramatic accounts" before Floyd's death, The New York Times writes.


Coronavirus dashboard

Illustration: Eniola Odetunde/Axios

  1. Global: Total confirmed cases as of 6 p.m. ET: 11,921,616 — Total deaths: 546,318 — Total recoveries: 6,506,408.
  2. U.S.: Total confirmed cases as of 6 p.m. ET: 3,035,231 — Total deaths: 132,042 — Total recoveries: 936,476 — Total tested: 36,878,106.
  3. Public health: Deaths are rising in hotspots — Déjà vu sets in as testing issues rise and PPE dwindles.
  4. Travel: United warns employees it may furlough 45% of U.S. workforce — How the pandemic changed mobility habits, by state.
  5. Education: New York City schools will not fully reopen in fall — Harvard and MIT sue Trump administration over rule barring foreign students from online classes.
  6. 🎧 Podcast: A misinformation "infodemic" is here.

Fighting the coronavirus infodemic

Illustration: Sarah Grillo/Axios

An "infodemic" of misinformation and disinformation has helped cripple the response to the novel coronavirus.

Why it matters: High-powered social media accelerates the spread of lies and political polarization that motivates people to believe them. Unless the public health sphere can effectively counter misinformation, not even an effective vaccine may be enough to end the pandemic.