Sep 18, 2023 - Technology

New AI tools are helping doctors screen for mental health conditions

Illustration of a hand reaching out of darkness toward a hand in a white lab coat reaching out of a computer

Illustration: Sarah Grillo/Axios

The health system in Britain is now deploying AI-powered mental health tools in large-scale clinical settings, while U.S. health insurance companies are trialing them.

Why it matters: AI may be able to help health systems address an overload of patients in need of mental health care.

Driving the news: A diagnostic "e-triage" tool from Limbic, a British AI startup, has screened more than 210,000 patients with a claimed 93% accuracy across the eight most common mental disorders, including depression, anxiety and PTSD, co-founder Ross Harper, a computational neuroscientist, told Axios.

  • The U.K. National Health Service found that the Limbic Access chatbot arms its psychologists and psychiatrists with information to help them determine the severity and urgency of a potential patient's needs (an illustrative severity-banding sketch follows this list). Misdiagnosis is down, with 45% fewer treatment changes.
  • Kintsugi, an American startup that has raised more than $28 million from investors and the National Science Foundation, uses a different approach: its AI-powered voice analysis tool looks for signs of clinical depression and anxiety in short clips of speech.
  • Some clinical call centers, telehealth apps and remote patient monitoring systems have already integrated the tool. In a recently published case study Kintsugi conducted with a large U.S. insurer, 4 in 5 patients consented to be screened using the tool — far exceeding a pre-study estimate that only 1 in 4 would consent.
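
Limbic hasn't published its scoring logic, but the severity-and-urgency idea above can be illustrated with the PHQ-9, a standard nine-item depression questionnaire whose answers (each scored 0 to 3) are summed into severity bands. The Python sketch below is a toy illustration of that kind of banding, not Limbic's method; the function name and the risk-flag rule are assumptions made for the example.

```python
# Toy severity-banding example using the standard PHQ-9 depression questionnaire.
# This is NOT Limbic's algorithm; it only illustrates how questionnaire answers
# can be turned into the severity/urgency signal clinicians use for triage.

PHQ9_BANDS = [
    (0, 4, "minimal"),
    (5, 9, "mild"),
    (10, 14, "moderate"),
    (15, 19, "moderately severe"),
    (20, 27, "severe"),
]

def triage_phq9(item_scores: list[int]) -> dict:
    """item_scores: nine answers, each 0-3 (standard PHQ-9 scoring)."""
    if len(item_scores) != 9 or any(s not in range(4) for s in item_scores):
        raise ValueError("PHQ-9 expects nine items scored 0-3")
    total = sum(item_scores)
    band = next(label for lo, hi, label in PHQ9_BANDS if lo <= total <= hi)
    # Item 9 asks about thoughts of self-harm; a non-zero answer is commonly
    # treated as a flag for immediate human follow-up.
    return {"total": total, "band": band, "risk_flag": item_scores[8] > 0}

print(triage_phq9([2, 1, 3, 2, 1, 0, 2, 1, 0]))
# {'total': 12, 'band': 'moderate', 'risk_flag': False}
```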

The big picture: Mental health care is notoriously both complicated and underfunded. General practitioners correctly diagnose depression in only around 50% of cases, and the supply of mental health professionals cannot keep up with demand.

Yes, but: Because health care and AI hallucinations aren't a good mix, mental health clinicians have often been hesitant to deploy AI and unsure of how to introduce the concept to patients without overwhelming them.

What's happening: Limbic Access is approved in Britain as the equivalent of a Class II medical device — which the FDA classifies as medium-risk (examples include syringes and electric wheelchairs) — and the company is now looking to expand in the U.S.

  • Limbic Access saves most clinicians around 40 minutes per in-depth clinical assessment, freeing them to screen more patients and cut waitlists, Harper told Axios.
  • "This isn't just another app that promises you cognitive behavioral therapy tips. It's patients with a diagnosis or getting a diagnosis using Limbic as part of their care," Harper said.
  • Harper says that makes Limbic the only mental health chatbot "allowed to take on some level of clinical responsibility." That's critical, he said, because the work of human clinicians will only scale to meet demand once part of the process can be automated.

The intrigue: Right now, text-based and voice-based initial assessments of a person's mental health offer competing ways to use AI to scale mental health care services. A multi-modal system incorporating both has yet to emerge.

  • Grace Chang, Kintsugi's founder and CEO, told Axios, "It's not what somebody says, it's how they're saying it that really matters."
  • Kintsugi's system draws on voice journals recorded by 250,000 people to identify "voice biomarkers" (a simplified sketch of that kind of pipeline follows this list).
  • Limbic considered but ultimately rejected speech analysis because of the possible barriers it could create for people seeking treatment — from the hassle of enabling microphones to concerns over recording their sensitive thoughts.
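
Kintsugi hasn't detailed its model, but the generic "voice biomarker" recipe is to convert a short speech clip into acoustic features and train a classifier on labeled recordings. Below is a minimal sketch of that kind of pipeline, assuming the open-source librosa and scikit-learn libraries and hypothetical clip_paths and labels inputs; it illustrates the general approach, not Kintsugi's actual system.

```python
# Minimal voice-biomarker-style pipeline: acoustic features -> classifier.
# Hypothetical data: clip_paths is a list of WAV files; labels marks whether
# each speaker screened positive for depression on some reference measure.
import numpy as np
import librosa
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

def clip_features(path: str) -> np.ndarray:
    """Summarize a short speech clip as a fixed-length acoustic feature vector."""
    y, sr = librosa.load(path, sr=16000)                 # mono audio at 16 kHz
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)   # timbre-related coefficients
    # Mean and variability of each coefficient over time as a crude prosody summary.
    return np.concatenate([mfcc.mean(axis=1), mfcc.std(axis=1)])

def train_screener(clip_paths: list[str], labels: list[int]) -> LogisticRegression:
    X = np.stack([clip_features(p) for p in clip_paths])
    X_train, X_test, y_train, y_test = train_test_split(X, labels, test_size=0.2)
    model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
    print("held-out accuracy:", model.score(X_test, y_test))
    return model
```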

What they're saying: Personal experience motivated Kintsugi co-founders Chang and Rima Seiilova-Olson to start their company; each of them struggled for months to secure their first therapy appointment.

  • "We saw this infrastructure issue where you just have so many people trying to jam through that front door, but not a lot of visibility as to who is severely depressed and who is low to moderate," Chang said.
  • Harper saw similar issues in England: "There are not enough trained mental health professionals on the planet to serve the astronomical disease prevalence," he said.