Stories

There's now a self-help Facebook Messenger chatbot

If you're feeling down, a new chatbot could help. Woebot, which is integrated into Facebook Messenger and launched today, will deliver therapy by periodically checking in on users, asking them how they feel, and making suggestions, such as listening to relaxing music or easing up on negative self-talk, based on their responses.

The artificially intelligent chatbot can deliver personalized mental health care that makes people feel measurably better, according to a new study. "It's a nice, elegant, simple application that shows a glimmer of what might be to come," says Skip Rizzo, a psychologist at the University of Southern California.

Why it's needed: Access to therapy can be limited, both physically and financially, and stigma around mental health care persists. Woebot, like a crop of other apps and tech-based tools, attempts to deliver therapy, but until now the effectiveness of such tools hadn't been clinically studied.

How it works: Woebot uses the tools of cognitive behavioral therapy and relies on a decision tree that mirrors how therapists make decisions while speaking with patients. In the study, 34 college students who reported symptoms of depression and anxiety spent two weeks chatting with Woebot, while 36 people in a control group were directed to the National Institute of Mental Health's e-book on depression. After 14 days, those who had been conversing anonymously with Woebot reported reduced symptoms of depression, while the control group's symptoms remained the same.
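To make the decision-tree idea concrete, here is a minimal Python sketch of how such a chatbot might branch on a user's replies. Woebot's actual tree, prompts, and CBT content aren't public, so the node structure, wording, and keyword matching below are illustrative assumptions, not Woebot's implementation.

```python
# A minimal sketch of a decision-tree chatbot in the spirit of Woebot.
# The nodes, prompts, and keyword matching are illustrative assumptions;
# Woebot's actual tree and CBT content are not public.

from dataclasses import dataclass, field


@dataclass
class Node:
    prompt: str                                    # what the bot says at this step
    branches: dict = field(default_factory=dict)   # keyword -> next Node
    default: "Node | None" = None                  # fallback when nothing matches


# Leaf suggestions, loosely modeled on the article's examples
# (relaxing music, pushing back on negative self-talk).
music = Node("Try putting on some relaxing music and taking a few slow breaths.")
reframe = Node("That sounds like harsh self-talk. Can you restate it the way "
               "you'd talk to a friend?")
celebrate = Node("Great to hear! What's one thing that went well today?")

# Root: check in, then branch on the reported mood.
root = Node(
    "Hi! How are you feeling today?",
    branches={"sad": music, "anxious": music, "stressed": music,
              "worthless": reframe, "stupid": reframe,
              "good": celebrate, "great": celebrate},
    default=Node("Thanks for sharing. Tell me a bit more about that?"),
)


def respond(node: Node, user_text: str) -> Node:
    """Walk one edge of the tree based on keywords in the user's reply."""
    text = user_text.lower()
    for keyword, child in node.branches.items():
        if keyword in text:
            return child
    return node.default or node


if __name__ == "__main__":
    print("Bot:", root.prompt)
    nxt = respond(root, "I've been feeling pretty sad and stressed")
    print("Bot:", nxt.prompt)
```

The key design point is that every reply moves the conversation down a fixed, therapist-designed path rather than generating free-form text, which is what makes this kind of bot predictable enough to study clinically.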

Man v. machine: Woebot's creator, psychologist Alison Darcy, says the bot isn't designed for diagnosis or intended to replace human therapy. Right now, people often have to choose between doing nothing and regularly seeing a psychologist; she says Woebot is one of the few options in between.

  • Woebot may even have an edge over human therapists in one respect: because of the stigma around receiving mental health care, people are as likely, if not more likely, to disclose mental health information to a computer as to a human.
  • But... "Mental health problems are more nuanced and with a bot it can be hard to detect changes in behavior, particularly when it comes to suicide," says UCSF's Danielle Ramo, who was not involved in the study. People are sometimes in a better mood once they've decided to take their own life, a shift a therapist might recognize as a distress signal but a bot may not.

Study limitations:

  • The authors note the small number of participants, all undergraduate students, which means the results can't be generalized to the broader population.
  • Short duration: students interacted with Woebot for only two weeks, and follow-up studies are needed to see whether the benefits last.
  • Woebot should be compared to in-person therapy to tease out whether the effect comes specifically from the bot or just from any interaction, as opposed to reading a website.