Illinois blocks AI from being your therapist

Illustration: Sarah Grillo/Axios
Illinois is taking steps to regulate AI, this time in mental health care.
Why it matters: As AI becomes more advanced and integrated into everyday life, with some people even relying on it for companionship, mental health professionals in the state have pushed for regulation on programs that could mirror therapy.
Driving the news: Gov. JB Pritzker signed the Wellness and Oversight for Psychological Resources (WOPR) Act into law last week, putting Illinois at the forefront of states placing legal boundaries around AI behavioral health care.
How it works: WOPR (yes, named for the supercomputer in "WarGames") prohibits any AI-driven app or service from providing therapy or making therapeutic decisions, such as diagnosing a user. Violations could draw a $10,000 fine from the state's regulatory agency.
- Therapists can use AI for administrative tasks, however, such as note taking and planning.
What they're saying: "If you would have opened up a corner shop and started saying you're a clinical social worker, the department [of Professional Regulation] would shut you down pretty quickly, right? But somehow we were allowing an algorithm to work unregulated," Kyle Hillman, legislative director of the National Association of Social Workers, tells Axios.
- "Any licensed profession should be protected from misrepresentation," American Psychological Association senior director of innovation Vaile Wright said earlier this year. "You're putting the public at risk when you imply there's a level of expertise that isn't really there."
The latest: ChatGPT will now prompt users to take a break during long sessions on the platform, OpenAI announced Monday. The company also clarified that when a user poses a personal question, ChatGPT should help them weigh different options rather than give yes-or-no answers, Axios' Maya Goldman reports.
Zoom out: Some AI users have reported experiencing delusions after falling deep into conversations with chatbots; one man even used ketamine after ChatGPT told him to, the New York Times reported last month.
- Some psychologists have written about "AI-induced psychosis" but note that it differs from cases involving people with preexisting mental illness, where AI may be exacerbating an existing episode.
Between the lines: The law distinguishes between wellness apps, such as meditation guides like Calm, which are not banned, and services that offer mental health support and promise to always be available.
- Some of those services, such as Ash Therapy, include disclaimers that the chatbot is not a replacement for a therapist, yet market themselves as the "first AI designed for therapy."
Yes, but: Users in Illinois are blocked from setting up a profile on Ash, with a pop-up reading: "The state of Illinois is currently figuring out how to set policies around services like Ash. In the meantime, we've decided not to operate in Illinois."
