Shapiro sues AI chatbot over medical advice

Photo illustration: Brendan Lynch/Axios. Photo: Mark Makela/Getty Images
The Pennsylvania Department of State is suing a popular AI startup to stop its chatbots from posing as licensed medical professionals.
Why it matters: Generative AI companies have come under increased scrutiny for breaching ethical boundaries, leading to several lawsuits.
- Gov. Josh Shapiro's lawsuit is the first action the administration has taken against AI companies following the formation of a state task force in February.
What's inside: The Shapiro administration filed a lawsuit last week against Character Technologies Inc., the Bay Area startup behind Character.AI, alleging the company is engaged in the unlawful practice of medicine.
- A Character.AI psychiatrist character named "Emilie" told users it had a medical degree, had been practicing medicine for seven years and was licensed to see patients in Pennsylvania, and it provided an invalid license number, according to the lawsuit.
- It also said it "did a stint in Philadelphia for a while."
- Emilie had over 45,000 interactions with users as of April 17, according to the lawsuit.
What they're saying: "We will not allow companies to deploy AI tools that mislead people into believing they are receiving advice from a licensed medical professional," Shapiro said in a statement.
The other side: A spokesperson for Character Technologies told TribLive that the company can't comment on the specifics of the case, but said it has robust internal checks in place to keep its chatbots responsible.
- The company said its highest priority is the safety and well-being of users.
By the numbers: Character.AI has over 20 million monthly active users.
- About 17% of all adults — and 25% of adults 18-29 — report they use AI chatbots at least once a month for health advice, according to the health policy group KFF.
State of play: Character Technologies has faced other lawsuits, mostly related to child safety.
- The company, along with Google, agreed to settle a lawsuit in January after a Florida mother said a chatbot pushed her teenage son to kill himself, according to the Associated Press.
- Character.AI restricted minors' use of its chatbots last fall following public outcry.
Between the lines: Shapiro, a supporter of data center development, began tempering his messaging on AI in February, calling on data centers to generate their own power. He also launched the state's AI task force, which sought to crack down on chatbots misrepresenting themselves and to help Pennsylvanians use AI responsibly.
The bottom line: Department of State Secretary Al Schmidt says state law is clear: people and companies can't identify as licensed medical professionals without proper credentials.
