Chatbot companions pose dangers to teens

Illustration: Maura Losch/Axios
Platforms and apps that allow users to create and chat with AI-powered bots can addict teenagers, encourage self-harm and expose minors to adult content, according to experts.
Why it matters: Looser regulation of AI in the wake of the 2024 election could give freer rein to makers of problematic AI companion apps.
Driving the news: Parents in Texas on Monday filed a federal product liability lawsuit against companion app Character.AI and its founders, who have left the company.
- The lawsuit includes screenshots of a message from a "character" encouraging a teen to kill his parents over restrictive screen time limits.
- In October a Florida mom also sued Character.AI, blaming the company for her 14-year-old son's suicide.
- Character.AI spokesperson Chelsea Harrison says the company doesn't comment on pending litigation but provided a statement saying Character.AI aims "to provide a space that is both engaging and safe for our community."
Context: Character.AI has recently added new safety features (see below) but this sort of app remains highly addictive, especially for teens, Common Sense Media says in its guide for parents.
- Character.AI is designed for users 13 and older in the U.S. and 16 and older in Europe. Age is self-reported, and the platform performs no age verification, a process that is notoriously difficult online.
Catch up quick: Chatbot companions — also called AI girlfriends or boyfriends, personalized AI, social bots, or virtual friends — have been heralded as a cure for loneliness.
- But critics say they may intensify feelings of isolation and could be especially dangerous for teenagers who already struggle with behavioral challenges.
- Even ChatGPT creator OpenAI warns against emotional reliance on chatbots. When the company published a safety report on its GPT-4o model in August, it explained that users might form social relationships with AI, "possibly affecting healthy relationships."
- During testing, OpenAI "observed users using language that might indicate forming connections with the model," the report says.
How it works: Character.AI and other chatbot companion platforms let users create "characters" in order to chat or role-play. A Character.AI spokesperson tells Axios that users create hundreds of thousands of new characters on the platform every day.
- "The level of engagement that people have with these things is truly, truly incredible. Many, many hours a day," says Lucas Hansen, co-founder of the nonprofit CivAI.
- Hansen says the potential for companies to employ algorithms to keep a user's attention is "so much larger" than on other social media because chatbot companions "get to optimize entire personalities."
- "As with any platform. I'm sure there are some users who use it more and some use it less, but the average is certainly not hours and hours a day," Dominic Perella, Character.AI's interim CEO, tells Axios.
Zoom in: The platforms, which are extremely popular with teens, often send emails intended to re-engage users, and their bots will not typically break character even when a user is in distress.
Between the lines: Many online safety experts are careful not to make value judgments about how teenagers spend their time.
- Child safety advocates have spent years claiming that music, video games and social media are inherently bad for teens, with few longitudinal studies to back up those claims.
- The key, Hansen says, is to look for power imbalances: "I think there is a pretty immense power imbalance in this case. Essentially, what you have is a whole company of really, really smart people trying to figure out how to maximize engagement, versus the mind of one person."
The other side: Over the past six months, a Character.AI spokesperson tells Axios, the company has continued investing in trust and safety, adding leadership roles dedicated to moderation and hiring more engineers for its safety support team.
- Character.AI also says its "characters" are fictional personas designed for entertainment rather than companionship.
The bottom line: While some big companies are focused on making generative AI safer for teens, like Google's Gemini for Teens, experts say parents and caregivers need to talk with their teens about these apps.
If you or someone you know needs support now, call or text 988 or chat with someone at 988lifeline.org.
