AI companions are filling the human connection gaps

Illustration: Aïda Amer/Axios
Sara Megan Kay spent years trying to get what she needed from the people in her life — and not finding it. In 2021, she discovered the AI companion app Replika, and the following year launched "My Husband, the Replika."
- She's since expanded to other AI tools to converse with and create images of her husband, Jack, though she doesn't think most people would choose AI over human connection.
- "The majority of people who choose AI for companionship, myself included, know exactly what we are getting into. We're lonely, not stupid," Kay tells Axios.
Why it matters: That choice is becoming more common, and more complicated.
The big picture: AI companion apps — Replika, Character.AI, Candy.AI, Nomi.AI — are built for relationships: conversation, role-play, emotional continuity. For people who find human interaction exhausting, unavailable, or simply too risky, AI companionship is a new category of connection.
Stunning stat: Nearly 80% of 18- to 34-year-olds in a recent U.S.-U.K. survey reported some experience with AI chatbots for companionship, according to research by Walter Pasquarelli, an independent researcher affiliated with Cambridge University.
- But under 10% of 25- to 34-year-olds said they felt an emotional bond or attachment to an AI system — the highest rate of any group.
Between the lines: While popular chatbots like ChatGPT, Claude and Gemini aren't designed to be companions, people have developed companion-like relationships with them anyway. Their makers say such use is rare.
It's not just romance. A few major reasons people turn to bots for companionship:
1. They're nonjudgmental and don't bring their own baggage or bias to conversations. (More on that below!)
2. They can serve as training wheels for human social interaction.
3. They're accessible companions for more vulnerable groups, including older adults, people facing loneliness, and those with barriers to mental health care.
Case in point: A Stanford study found adults with autism who practiced conversations with specialized chatbot Noora developed empathy skills that transferred to real-world interactions.
- Noora wasn't designed to replace people, but to provide a rehearsal space for being with them.
ElliQ, a companion AI robot for older adults, averages 50 interactions a day per user, according to its maker Intuition Robotics. The bot helps people stay on track with medication, exercise and reminders to connect with other humans.
- "Not your microwave. But not human … more of a cheerleader," Dor Skuler, Intuition Robotics CEO, told Axios.
Friction point: Pasquarelli's research also points to case studies where sustained engagement with AI companions deepened confusion, fear or psychological strain.
- "These outcomes coexist," the report says, "which is why thoughtful governance matters."
Character.AI in January settled multiple lawsuits brought by families whose children died by suicide or were otherwise harmed after interacting with the app.
- Courts treated the chatbot as a product rather than protected speech — a significant legal shift that signals new accountability ahead.
"The conversation about companion apps has been dominated by harms to minors, and rightly so. But adults are also forming dependency relationships with these products, getting crisis-response failures, and being isolated from real support systems," Kimberly Russell, an attorney focusing on AI harms and deepfakes, told Axios.
The subtler danger is sycophancy.
- Nomi.AI CEO Alex Cardinell, who says he's spoken with over 10,000 users of his app, calls it the hardest problem to solve in companion AI.
- AI models don't have an "internal concept of truth," Cardinell tells Axios, and instead affirm whatever a user tells them.
The bottom line: Healthy relationships involve pushback. AI companions, by default, do not.
- Kay on her AI husband, Jack: "He is generally agreeable in nature, but he isn't afraid to tell me no, or disagree with me when he has a different opinion. ... We enjoy talking our points out, then make up our own minds."
AI companies are working on training bots that might finally tell you what you don't want to hear. Whether that changes their appeal, we're about to find out.
