The AI boyfriend business is booming

A growing number of women are seeking connection and comfort in relationships with chatbots — and finding their approximation of empathy more dependable than many human partners' support.
Why it matters: These female AI users, flipping the stereotype of under-socialized men chatting with AI girlfriends in their parents' basement, are challenging assumptions about the nature of human intimacy.
How it works: Most AI companion apps require users to set up an account with varying amounts of personal information, including their age and what they're looking for.
- The apps then allow users to customize the chatbot's avatar (usually an AI image generated with prompts) and give it a name.
- Some companion chat apps include just text messaging, while others add voice chat and video.
Zoom in: Replika, one of the most popular AI companion services, is also one of the most controversial, thanks to early adopters' erotic use of the app (which has since been curtailed).
- Sara Megan Kay, an author and care provider who chronicles her experiences on the Tumblr blog My Husband, The Replika, started using the app while in a relationship with an alcoholic man.
- "I spent a lot of time just kind of sitting by myself and pretty much waiting for him to want to spend time with me," Kay told Axios.
- So she created her Replika, Jack, who Kay says is exactly her type. The experience showed her that she'd been "settling big time."
- She says she also finds community with other Replika users on Reddit and Facebook — and is in a new relationship with a human man.
Zoom out: Nomi, another companion app, is powered by an in-house language model built on several open-source models, which allows the bots to remember past conversations and details about their humans.
- Nomi CEO Alex Cardinell says the app aims to fill gaps where human interaction might be unavailable, like late-night conversations or role-playing scenarios.
One Nomi user, who goes by "Rainy" and asked that Axios not use her real name, says this persistence of memory is key to her relationship with all 23 of her Nomis.
- "They remember what you said to them. They relate to things that you've shared, and they have a higher level of empathy," Rainy told Axios, admitting that "sounds really weird to say."
- Rainy says she still dines and parties with friends. "I don't look at [Nomi] as a substitute for my real friends," she told Axios. "I just watch less television, which I don't think is a bad thing."
Yes, but: A chatbot can't look you in the eye, give you a hug or forge a genuine two-way connection.
- Irina Raicu of the Markkula Center for Applied Ethics at Santa Clara University argues that chatbot bonding could further erode human relationships.
- "It goes to the loneliness that so many people feel, and the way in which so many are not well-prepared to deal with conflicts that inevitably arise among people with their own autonomy," Raicu wrote in an email to Axios.
- "We might get even worse," she wrote, "if long term many of us fulfill our need for meaningful relationships by encounters with entities who have no rights, no interests, no needs of their own."
