Why it matters: The jury is still out on whether this is emotionally healthy, says Jason Thatcher, a CU Boulder information systems professor who studies how people interact with artificial intelligence.
The intrigue: In his research, Thatcher is exploring several kinds of relationships people form with AI chatbots, from supervisory roles and trusted research assistants to therapists and lovers.
It's well established that people tend to treat AI systems like ChatGPT, Claude, Google Gemini and Microsoft Copilot as humans, he tells Axios.
Users can become romantically entangled with the bots, which isn't necessarily a bad outcome. The bots can provide emotional relief, he says, and help address the nation's loneliness crisis.
What he's saying: "Absolutely, [users] can form emotional attachments to bots," he says.
"So imagine that your options are: A bot that says all the right things, that only gets better. Or a human who might be a little bit cranky. Who do you pick?"
The bottom line: For some users, a conversation is satisfying. For others, it's no substitute for physical touch, he says.