Most Americans don't want AI doing everything
Americans say they're comfortable with AI helping develop new medicines or forecast the weather, but most don't think it should play a role in relationships, religion or creative tasks, according to a new report released Wednesday from Pew Research Center.
Why it matters: Americans are starting to set boundaries for what they want to turn over to AI — and what they don't.
By the numbers: Americans see AI as a way to boost efficiency, but they remain wary of handing over decisions tied to values, relationships or democracy, Pew found.
- 74% of Pew respondents said AI should play a role in weather forecasting and 70% said it should help with detecting financial crime.
The other side: 66% said AI should not judge whether two people could fall in love.
- 60% oppose AI having a role in making decisions about how to govern the country.
The big picture: Since 2021, Pew has been asking participants how they feel about the increased use of AI in daily life.
- In 2023, 2024 and 2025, about half of Americans said they were more concerned than excited about it.
- That's up from 2021 and 2022, before the launch of ChatGPT, when only 37% and 38%, respectively, said they were more concerned than excited.
- Only a quarter of Americans now say the benefits of AI are "high" or "very high." Among that 25%, "efficiency gains" was the most commonly cited benefit.
Between the lines: Participants in the study also said AI will weaken core human skills.
- 53% of Americans believe AI will erode people's creativity.
- 50% say it will hurt users' ability to form meaningful relationships.
Stunning stat: Although Americans told Pew they want to put boundaries on AI's role in our lives, more than half said they're "not at all confident" or "not too confident" they can detect whether content was created by a human or a bot.
- Most Americans say it's "extremely" or "very important" to be able to determine if content is AI-generated.
The intrigue: Chatbot interfaces are so open-ended that users who turn to the tools for research may be drawn into using them for more personal, creative or emotional questions.
- Anthropic studied how people use its Claude chatbot and found that "in longer conversations, counseling or coaching conversations occasionally morph into companionship — despite that not being the original reason someone reached out," per a report released in June.
What they're saying: Physicians and educators tell Axios that there are real risks in inviting chatbots deep into our personal, creative and religious lives.
- AI is "going to fundamentally change us as a human species," pediatric physician Dana Suskind told Axios in a July interview. "I think we need to be incredibly intentional, because we may end up in a place that we never imagined."
- It's "not just writing, but reading too, that's being affected so profoundly by these technologies," says Bruce Holsinger, an English professor at the University of Virginia and author of "Culpability," a novel about a family's struggles with the impact of AI.
- Instead of "slowing down and thinking and breaking apart a sentence for ourselves... you just pop a PDF in and say, 'Tell me the main points. What are the takeaways?'" Holsinger told Axios in an interview last month.
State of play: Americans are cautiously experimenting with AI, but the Pew data mirrors a growing skepticism of chatbots used for creative pursuits, therapy and intimate companionship, especially when it comes to teens and children.
- AI makers say their tools are "great for brainstorming," but new studies find that chatbots produce a more limited range of ideas than a group of humans would.
- The FTC is investigating AI chatbot safety, demanding details from seven companies — including OpenAI, Meta, Google and xAI — over potential harms to kids.
- After repeated reports of chatbots causing or intensifying delusional behaviors and suicidal thoughts, OpenAI says it's working to improve how its models recognize and respond to signs of mental and emotional distress.
