Washington state considers sweeping new AI regulations

Illustration: Lindsey Bailey/Axios
Regulating artificial intelligence — especially how it affects children and teens — has become a key issue as Washington's Legislature begins its new session Monday.
Why it matters: As AI use rapidly expands, some lawmakers say tech companies haven't put adequate safeguards in place to protect people from harm — particularly minors, who may be more easily swayed by long conversations with AI chatbots.
What they're saying: "We have heard really horrific stories of chatbots talking to young people about things like suicide, drug use, abuse of other people," state House Majority Leader Joe Fitzgibbon said at a Seattle CityClub panel Thursday.
- He called new proposals to regulate chatbots in Washington state "something we should take very seriously."
Zoom in: One of the bills Washington lawmakers are weighing, requested by Gov. Bob Ferguson, would set standards for how AI "companion chatbots" interact with users.
- Among other provisions, chatbots would be required to refer users who express suicidal thoughts to a suicide hotline or crisis text line, such as the 988 system.
- The proposal includes added protections for minors, such as limits on sexually explicit interactions and a ban on "manipulative engagement techniques" that "prolong an emotional relationship with the user."
The big picture: The push in Washington state comes after some families around the country have sued OpenAI, the maker of ChatGPT, saying the chatbot contributed to their loved ones dying by suicide.
- This month, Character.AI and Google settled similar lawsuits involving their chatbots.
- In several cases, families allege that the chatbots encouraged or reinforced their children's suicidal thoughts at crucial moments, rather than steering them toward outside help.
Between the lines: "This is putting the industry on notice that we're watching, we're looking and we care — and we think they need to step up," state Sen. Lisa Wellman (D-Mercer Island), who is sponsoring the governor's chatbot bill, told Axios.
- Another proposal from Wellman would create civil liability for suicide linked to the use of AI companion chatbots.
The other side: OpenAI told CNN in November that it has been updating its model in recent months "to better recognize and respond to signs of mental or emotional distress."
- "We believe ChatGPT can provide a supportive space for people to process what they're feeling, and guide them to reach out to friends, family, or a mental health professional when appropriate," the company said in a written statement.
- Character.AI, meanwhile, decided late last year to block minors from talking to its chatbot.
Plus: Additional proposals in Olympia this year would limit schools' use of AI for student discipline or surveillance, and give people more control over the use of their "forged digital likeness."
- The digital-likeness proposal is sponsored by a Republican, state Sen. Matt Boehnke of Kennewick, showing there's some interest in regulating AI on both sides of the political aisle.
Yes, but: Boehnke told Axios that while he thinks his bill — which would let people sue over AI-generated deepfakes — proposes important protections, other measures floated by Democrats would go too far.
- He said overly broad regulations risk choking off innovation that could one day lead to breakthroughs such as cancer cures.
- "I'm worried that we're going to be stifling some of that next generation," Boehnke said.
What's next: A different proposal before Washington lawmakers would govern the use of AI in high-stakes decisions like hiring and college admissions, while requiring safeguards to reduce the risk of "algorithmic discrimination."
- A version of that bill — sponsored by state Rep. Cindy Ryu (D-Shoreline) — is scheduled for a public hearing Wednesday before a House committee.
If you or someone you know may be considering suicide, call or text the National Suicide Prevention Lifeline at 988. Ayuda disponible en español.
