How AI turned a Utah police officer into a frog

Illustration: Rebecca Zisser/Axios
Artificial intelligence is increasingly affecting who police stop, how reports are written, where officers patrol and how evidence is analyzed.
- And in Utah, it claimed an officer turned into a frog.
Catch up quick: An AI program generated the shapeshifting incident report for Heber City Police last month, FOX 13 reported.
- The officer's body camera picked up background noise at a scene where the Disney film "The Princess and the Frog" was playing, police said.
- The movie dialogue flummoxed the department's AI writing program, which concluded in its summary that the officer had become a frog.
What they're saying: "That's when we learned the importance of correcting these AI-generated reports," Sgt. Rick Keel told FOX 13.
Driving the news: Departments from San Francisco to Georgia are piloting generative-AI tools that turn body-cam audio into written narratives, shaving hours off paperwork and getting officers back on patrol amid staffing shortages.
Zoom out: Those tools join AI-adjacent technologies such as drones, license-plate readers and gunshot-detection systems.
Context: "When a technology wave hits law enforcement, it hits it hard," said Guillaume Delepine, CEO of the startup Longeye, which is testing AI systems that comb through hours of jail phone calls, interviews and police footage for evidence.
- About 70% of investigators say they don't have time to review all the digital evidence they collect, leaving critical phone calls, interviews and device data untouched, Delepine told Axios.
Yes, but: While AI promises efficiency, its rapid spread could embed errors deep into the criminal justice system — and not just fairytales.
Threat level: Civil liberties experts warn that AI tools risk reinforcing bias — and it's unclear who controls the data collected by AI systems without legal guardrails.
- "The digital space has evolved in a way that wasn't necessarily anticipated when (privacy) laws were implemented," Beryl Lipton, a researcher at the Electronic Frontier Foundation, tells Axios.
Case in point: Austin, Texas, paused a plan for AI-enhanced camera systems in parks — that would "analyze behavior" to spot potential crimes — over questions about civil liberties and efficacy, per KUT.
The bottom line: "We tell our agencies, 'Don't bring the chatbot to court,'" Delepine said. "'Use the chatbot to find the actual moment in the evidence that matters, and bring that to court.'"