AI hallucinations rock Iowa, federal courts

Illustration: Lindsey Bailey/Axios
The Iowa Supreme Court Attorney Disciplinary Board recently asked justices to strike multiple filings from a former Polk County attorney candidate, alleging he cited an "imaginary case" — apparently generated by AI — in at least three separate filings.
Why it matters: The Des Moines case shows how concerns about AI-generated "hallucinations" have extended to lower courts, beyond high-profile federal cases.
- When fake citations creep into legal filings, the risk extends beyond lawyers' reputations — it can disrupt cases, waste judicial resources and deny parties fair outcomes, former Iowa Supreme Court Justice Brent Appel tells Axios.
State of play: The Iowa case involves Des Moines attorney Royce David Turner, the Iowa Capital Dispatch reports.
- Turner is accused of using at least one fake citation in an attempt to regain his professional license, which was suspended in 2018 for allegedly mishandling cases and not following court directives, according to the Dispatch.
- He did not respond to Axios' request for comment last week.
Context: Turner legally changed his first name from Royce to David in 2020, court documents reviewed by Axios show.
- His name was removed from the 2022 Democratic primary ballot following a challenge over voter signatures and over his eligibility, given his lack of a valid law license.
Catch up quick: Some generative AI tools like ChatGPT don't search legal databases. They predict text patterns, which means they can produce plausible-looking but fabricated case names, citations or quotes when prompted, according to LexisNexis, a legal research platform.
- The "hallucinations" often appear authentic — formatted with case numbers and court jurisdictions — making them easy to overlook if an attorney doesn't double-check.
- Some of the most well-known hallucination cases involve President Trump's former personal attorney, Michael Cohen, and lawyers for the CEO of MyPillow.
The intrigue: Academic research shows that even legal-focused AI tools can still hallucinate.
Driving the news: Federal judges are increasingly cracking down on attorneys who cite fake AI cases, with fines reaching up to $15,000, according to an article published last month by Law360, a legal news service.
What they're saying: Appel, who is now a lecturer at Drake University's law school, says AI is helpful for summarizing lengthy records but cautions students that human verification remains essential.
The big picture: Some judges are calling for harsher professional penalties for those who fall victim to AI hallucinations, according to Law360.
- "In the court's view, it demands substantially greater accountability than the reprimands and modest fines that have become common as courts confront this form of AI misuse," U.S. District Judge Anna M. Manasco of the Northern District of Alabama wrote in a recent ruling.
