Axios Vitals

March 16, 2026
Welcome back, Vitals gang. Today's newsletter is 864 words, a 3.5-minute read.
1 big thing: Autism therapy boom spurs crackdown
States and insurers that administer Medicaid are scrutinizing autism therapy after spending quadrupled over five years amid a proliferation of fraudulent billing.
- The scrutiny is also driven by private equity expansion and reports of subpar care.
Why it matters: Moves to cap payments and drop some providers from Medicaid mark a pivotal moment for the program, but they could leave some patients and their families in the lurch.
State of play: The focus is a form of intensive, personalized therapy called applied behavior analysis that's mostly aimed at young children to help them learn social skills and reinforce positive behavior.
- Demand for the services has surged, along with Medicaid reimbursement rates, which are higher than those for other behavioral health services, the Wall Street Journal recently reported.
- The surge is causing a fiscal crunch in states like Nebraska, where spending on the therapy grew 2,000% in five years.
- Private equity firms have acquired more than 500 autism therapy centers over the past decade, mostly in states with higher rates of autism diagnoses that have fewer limits on insurance coverage.
Providers say the lack of widely accepted guidelines for autism care and the fee-for-service reimbursement system are creating openings for overbilling.
- "There are some children who need a 25-, 30-, 35-hour program. A lot do not," said Neil Hattangadi, CEO of Cortica, an autism care company that offers applied behavior analysis alongside other services.
- Federal health watchdogs have identified tens of millions of dollars in improper payments for applied behavior analysis in Colorado, Wisconsin and Indiana.
Zoom in: Private equity-backed Action Behavior Centers billed Colorado's Medicaid program for some of the most hours per patient of any autism provider, a Wall Street Journal review of Medicaid data found.
- There's also been outright fraud: A Minnesota woman pleaded guilty this month to taking $14 million in Medicaid funds through a fraudulent autism care scheme.
2. The promise and peril of AI scribes
Billions of dollars have poured into AI transcription and billing startups on the promise of better care, lower costs, happier clinicians and healthier patients.
- But some tools are hallucinating procedures and episodes, entering them into medical records and creating more tedium for clinicians — and in some cases, actually increasing hospital spending, Erin Brodwin wrote first on Pro.
Driving the news: A recent Blue Cross Blue Shield Association study found AI coding and billing platforms were inflating outpatient diagnoses and reimbursement, adding a potential $663 million in costs within a year.
- The study also found an increase in coding for acute posthemorrhagic anemia, a condition that typically requires treatment like a blood transfusion.
- The data analyzed also showed many patients flagged with this diagnosis never received blood transfusions.
Inside the room: One patient told Axios Pro that after a recent appointment at Johns Hopkins — which she understood to have been recorded by the AI scribe Abridge — her medical record included an abdominal exam that never happened.
AI coding and billing tools "may be causing a mismatch between the care patients receive and the bills hospitals send to insurance companies," the Blue Cross Blue Shield Association study noted.
If you need smart, quick intel on health tech dealmaking for your job, get Axios Pro.
3. Microsoft unveils always-on medical assistant
Microsoft is joining the push into AI-enabled health care with a service that lets users upload electronic health records and device data to a portal that delivers personalized advice.
The big picture: The tech giant's Copilot Health arrives as OpenAI, Amazon and others expand their medical chatbot offerings.
Driving the news: Microsoft likens its service to an always-on assistant that can synthesize records, lab results and data from fitness devices and answer questions about sleep, activity or vital signs without making a formal diagnosis.
- The catch: People have to be willing to hand over their full medical histories to an AI system.
- The landmark privacy law HIPAA doesn't apply to AI chatbots. Microsoft says the data will be encrypted and will not be used to train its AI models.
How it works: Copilot Health can draw on records from more than 50,000 U.S. health providers and data from 50 different types of wearable devices — including Apple Health, Oura and Fitbit.
- For now, the service will be free and limited to U.S. users. Microsoft eventually sees it as a paid service.
4. Quote du jour
"When examining the durations of restaurant and bar closures, we find that the closures were shorter when states relied more heavily on sales tax revenue and alcohol tax revenue, respectively." — A new analysis in Contemporary Accounting Research on how state tax collections may have influenced COVID-19 stay-at-home orders and other health restrictions. (h/t CIDRAP)
5. While you were weekending
📢 The White House is taking a more active role controlling HHS messaging and policies following miscues, including grant cuts for mental health. (WSJ)
👾 Medical device giant Stryker is trying to restore operations after an Iran-linked cyberattack disrupted its manufacturing and shipping. (MedTech Dive)
🐓 To combat bird flu spread, other countries have authorized poultry vaccines, but the U.S. has held off amid political and economic opposition. (ProPublica)
Thanks for reading Axios Vitals, and to editors Adriel Bettelheim and David Nather and copy editor Matt Piper. Please ask your friends and colleagues to sign up.