Don't get used to cheap AI

Illustration: Brendan Lynch/Axios
AI may never be as cheap to use as it is today.
Why it matters: AI companies are hooking users with low prices that won't last — straight out of the Amazon and Uber playbook.
The big picture: The push to show profits before IPOs could end the era of cheap AI.
- "These LLM companies are going to go public and they're going to raise prices because they have to," May Habib, CEO of Writer, told Axios.
State of play: New models from OpenAI, Google and Anthropic are generally getting faster and cheaper.
- The industry was fixated on training chips. Now Nvidia and its rivals are focused on inference — the computing that lets models answer your questions.
- Aggregate token pricing — the cost of generating text — has fallen, partly due to a massive efficiency jump in inference.
Driving the news: Nvidia is expected to unveil a more efficient AI chip at its developer conference next week, according to reports.
- As prices fall, usage is surging — and total corporate AI spending is rising, according to Ramp, which tracks business expenses.
Zoom in: Yet margins are still negative for AI labs, according to PitchBook.
- OpenAI is projected to burn $14 billion in 2026, up from the $8 billion to $9 billion it is expected to burn in 2025.
- Anthropic's margins have swung from -94% in 2024 to about +40% in 2025, though they remain pressured by higher-than-expected inference costs.
Zoom out: Fierce competition has pushed labs to price aggressively, squeezing profits.
- In February, 90% of VC funding dollars went to AI startups. OpenAI and Anthropic alone captured 74% of those dollars, according to Crunchbase.
- Labs also get discounted compute through strategic partnerships — sometimes described on Wall Street as circular financing.
- Even with those discounts, OpenAI and Anthropic are still losing money. Microsoft reportedly provides OpenAI compute at below-market rates.
Between the lines: Every time you send a complex query, the AI lab is effectively losing money on the transaction.
- Free accounts come with limited token allowances, which expand when you sign up for a standard consumer subscription. But those low-cost subscriptions are among the most heavily subsidized.
- In February, 28% of corporate OpenAI chat spending flowed through personal consumer plans rather than higher-margin enterprise tiers, according to Ramp.
- Ramp's data shows Anthropic capturing the majority of tracked business AI spend.
Flashback: Silicon Valley has seen this movie before.
- The so-called "millennial lifestyle subsidy" meant VC money helped underwrite cheap Uber rides and DoorDash deliveries.
- Before that, Amazon built its base with low prices, free shipping and, for years, no sales tax in most states.
- Eventually, all of these companies had to charge enough to cover costs — and make a profit.
Follow the money: The current iteration of AI subsidies won't last forever.
- Both OpenAI and Anthropic are widely expected to go public. Public investors will demand earnings growth and expanding margins.
- Even as chips get more efficient, total spending keeps rising. Labs need more capacity, more upgrades and more supply to meet demand.
The bottom line: The costs of AI will keep going down.
- But total spend from customers will need to keep going up if AI companies are going to become profitable and investors are ever going to get returns on their massive investments.
