DeepSeek erodes AI industry's "size is everything" faith

Illustration: Allie Carl/Axios
The first big casualty of the stock market's DeepSeek scare — aside from a few hundred billion dollars in frothy Nvidia valuation — is the AI industry's religion of scale.
State of play: Ever since the advent of ChatGPT two years ago, U.S. tech firms, led by OpenAI, have shared the belief that AI will keep improving as long as we keep throwing more chips, money, power and data at it.
OpenAI CEO Sam Altman preached this gospel last year, writing that machine learning gets "predictably better with scale."
- "To a shocking degree of precision," Altman wrote in his "The Intelligence Age" essay, "the more compute and data available, the better it gets at helping people solve hard problems. ... AI is going to get better with scale, and that will lead to meaningful improvements to the lives of people around the world."
The rest of the industry largely agreed.
- Nvidia's high-end chips became coveted and scarce as its stock price skyrocketed.
- OpenAI's backer Microsoft, its chief rival Google, and cloud giant Amazon all got busy with colossal infrastructure investments.
- A macho race to build bigger, more energy-hungry data centers began. Elon Musk's xAI built a big computing cluster in Tennessee in what he claimed was record time.
But over the past year, the payoffs from this race to scale up have grown elusive.
- Some of the most significant advances in pushing the boundaries of AI have been made not by making it bigger but by building it differently.
- That includes OpenAI's most recent leap forward in the form of its reasoning model, o1.
- This new generation of AI models can solve more complex problems than their predecessors but not because they're bigger or have been trained more. Instead, they strategically use extra time and computing resources while figuring out their answers to users' questions.
Over the past few months, DeepSeek, a China-based research lab, has been steadily matching many of OpenAI's achievements on what it says is a fraction of OpenAI's budget.
- The DeepSeek R1 model, released last week, performs comparably to OpenAI's o1. The company has publicly released all its models for free download and use.
Why it matters: Maybe we can get what we want from AI without spending hundreds of billions of dollars on infrastructure or choking the planet with CO2.
The bottom line: As businesses and investors rethink the future, they're squeezing some air out of the AI bubble — and no one likes to see their portfolios lose value.
- But it's not as though OpenAI, DeepSeek and everyone else aren't still going to need to buy Nvidia's chips and build more data centers.
- They just may not need nearly as much as everyone assumed until Monday.
