Anthropic bites back in the compute wars with Amazon partnership

Illustration: Allie Carl/Axios
Anthropic is expanding its partnership with Amazon, committing more than $100 billion over the next decade to secure massive new computing capacity.
Why it matters: Compute capacity is the currency of the AI race and is most likely to define who wins.
Driving the news: Anthropic agreed to spend $100 billion to secure up to 5 gigawatts of compute from Amazon to train and run its Claude models.
- Amazon will invest $5 billion now, with the option for up to $20 billion more — deepening its stake in Anthropic.
Between the lines: Anthropic is signaling it's ready to spend heavily on the same infrastructure edge that its biggest competitor, OpenAI, has been touting.
- OpenAI sent a letter to investors last week pitching its compute capacity as its competitive advantage over Anthropic.
- Anthropic, for its part, has pointed to a wave of partnerships aimed at expanding access, now including this latest Amazon announcement.
Catch up quick: Compute is finite and must be split between serving customers and training new models, so it determines both how well a lab's current model runs and how capable its next one will be.
- When demand spikes, as it has for Anthropic's Claude Code, that capacity is strained.
- Anthropic changed its enterprise pricing in response, charging its heaviest users more. Some consumers have reported degraded Claude performance, which they attribute to compute constraints.
What we're watching: How long AI labs will keep relying on compute partners that are also their competitors.
- Amazon has its own AI race to win.
- That may become a problem for Anthropic over time, potentially forcing it to spend even more money on its own compute.
The bottom line: To monitor the AI race, watch compute capacity.
