Amazon's "any model you want" strategy

When it comes to generative AI, Amazon Web Services is doubling down on the bet that enabled it to become the leader in cloud computing: focusing on cost, security and flexibility rather than a flashy set of proprietary tools and features.
Why it matters: The Amazon approach, which has continued under new AWS CEO Matt Garman, stands in sharp contrast to rivals Microsoft and Google, which continue to invest billions of dollars in an array of models and services.
Garman, who joined Amazon as an intern in 2005 and was AWS' first product manager, describes Amazon's approach as a deliberate choice rather than a result of being late to the generative AI game.
- "Everybody launched chatbots," Garman said. "We took a step back and said, 'What is the infrastructure and the platform that we think that our broad swath of millions of enterprises and startups are really going to want to build on?'" Garman told Axios.
Zoom in: That reflection, he said, led the company to invest more in areas outside of the underlying large language models themselves.
- "We thought there'd be a lot of models, and people want to use small models and big models," Garman said. "Customers' data and their IP was going to be the thing that ultimately differentiated whatever they built versus whatever everyone else built."
Amazon also looked early for cost savings so it could offer customers the most bang for the buck.
- "We knew that cost and performance were going to be incredibly important there, and so five years ago, we started building our own custom silicon to support AI models."
The big picture: As in the broader cloud computing landscape, Amazon finds itself competing against Google and Microsoft.
- And, as in the broader market, Google and Microsoft offer a wider array of generative AI services including chatbots, APIs and tools that plug into their respective productivity suites, an area where Amazon doesn't compete.
- But Garman sees those bets as risky, especially in the case of Microsoft, which is not only betting on a particular set of models, but also on ones developed by OpenAI, a company Microsoft also identifies as a rival.
Between the lines: Amazon's approach may not be as sexy or headline-grabbing, but Garman says customers are seeing benefits in Amazon Bedrock, as the company has branded its garden-of-models approach.
- "Over the last nine months, we have just seen a real explosion in people building on top of Bedrock and building on top of AWS," he said. "The business has really taken off."
Fun fact: "Nvidia actually runs all of their training workloads on AWS because we have the best infrastructure for doing that," he said.
Zoom out: Garman is optimistic about all the ways generative AI can improve business, particularly as costs come down.
- "It's the ability to actually reduce costs by an order of magnitude, I actually think, that opens up the number of use cases by multiple orders of magnitude."
Yes, but: He acknowledges that generative AI's massive power consumption is a big issue. While data centers still represent a fraction of overall electricity consumption, he said that demand for power is increasing faster than renewable energy sources are coming online.
- "Nuclear is probably the answer that the world is going to need to rely on," he said. "Because I do think that we will have a hard time building solar farms and wind farms and things at the rate that the world wants to continue to use [electricity]."
