
Microsoft on Wednesday announced two homegrown chip designs that will be put to work in its data centers alongside processors from Intel, AMD and Nvidia.
Why it matters: Cloud powerhouses, including Amazon, Google and others, are increasingly offering customers their own silicon options in addition to those created by traditional chipmakers.
Details: At its Ignite conference, Microsoft announced the Microsoft Azure Maia 100 AI Accelerator along with the Azure Cobalt 100 CPU, an ARM-based chip for other cloud work.
- The Maia chip is designed to "power some of the largest internal AI workloads running on Microsoft Azure," Microsoft said, adding that it is also working with OpenAI to use the chip with that company's technology.
- The chips are being made for Microsoft by TSMC using its 5-nanometer manufacturing process.
Between the lines: Microsoft made clear it isn't looking to the new chips to replace existing suppliers, but rather to offer customers another choice.
- Nvidia chips, in particular, have been in short supply amid strong demand for AI processing power.
- Microsoft said the "100" in the names signals that Maia and Cobalt are each intended to be the first in a family of processors.
What they're saying: "Software is our core strength, but frankly, we are a systems company," Microsoft VP Rani Borkar said in a blog post. "We have visibility into the entire stack, and silicon is just one of the ingredients."
The big picture: Microsoft also used the conference to make a number of other AI announcements, including:
- Debuting Copilot AI Studio, which lets Microsoft 365 customers create their own custom AI chatbots — much as OpenAI is allowing its customers to create custom versions of ChatGPT.
- Rebranding Bing Chat as Microsoft Copilot and giving the OpenAI-powered chatbot a dedicated website, while Bing will continue to serve as a home for those who want a combined search/chatbot experience.