Ignite 2023: Microsoft Announces Two Custom AI Chipsets for the Cloud

Azure Maia 100 and Azure Cobalt 100 chips

During its Microsoft Ignite 2023 keynote address this morning, Microsoft announced Azure Maia and Azure Cobalt, its first two custom chipsets that will power Azure’s AI infrastructure.

“In this new era of AI, we are redefining cloud infrastructure, from silicon to systems, to prepare for AI in every business, in every app, for everyone,” Microsoft general manager Omar Khan said. “We’re introducing our first custom AI accelerator series, Azure Maia, designed to run cloud-based training and inferencing for AI workloads such as OpenAI models, Bing, GitHub Copilot, and ChatGPT. And we’re introducing our first custom in-house CPU series, Azure Cobalt, built on Arm architecture for optimal performance/watt efficiency, powering common cloud workloads for the Microsoft Cloud. From in-house silicon to systems, Microsoft now optimizes and innovates at every layer in the infrastructure stack.”


The Azure Maia 100 is the first generation of Microsoft’s custom AI accelerator series, and with over 100 billion transistors, it’s one of the largest chipsets ever built on a 5-nm manufacturing process. The specifics are vague, but Microsoft says that Maia will provide Azure’s AI infrastructure with “end-to-end systems optimization tailored to meet the needs of groundbreaking AI such as GPT.”

The Azure Cobalt 100 is likewise the first generation of Microsoft’s custom in-house CPU series, and its Arm architecture delivers strong performance-per-watt efficiency. The chip is a 64-bit design with 128 cores that Microsoft says delivers a 40 percent performance improvement over the current generation of Azure Arm chipsets. It’s so good that it’s already powering Microsoft Teams, Azure SQL, and other Microsoft services.

Related to these chipsets, Microsoft also announced that Azure Boost—its service for offloading virtualization processes onto dedicated hardware—is now generally available. Using the service, it’s now possible to achieve 12.5 GB/s of throughput and 650,000 input/output operations per second (IOPS) in remote storage performance, plus 200 GB/s in networking bandwidth (up from 10 GB/s and 400,000 IOPS in the preview).
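To put those Azure Boost numbers in perspective, here is a quick back-of-the-envelope calculation of the preview-to-GA improvement, using only the figures quoted above (a simple sketch, not anything Microsoft published):

```python
# Azure Boost remote storage performance: preview vs. general availability,
# using the figures Microsoft quoted.
preview = {"throughput_gbps": 10.0, "iops": 400_000}
ga = {"throughput_gbps": 12.5, "iops": 650_000}

def pct_gain(new: float, old: float) -> float:
    """Percent improvement of new over old."""
    return (new - old) / old * 100

throughput_gain = pct_gain(ga["throughput_gbps"], preview["throughput_gbps"])
iops_gain = pct_gain(ga["iops"], preview["iops"])

print(f"Throughput: +{throughput_gain:.0f}%")  # +25%
print(f"IOPS: +{iops_gain:.1f}%")              # +62.5%
```

In other words, throughput rose 25 percent and IOPS rose 62.5 percent between the preview and general availability.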


Thurrott © 2024 Thurrott LLC