
Microsoft AI chip ‘Maia AI accelerator’ announced at Ignite conference

Microsoft has its own AI silicon

Reviewed By: Kevin Pocock

Last Updated on November 16, 2023

The new Microsoft AI chip revealed at the Microsoft Ignite conference will “tailor everything ‘from silicon to service’ to meet AI demand”. With Microsoft data centers receiving a world-leading upgrade in the form of NVIDIA’s latest HGX H200 Tensor Core GPUs, plus a new proprietary AI accelerator chip, the Microsoft Azure cloud computing platform becomes the one to beat.

Microsoft AI chip announced at the Microsoft Ignite conference

In fact, two new chips emerged from Microsoft’s secretive development labs this week: “the Microsoft Azure Maia AI Accelerator, optimized for artificial intelligence (AI) tasks and generative AI, and the Microsoft Azure Cobalt CPU, an Arm-based processor tailored to run general purpose compute workloads on the Microsoft Cloud.”

These two custom-designed chips strengthen the tech giant’s position in the AI race against NVIDIA and AMD, both of which control their own AI hardware supply chains.

Steve says

Microsoft Azure takes the lead

The new proprietary Maia AI accelerator chip, in addition to upgraded HGX H200 servers at Microsoft data centers, will make the Azure cloud computing platform the most powerful of its kind anywhere in the world.

Scott Guthrie is the Executive Vice President of Microsoft’s cloud & AI group. At Microsoft Ignite on Wednesday November 15th, Guthrie revealed that “Microsoft is building the infrastructure to support AI innovation, and we are reimagining every aspect of our data centers to meet the needs of our customers.” He continued that “at the scale we operate, it’s important for us to optimize and integrate every layer of the infrastructure stack to maximize performance, diversify our supply chain and give customers infrastructure choice.”


The Microsoft Azure Maia AI Accelerator

Ever since the release of OpenAI’s ChatGPT, tech firms around the world have been scrambling for resources, both human and technological, to keep pace in the AI race. Google has Bard with its PaLM 2 LLM (large language model). xAI now has Grok in beta testing as we speak. Amazon is rumored to be secretly training the largest model yet, codenamed Olympus. Microsoft itself has Bing Chat, powered by the same GPT-4 LLM as ChatGPT, in addition to various other AI-powered services such as Microsoft Copilot and the Azure OpenAI Service. The custom silicon, which will form an integral part of Azure’s end-to-end AI architecture, gives the firm a vertically integrated approach to its AI services.

“We have visibility into the entire stack, and silicon is just one of the ingredients.”

Rani Borkar, CVP for Azure Hardware Systems and Infrastructure

Microsoft also demonstrated “a preview of the new NC H100 v5 Virtual Machine Series built for NVIDIA H100 Tensor Core GPUs, offering greater performance, reliability and efficiency for mid-range AI training and generative AI inferencing. Microsoft will also add the latest NVIDIA H200 Tensor Core GPU to its fleet next year to support larger model inferencing with no increase in latency.”

The implementation of these cutting-edge AI chips will speed up inference for AI workloads by as much as 1.9x, even compared to the relatively recent H100 servers.
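To put that figure in concrete terms, a quick back-of-the-envelope sketch shows what a 1.9x inference speedup means for response times. The baseline throughput below is a hypothetical number chosen for illustration, not a published benchmark; only the 1.9x multiplier comes from the announcement.

```python
# Illustrating the quoted "up to 1.9x" inference speedup.
# baseline_tokens_per_sec is hypothetical, not a measured H100 figure.
baseline_tokens_per_sec = 1_000      # hypothetical H100-class throughput
speedup = 1.9                        # upper bound quoted for the H200 upgrade
upgraded_tokens_per_sec = baseline_tokens_per_sec * speedup

# Time to generate a 500-token chatbot response on each platform:
tokens = 500
print(f"Before upgrade: {tokens / baseline_tokens_per_sec:.2f}s")
print(f"After upgrade:  {tokens / upgraded_tokens_per_sec:.2f}s")
```

Under these illustrative numbers, a response that took half a second would complete in roughly a quarter of a second, which is the kind of latency headroom that matters when serving larger models without slowing users down.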

Image: NVIDIA HGX H200 performance results for modern LLMs.

Steve is an AI Content Writer for PC Guide, writing about all things artificial intelligence. He currently leads the AI reviews on the website.