
“We have more than enough” Nvidia dismisses rumors of short supply for its AI chips

The green team throws cold water on rumors about H100 and H200 shortages

NVIDIA has pushed back against recent media reports suggesting it could not keep up with demand for its H100 and H200 GPUs, its most in-demand AI chips. In a statement posted on X, the company clarified that it has ample supply of both models and can fulfil all incoming orders without delays. 

The company’s stock price has come under some pressure recently (relatively speaking, of course; it’s still the biggest company in the world by market cap) following a lukewarm reception to its last earnings report and broader talk of an AI bubble, so it’s understandably been keen to squash these rumors.

The company stressed that the production and sale of the H20 chip (a China-specific GPU designed to comply with export rules) has no bearing on the availability of its flagship H100, H200, or newer Blackwell chips. NVIDIA emphasized that these product lines are manufactured and allocated independently. Rumors of shortages stemmed from reports that China’s AI sector was shifting towards repurposing older NVIDIA GPUs following U.S. restrictions on high-end hardware. There have also been reports of Chinese firms ditching the H20 in favor of domestic AI chips.

Nvidia’s H-series GPUs explained

The H100, H200, and H20 are all part of NVIDIA’s Hopper architecture, named after computer science pioneer Grace Hopper, which succeeds the earlier A100 (Ampere) GPUs. The architecture is specifically built for training and running massive AI models, such as large language models (LLMs), generative AI, and scientific simulations. Compared to the A100, the H100 is roughly three to six times faster in many AI workloads, particularly in training large models.

NVIDIA H100

The original flagship AI accelerator for global markets, the NVIDIA H100 is often described as the ‘engine of the AI boom’, enabling companies to train and deploy cutting-edge AI models at scale. Key features include a Transformer Engine, HBM3 memory, and NVLink support, plus massive speedups for generative AI and HPC tasks. It is widely adopted in data centers and cloud platforms.



NVIDIA H20

The H20 is a China-specific variant of NVIDIA’s Hopper line, created to comply with U.S. export restrictions. The card has lower performance than the H100, with reduced interconnect bandwidth and fewer active cores. Despite this, it still supports advanced AI workloads, offering Chinese firms a solid alternative.

NVIDIA H200

The H200 is an upgraded Hopper GPU that succeeds the H100, adding HBM3e memory for significantly higher bandwidth and capacity, which improves performance in training and inference of large models. While the architecture remains similar, memory expansion makes it more efficient at handling the growing size of next-generation AI workloads.



About the Author

Aaron's laptop knowledge makes him the go-to guy on PC Guide. But he still finds time for features, deals and much more.