OpenAI’s Chief Executive Officer is branching out into a new AI hardware venture. While continuing to lead the world’s best-known AI firm, Sam Altman aims to build a ‘global network’ of AI chip manufacturing sites. These semiconductor fabrication plants (fabs) will produce the hardware that powers the artificial intelligence of the future.
Could OpenAI make 2024 the year of AI hardware?
OpenAI CEO Sam Altman intends to create a global infrastructure for the manufacture of AI chips. AI chips are specialized hardware designed to run artificial intelligence workloads efficiently. Typically this means enterprise-grade graphics processing units (GPUs), like the NVIDIA HGX H200 Tensor Core GPU, that can be strung together in their thousands. The chief executive has already held conversations with a number of investors, although the contents of those discussions remain private. Bloomberg reports that Abu Dhabi-based G42 is among them.
AI chatbots such as OpenAI’s own ChatGPT are ultimately just software running on hardware. By owning the ‘full stack’ its business relies on, much as partner company Microsoft develops hardware for its own software, OpenAI can accelerate the development of its software products while minimizing reliance on other businesses.
It is unclear at this time whether the semiconductor fabrication (fab) business will be a subsidiary of OpenAI itself.
Why is Sam Altman building a network of ‘AI chip’ factories?
The move is inspired by an ongoing bottleneck in the AI R&D pipeline. Artificial intelligence takes a great deal of computing power, both in training the models and then in maintaining a service at scale. Predicting no end to the shortage of ‘AI-accelerator’ chips, Altman has taken it upon himself to create the specialized semiconductors that the world will soon run on.
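For a rough sense of that compute demand, a commonly cited rule of thumb puts the cost of training a transformer model at roughly 6 × parameters × training tokens floating-point operations. The sketch below is a back-of-envelope illustration only; the model size, token count, per-GPU throughput, utilization, and cluster size are assumed figures for illustration, not numbers from OpenAI or NVIDIA.

```python
# Back-of-envelope training-compute estimate (illustrative assumptions only).
# Rule of thumb: training FLOPs ~= 6 * parameters * training tokens.

params = 70e9          # assumed model size: 70 billion parameters
tokens = 1.4e12        # assumed training corpus: 1.4 trillion tokens
train_flops = 6 * params * tokens

gpu_flops = 1e15       # assumed effective per-GPU throughput (~1 PFLOP/s-class accelerator)
utilization = 0.4      # assumed real-world utilization of that peak
num_gpus = 1_000       # assumed cluster size

seconds = train_flops / (gpu_flops * utilization * num_gpus)
print(f"Estimated training FLOPs: {train_flops:.2e}")
print(f"Approx. training time on {num_gpus:,} GPUs: {seconds / 86_400:.1f} days")
```

Even under these generous assumptions the training run ties up a thousand accelerators for weeks, which is why the shortage of ‘AI-accelerator’ chips is such a constraint.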
Meanwhile, big tech firms are investing heavily in AI hardware this year — and it’s not even February.
At the World Economic Forum (WEF) this January 2024, Google announced plans to build a $1 billion AI data center. At the same time, Meta has placed what could be the world’s largest order for NVIDIA H100 AI accelerator GPUs: a staggering 350,000 units.
If Elon Musk’s prediction of roughly ‘1 billion humanoid robots on Earth by the 2040s’ proves correct, the new venture could be just as valuable as OpenAI itself.