Arm CEO thinks AI will account for 25% of all power use in the US by 2030
As AI continues its takeover of technology, it’s only going to become more prominent in our everyday lives. But having AI become such a massive part of the tech sector has consequences many aren’t thinking about. AI could account for up to 25% of US power usage by 2030, demanding massive amounts of electricity.
Rene Haas, CEO of Arm, told The Wall Street Journal that if AI stays on its current track, its share of US power usage could grow from roughly 4% today to as much as 25% by 2030. This is backed up by the IEA’s Electricity 2024 report, which projects that power usage from the data center sector will rise dramatically as AI is adopted by companies like Google, Meta, and Amazon.
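To put that projection in perspective, here is a rough, purely illustrative sketch of the growth it implies. The 4% and 25% figures come from Haas’s comments above; the six-year horizon and the assumption of steady year-over-year compounding are ours, not his.

```python
# Back-of-envelope: implied annual growth if AI's share of US power
# rises from 4% today to 25% by 2030 (figures as cited above).
# The 6-year horizon and steady compounding are illustrative assumptions.

start_share = 0.04   # AI's share of US electricity use today (per Haas)
end_share = 0.25     # projected share by 2030
years = 6            # assumed 2024 -> 2030 horizon

growth_multiple = end_share / start_share
annual_growth = growth_multiple ** (1 / years) - 1

print(f"Share must grow {growth_multiple:.1f}x overall")
print(f"That implies roughly {annual_growth:.0%} growth in share per year")
# -> Share must grow 6.2x overall
# -> That implies roughly 36% growth in share per year
```

In other words, AI’s slice of the grid would need to grow by about a third every single year for the rest of the decade to hit that mark.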
AI power usage is only going up, and it could cause US electricity demand to soar
A large contributing factor is Nvidia, whose high-performance AI GPUs are set to be installed all over the world. Nvidia currently dominates the AI server market, where its units are estimated to consume an average of 7.3 TWh annually. With new Nvidia GPUs continually being installed, that number is only expected to climb sharply. A ChatGPT query alone reportedly uses ten times the power of a standard, non-AI Google search.
As AI becomes standard in every corner of technology, power usage will rise dramatically. With more firms deploying Nvidia’s AI GPUs and more people using ChatGPT in everyday life, electricity usage could spiral out of control.
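For a sense of scale, here is a quick sketch of what that “ten times a Google search” claim could add up to. The per-search baseline of 0.3 Wh is a commonly cited estimate and the daily query volume is a hypothetical number we picked for illustration; neither comes from the article’s sources.

```python
# Rough sketch of what "10x a Google search" could mean in aggregate.
# The 0.3 Wh per-search baseline and the query volume are illustrative
# assumptions, not figures from the article.

wh_per_google_search = 0.3                        # assumed baseline estimate
wh_per_chatgpt_query = wh_per_google_search * 10  # the reported 10x multiplier

queries_per_day = 200_000_000                     # hypothetical daily query volume
wh_per_year = wh_per_chatgpt_query * queries_per_day * 365

twh_per_year = wh_per_year / 1e12                 # 1 TWh = 1e12 Wh
print(f"~{twh_per_year:.2f} TWh per year at these assumptions")
# -> ~0.22 TWh per year
```

Even at those modest assumptions, a single chatbot racks up a measurable fraction of a terawatt-hour each year, and that is before counting model training or the rest of the AI server fleet.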
Luckily, there is hope. Many tech companies and governments are aware of the power problem AI presents and are looking for solutions. Some require years of research, such as installing nuclear-powered Small Modular Reactors at AI data centers; others are more drastic, such as placing outright restrictions on AI. Only time will tell which direction this goes. Hopefully, we’ll get the best of both worlds: powerful AI and modest energy usage.