As AI continues its takeover of technology, it’s only going to become more prominent in our everyday lives. But having AI become such a massive part of the tech sector carries consequences many aren’t thinking about: AI could account for up to 25% of US power usage by 2030.
Rene Haas, CEO of Arm, told The Wall Street Journal that if AI stays on its current track, the roughly 4% of US power usage it consumes today could grow to 25% by 2030. The IEA’s Electricity 2024 report backs this up, projecting that power usage from the data center sector will rise dramatically as AI is rolled out by companies like Google, Meta, and Amazon.
AI power usage is only going up, and overall energy consumption may soar with it
A large contributing factor is Nvidia, whose high-performance AI GPUs are being installed all over the world. Nvidia currently dominates the AI server market, and its units are estimated to consume around 7.3 TWh annually. With new Nvidia GPUs continually coming online, that figure is only expected to climb. A single ChatGPT query reportedly uses ten times the power of a standard, non-AI Google search.
As AI becomes standard in every corner of technology, power usage will rise dramatically. With more firms deploying Nvidia’s AI GPUs and more people using ChatGPT in everyday life, electricity demand could spiral out of control.
Luckily, there is hope. Many tech companies and governments are aware of the power problem AI presents and are looking for solutions. Some require years of research, like installing nuclear Small Modular Reactors (SMRs) at AI data centers; others are more drastic, like placing regulatory restrictions on AI itself. Only time will tell which direction this goes, but hopefully we’ll end up with the best of both worlds: powerful AI and modest energy usage.