Managing The Huge Power Demands Of AI Everywhere

More efficient hardware, better planning, and better utilization of available power can help significantly.

Before generative AI burst onto the scene, few predicted how much energy AI systems would need. Those numbers are just starting to come into focus, and so is the urgency around how to sustain it all.

AI power demand is expected to surge 550% by 2026, from 8 TWh in 2024 to 52 TWh, and then rise another 1,150% to 652 TWh by 2030. Commensurately, U.S. power grid planners have nearly doubled the estimated U.S. load growth forecast, from 2.6% to 4.7%, an increase of nearly 38 gigawatts through 2028. That is roughly the equivalent of adding two more states the size of New York to the U.S. power grid in five years.
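As a quick sanity check, the cited percentages follow directly from the TWh figures in the article. A minimal sketch, computing growth as (new − old) / old:

```python
def pct_growth(old_twh: float, new_twh: float) -> float:
    """Percentage growth from old_twh to new_twh."""
    return (new_twh - old_twh) / old_twh * 100

# AI power demand figures cited above, in TWh.
demand_2024, demand_2026, demand_2030 = 8, 52, 652

print(f"2024 -> 2026: {pct_growth(demand_2024, demand_2026):.0f}%")  # 550%
print(f"2026 -> 2030: {pct_growth(demand_2026, demand_2030):.0f}%")  # 1154%, i.e. ~1,150%
```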

Microsoft and Google, meanwhile, each report electricity consumption that surpasses the usage of more than 100 individual countries, and Google’s latest report shows a 50% rise in greenhouse gas emissions from 2019 to 2023, driven in part by data centers.

This has put the entire tech sector on a worrisome trajectory. Until recently, the chip industry was doing well: growth in the power consumed for computation was largely offset by efficiency gains. Before AI, there was no push for nearly as much compute power as is needed today, and many in the industry say they were caught by surprise. This may be why there is so much research into alternatives to traditional power sources, including nuclear power plants, which are now being planned, built, or recommissioned.

“AI models will continue to become larger and smarter, fueling the need for more compute, which increases demand for power as part of a virtuous cycle,” said Dermot O’Driscoll, vice president of product solutions in Arm’s Infrastructure Line of Business. “Finding ways to reduce the power requirements for these large data centers is paramount to achieving the societal breakthroughs and realizing the AI promise. Today’s data centers already consume lots of power. Globally, 460 terawatt-hours (TWh) of electricity are needed annually, which is equivalent to the consumption of the entire country of Germany.”

To fully harness the potential of AI, the industry must rethink compute architectures and designs, O’Driscoll said. But while many of the largest AI hyperscalers are using Arm cores to reduce power, that’s only part of the solution. AI searches need to deliver more reliable and targeted information for each query, and AI models themselves need to become more efficient.

“AI applications are driving unprecedented power demand,” said William Ruby, senior director of product management for power analysis products at Synopsys. “The International Energy Agency, in its 2024 report, indicated that a ChatGPT request consumes 10X the power of a traditional Google search. We are seeing this play out for semiconductor ICs. Power consumption of SoCs for high-performance computing applications is now in the hundreds of watts, and in some cases exceeds a kilowatt.”
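The 10X ratio can be made concrete with a back-of-the-envelope calculation. The per-query and query-volume figures below are illustrative assumptions chosen to match the cited ratio, not measured values:

```python
# Illustrative per-query energy figures (assumptions, not measurements).
SEARCH_WH = 0.3               # rough estimate for a traditional web search, in Wh
AI_QUERY_WH = SEARCH_WH * 10  # the ~10X ratio cited by the IEA

QUERIES_PER_DAY = 1e9         # hypothetical daily query volume

# Annual energy at that volume, converted from Wh to GWh (1 GWh = 1e9 Wh).
annual_gwh = AI_QUERY_WH * QUERIES_PER_DAY * 365 / 1e9
print(f"~{annual_gwh:,.0f} GWh/year")  # ~1,095 GWh/year at 1B AI queries/day
```

Even at these modest assumptions, a tenfold per-query increase turns routine search traffic into a grid-scale load, which is the dynamic Ruby describes at the SoC level.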

The rollout and rapid adoption of AI was as much of a surprise to the tech world as it was to the power utilities. Until a couple of years ago, most people assumed AI was plodding along at the same pace it had been for decades.

“You could argue the internet back in the mid-to-late ’90s was a big life changing thing — one of those once-in-a-generation type technologies,” said Steven Woo, distinguished inventor and fellow at Rambus. “Smart phones are another one. But with AI the ramp is faster, and the potential is like the internet — and in some ways maybe even greater. With so many people experimenting, and with the user base being able to do more sophisticated things that need more power, the semiconductor industry is being asked to try and become more power-efficient. In a lot of ways these architectures are becoming more power efficient. It’s just that you’re still getting dwarfed by the increase in the amount of compute you want to do for more advanced AI. It’s one of those things where you just can’t keep up with the demand. You are making things more power-efficient, but it’s just not enough, so now we must find ways to get more power. The models are getting bigger. The calculations are more complex. The hardware is getting more sophisticated. So the key things that happen are that we’re getting more sophisticated as the model is getting bigger, more accurate, and all that. But a lot of it now is coming down to how we power all this stuff, and then how we cool it. Those are the big questions.”

AI and sustainability
Where will all the power come from? Do the engineering teams that are writing the training algorithms need to start being more power-aware?

“Sustainability is something that we have been addressing in the semiconductor industry for 20 years,” said Rich Goldman, director at Ansys. “There’s been awareness that we need low-power designs, and software to enable low-power designs. Today, it comes down to an issue of engineering ethics and morality. Do our customers care about it when they buy a chip or when they buy a training model? I don’t think they make their decisions based on that.”

What also comes into play is how engineers are rewarded, evaluated, and assessed. “Commitment to sustainability is typically not included in what they must put into the product, so they aren’t motivated, except by their own internal ethics and the company’s ethics towards that. It’s the age-old ethics versus dollars in business, and in general we know who wins that. It’s a huge issue. Maybe we should be teaching ethics in engineering school, because they’re not going to stop making big, powerful LLMs and training them in these huge data centers,” Goldman noted.

Still, it’s going to take huge numbers of processors to run AI models.

About the author : Matthew Lemma
