AI may eventually consume a quarter of America's power by 2030, warns Arm CEO
An IEA Electricity 2024 report warns that a ChatGPT request costs nearly 10 times more power than a Google search
In an interview cited by The Wall Street Journal earlier this week, Rene Haas, CEO of Arm, warned of AI's "insatiable" thirst for electricity, stating that AI data centers could grow from roughly 4% of current U.S. power grid usage to as much as 25% by 2030.
Haas himself may have been citing an International Energy Agency report from January stating that ChatGPT consumes roughly 2.9 watt-hours of electricity per request, about 10 times as much as a standard Google search. By that math, if Google made the full hardware and software switch with its search engine, its search consumption would rise from its current ~1 TWh to at least 11 terawatt-hours of electricity per year.
The original report notes that 2.9 watt-hours is enough to run a 60-watt lightbulb for just under three minutes. Much like that roughly tenfold gap between a ChatGPT query and a standard search, industry-wide power demand for artificial intelligence is expected to increase tenfold.
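Those per-request figures are easy to sanity-check. A minimal sketch, assuming only the article's own numbers (2.9 Wh per request, a 60 W bulb, and Google search's ~1 TWh/year baseline scaled by the ~10x ratio):

```python
# Back-of-the-envelope checks on the per-request energy figures.
CHATGPT_WH_PER_REQUEST = 2.9   # IEA estimate, watt-hours
BULB_WATTS = 60                # standard 60 W incandescent bulb

# Minutes a 60 W bulb runs on 2.9 Wh of energy:
minutes = CHATGPT_WH_PER_REQUEST / BULB_WATTS * 60
print(f"{minutes:.1f} minutes")          # ~2.9 -- "just under three minutes"

# Scaling Google search's ~1 TWh/year consumption by the ~10x ratio:
google_search_twh = 1.0
print(f"{google_search_twh * 10:.0f}+ TWh/year")  # same order as the 11 TWh cited
```

The lightbulb comparison checks out exactly: 2.9 Wh at 60 W is 2.9 minutes.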
These statements were made ahead of an expected U.S.-Japan partnership on AI, and alongside recent developments like OpenAI's Sora, the current version of which Factorial Funds estimates requires roughly one hour of Nvidia H100 GPU time to generate five minutes of video. Grok 3 has also been estimated to require 100,000 Nvidia H100s just for training. A single 700-watt Nvidia H100 can consume roughly 3,740 kilowatt-hours per year.
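It's worth noting what that 3,740 kWh figure implies. A quick sketch, assuming the cited number reflects average real-world draw rather than the card running flat-out at its 700 W TDP all year (an inference, not a claim from the article):

```python
# What does "~3,740 kWh per year" imply for a 700 W H100?
TDP_WATTS = 700
HOURS_PER_YEAR = 24 * 365      # 8,760 hours

# Theoretical ceiling at full TDP, around the clock:
max_kwh = TDP_WATTS * HOURS_PER_YEAR / 1000
print(f"at full TDP: {max_kwh:.0f} kWh/year")     # 6,132 kWh

# Average draw implied by the cited annual figure:
cited_kwh = 3740
avg_watts = cited_kwh * 1000 / HOURS_PER_YEAR
print(f"implied average: {avg_watts:.0f} W ({avg_watts / TDP_WATTS:.0%} of TDP)")
```

In other words, the cited figure assumes the GPU averages roughly 61% of its rated power over a year; 100,000 such cards would draw on the order of 0.4 TWh annually.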
Without great improvements in efficiency and/or greatly increased government regulation, Haas says the current trend is "hardly very sustainable," and he might be correct.
The U.S. Energy Information Administration (EIA) reports that the United States generated a total of 4.24 trillion kilowatt-hours, or 4,240 terawatt-hours, of electricity in 2022, with only about 22% of that coming from renewables. Total consumption, by comparison, was 3.9 trillion kWh, or 3,900 terawatt-hours, of the available ~4,240.
That puts the 11 terawatt-hours the AI industry seems to be aiming for in the next decade against just 340 terawatt-hours of remaining headroom at current generation levels. Any sustainability calculation must also account for the likely growing demands of other industries and the balance of renewable to non-renewable resources. Given that the cost of power has nearly doubled since 1990 (per Statista), perhaps calls for more regulation are justified.
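The grid arithmetic above can be restated in a few lines. All figures are from the article (EIA 2022 data plus Haas's 25% upper bound); the percentage comparisons are straightforward division:

```python
# U.S. grid headroom vs. projected AI demand, per the article's figures.
generated_twh = 4240           # EIA, 2022 generation
consumed_twh = 3900            # EIA, 2022 consumption
headroom_twh = generated_twh - consumed_twh
print(f"headroom: {headroom_twh} TWh")            # 340 TWh of slack

ai_search_twh = 11             # hypothetical AI-powered Google search
print(f"AI search alone: {ai_search_twh / headroom_twh:.0%} of headroom")

projected_ai_share = 0.25      # Haas's 2030 upper bound for AI data centers
print(f"25% of generation: {generated_twh * projected_ai_share:.0f} TWh")
```

The punchline: a single AI-converted search engine eats about 3% of today's slack, while Haas's 25% scenario implies over 1,000 TWh, roughly triple the entire current headroom.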
Of course, outlets like The New York Times are also outright suing OpenAI and Microsoft, so it's not as if the current AI industry is without existing legal challenges. Rene Haas expressed hope that the international partnership between Japan and the U.S. may yet improve these dramatically high power estimates. However, corporate greed and compute demand are also international, so only time will tell.
Christopher Harper has been a successful freelance tech writer specializing in PC hardware and gaming since 2015, and ghostwrote for various B2B clients in high school before that. Outside of work, Christopher is best known to friends and rivals as an active competitive player in various eSports (particularly fighting games and arena shooters) and a purveyor of music ranging from Jimi Hendrix to Killer Mike to the Sonic Adventure 2 soundtrack.
ezst036: It's the quarter of jobs that AI will eventually consume that occupies my thoughts.
Not the power generation. When we start having brownouts, they'll build more nuke stations. People won't put up with the brownouts. That's simple.
It's the job losses. The job losses.
Eximo: Whatever will those poor typewriter mechanics do?
People will find new work if they have to. Smart companies will retrain personnel to expand their business.
My only concern would be not having the safety net of employee insurance in the US during any transitional period. With universal healthcare, one of the big reasons not to start a small business would go away.
And the eventual path that seems most logical is always basic living stipends. If you displace too much of the workforce, the economy will crash. So basically that means higher tax rates for companies that utilize AI to offset the lack of income taxes. As long as the businesses are still making more off of AI than they would off a human worker, they should be happy.
brandonjclark: "An IEA Electricity 2024 report warns that a ChatGPT request costs nearly 10 times more power than a Google search"
And ten times more useful than a trashy, ad-steered Google search.
sadsteve:
brandonjclark said: "And ten times more useful than a trashy, ad-steered Google search."
Kind of why I stopped using Google search years ago. The bias and ads were too annoying to put up with.
rkmcquillen: "Without great improvements in efficiency and/or greatly increased government regulation, Haas says the current trend is 'hardly very sustainable,' and he might be correct."
-- He is right. And there is a disclaimer in the article. But because of the disclaimer, I am not concerned about the trend.
We've got a new "NPU" chip race. If NPUs double in speed every year like the similar trend line for GPUs, a max potential 2^8 = 256x efficiency improvement over the next decade... then there is nothing to worry about. Power consumption could even decrease from what it is now.
They say Microsoft is currently losing money with the current AI model... that will self-correct over time... but it is also a reason the trend may not continue at the current rate of exponential growth.
Eximo:
rkmcquillen said: "We've got a new 'NPU' chip race. If NPUs double in speed every year like the similar trend line for GPUs... then there is nothing to worry about."
That doesn't quite equate to web searches, though. They would not be locally run; no client can handle that much data. AI model training will also still be done at the data center. Only fixed-function AI will be run on the local machine.
vanadiel007: I am sure AI will be able to help us develop free and safe sources of energy, so we mortals can enjoy our days at the beach while AI does all the work and creates all our goods.
It will be a bright future, right?
jp7189: I have a hard time believing ChatGPT uses 10x more power than a Google search. Maybe just for the actual search result, but then I usually open 3-5 tabs from the results. I feel those web servers, plus all the ad servers they call, likely add up to more power than a GPT result.
jp7189:
sadsteve said: "Kind of why I stopped using Google search years ago. The bias and ads were too annoying to put up with."
Well... with the ever-increasing filters on ChatGPT, I worry they are steering results too much.