It's electric, boogie-oogie-oogie

Taking a closer look at AI’s supposed energy apocalypse

AI is just one small part of data centers’ soaring energy use.

Kyle Orland
Someone just asked what it would look like if their girlfriend was a Smurf. Better add another rack of servers! Credit: Getty Images

Late last week, both Bloomberg and The Washington Post published stories focused on the ostensibly disastrous impact artificial intelligence is having on the power grid and on efforts to collectively reduce our use of fossil fuels. The high-profile pieces lean heavily on recent projections from Goldman Sachs and the International Energy Agency (IEA) to cast AI's "insatiable" demand for energy as an almost apocalyptic threat to our power infrastructure. The Post piece even cites anonymous "some [people]" in reporting that "some worry whether there will be enough electricity to meet [the power demands] from any source."

Digging into the best available numbers and projections, though, it's hard to see AI's current and near-future environmental impact in such a dire light. While generative AI models and tools can and will use a significant amount of energy, we shouldn't conflate AI energy usage with the larger and largely pre-existing energy usage of "data centers" as a whole. And just like any technology, whether that AI energy use is worthwhile depends largely on your wider opinion of the value of generative AI in the first place.

Not all data centers

While the headline focus of both Bloomberg and The Washington Post's recent pieces is on artificial intelligence, the actual numbers and projections cited in both pieces overwhelmingly focus on the energy used by Internet "data centers" as a whole. Long before generative AI became the current Silicon Valley buzzword, those data centers were already growing immensely in size and energy usage, powering everything from Amazon Web Services servers to online gaming services, Zoom video calls, and cloud storage and retrieval for billions of documents and photos, to name just a few of the more common uses.

The Post story acknowledges that these "nondescript warehouses packed with racks of servers that power the modern Internet have been around for decades." But in the very next sentence, the Post asserts that, today, data center energy use "is soaring because of AI." Bloomberg asks one source directly "why data centers were suddenly sucking up so much power" and gets back a blunt answer: "It’s AI... It’s 10 to 15 times the amount of electricity."

The massive growth in data center power usage mostly predates the current mania for generative AI (red 2022 line added by Ars). Credit: Bloomberg

Unfortunately for Bloomberg, that quote is followed almost immediately by a chart that heavily undercuts the AI alarmism. That chart shows worldwide data center energy usage growing at a remarkably steady pace from about 100 TWh in 2012 to around 350 TWh in 2024. The vast majority of that energy usage growth came before 2022, when the launch of tools like Dall-E and ChatGPT largely set off the industry's current mania for generative AI. If you squint at Bloomberg's graph, you can almost see the growth in energy usage slowing down a bit since that momentous year for generative AI.
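
As a rough sanity check on just how steady that growth has been, here's a quick compound-growth calculation; the 100 TWh and 350 TWh endpoints are simply read off Bloomberg's chart as described above, so treat the result as approximate.

```python
# Implied average annual growth rate of data center electricity use,
# using the approximate endpoints read off Bloomberg's chart.
start_twh, end_twh = 100, 350   # ~2012 and ~2024 values
years = 2024 - 2012

annual_growth = (end_twh / start_twh) ** (1 / years) - 1
print(f"Implied average growth: {annual_growth:.1%} per year")
# Prints roughly 11% per year, a pace sustained well before the 2022 generative AI boom.
```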

Determining precisely how much of that data center energy use is taken up specifically by generative AI is a difficult task, but Dutch researcher Alex de Vries found a clever way to get an estimate. In his study "The growing energy footprint of artificial intelligence," de Vries starts with estimates that Nvidia's specialized chips are responsible for about 95 percent of the market for generative AI calculations. He then uses Nvidia's projected production of 1.5 million AI servers in 2027—and the projected power usage for those servers—to estimate that the AI sector as a whole could be consuming anywhere from 85 to 134 TWh of electricity per year within just a few years.
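
For a sense of where that 85-to-134 TWh range comes from, here's a minimal back-of-envelope sketch. The per-server power draws of roughly 6.5 kW and 10.2 kW are my assumption (they're in line with the spec-sheet figures for Nvidia's DGX-class AI servers), as is near-continuous, full-power operation; they aren't numbers quoted from the study here.

```python
# Back-of-envelope version of de Vries' 2027 estimate.
# Assumptions (mine, not quoted from the study): each AI server draws
# roughly 6.5 kW to 10.2 kW, in line with Nvidia DGX-class spec sheets,
# and runs continuously all year.

SERVERS_2027 = 1_500_000        # Nvidia's projected AI server output (from the article)
HOURS_PER_YEAR = 24 * 365       # 8,760 hours

for kw_per_server in (6.5, 10.2):
    twh = SERVERS_2027 * kw_per_server * HOURS_PER_YEAR / 1e9   # kWh -> TWh
    print(f"{kw_per_server:>4} kW per server -> {twh:.0f} TWh per year")

# Prints roughly 85 and 134 TWh, matching the range cited above.
```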

To be sure, that is an immense amount of power, representing about 0.5 percent of projected electricity demand for the entire world (and an even greater share of the local energy mix in some common data center locations). But measured against other common worldwide uses of electricity, it's not a mind-boggling energy hog. A 2018 study estimated that PC gaming as a whole accounted for 75 TWh of electricity use per year, to pick just one common human activity on the same general energy scale (and that's without including console or mobile gamers).
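
As a quick check on that "about 0.5 percent" figure, assume worldwide electricity demand on the order of 27,000 TWh per year by 2027; that denominator is my assumption, roughly in line with recent IEA outlooks, not a number from the article's sources.

```python
# Rough share of projected worldwide electricity demand.
# Assumption (mine): global demand of ~27,000 TWh/year by 2027.
WORLD_DEMAND_TWH = 27_000

for ai_twh in (85, 134):
    print(f"{ai_twh} TWh is {ai_twh / WORLD_DEMAND_TWH:.2%} of worldwide demand")

# Prints roughly 0.31% and 0.50%, bracketing the share cited above.
```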

Worldwide projections for AI energy use in 2027 are on the same scale as the energy used by PC gamers.
Worldwide projections for AI energy use in 2027 are on the same scale as the energy used by PC gamers. Credit: Digital Storm

More to the point, de Vries' AI energy estimates are only a small fraction of the 620 to 1,050 TWh that data centers as a whole are projected to use by 2026, according to the IEA's recent report. The vast majority of all that data center power will still be going to more mundane Internet infrastructure that we all take for granted (and which isn't nearly as sexy a headline bogeyman as "AI").

It's true, though, that generative AI is projected to grow much faster than many other portions of the "data center" economy. De Vries' estimates suggest that generative AI power usage will grow to between 15 and 23 times its 2023 level by 2027. Projecting that same growth rate even a few more years into the future gets you to some truly eye-watering numbers, such as Arm CEO Rene Haas' recent estimate that AI could be responsible for "20 to 25 percent of US power requirements" by 2030 (that would be about 1,000 TWh a year, based on current US energy usage).
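
The "about 1,000 TWh a year" translation of Haas' estimate is easy to reproduce, assuming current US electricity consumption of roughly 4,000 TWh per year (my round-number assumption; the exact figure varies a bit by year and source).

```python
# Translating Haas' "20 to 25 percent of US power requirements" into TWh.
# Assumption (mine): current US electricity consumption of ~4,000 TWh/year.
US_DEMAND_TWH = 4_000

for share in (0.20, 0.25):
    print(f"{share:.0%} of US demand ~ {share * US_DEMAND_TWH:,.0f} TWh per year")

# Prints 800 and 1,000 TWh, in line with the figure above.
```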

It's easy to look at the current Silicon Valley mania for anything with "AI" in its pitch deck and assume that such exponential growth is inevitable well into the 2030s. But a lot can change in six years, especially when it comes to technology that is still evolving and struggling to prove itself. To maintain the blistering growth rates inherent in these projections, AI will need to start showing some equally stunning economic results before too long.

What’s it worth to you?

When it comes to measuring energy use, it's important to measure not just how much energy is being used but what you are getting in exchange for that energy. As the world struggles to transition away from fossil fuels, even a "reasonable" amount of electricity dedicated to generative AI might not be worth the potential climate effects of that energy use.

Appliances like refrigerators and air conditioners, for instance, are immense power hogs that take up an estimated 17 percent and 20 percent of worldwide electricity demand, respectively. But keeping food fresh and humans comfortable in increasingly dangerous summer heat are widely seen as necessary and good uses for that energy, so plugging in a refrigerator or window unit is not that controversial (even as government and environmental groups push for increased efficiency). Even gaming PCs are typically seen as valid, if relatively energy-intensive, sources of entertainment in the Western world.

That's not the case with another recent major energy hog: cryptocurrency. The IEA estimates crypto mining ate up 110 TWh of electricity in 2022, which is right in line with de Vries' projection for AI's energy use in 2027.

When it comes to burning computational energy to no real purpose, AI's got nothing on crypto mining. Credit: Getty Images

As someone who's skeptical that cryptocurrency has much value beyond speculative asset gambling, it's easy for me to see its usage as a significant and complete waste of our limited energy resources. And for those opposed to generative AI on principled or functional grounds, putting similar energy into millions of Nvidia AI servers probably seems like just as big of an energy waste.

The important thing to remember, though, is that there are economic limits involved in the total energy use for this kind of technology. With bitcoin mining, for instance, the total energy usage has jumped up and down over time, tracking pretty closely with the price of bitcoin. When bitcoin is less valuable, miners are less willing to spend a lot of electricity chasing lower potential profits. That's why alarmist headlines about bitcoin using all the world's energy by 2020 never came to pass (to say nothing of efficiency gains in mining hardware).

A similar trend will likely guide the use of generative AI as a whole, with the energy invested in AI servers tracking the economic utility society as a whole sees from the technology.

Justify your existence

Right now, it seems like every venture capital firm in the world is throwing all the money it can at anything with even a hint of an "AI" use case, leading to surging demand for all those energy-hogging AI servers. In the medium to long term, though, AI systems will have to lead to significant revenue and/or productivity gains to justify the investment of continued resources in manpower, servers, and, yes, electricity.

Will AI be able to justify its economic existence on that time scale? That's an extremely open question. Right now, for instance, the IEA estimates that a ChatGPT query takes about 10 times as much energy as a standard Google search. Is an AI-powered answer from Google worth 10 times as much as a normal web search? It's easy to look at early snafus like Google recommending glue on pizza and answer with a clear "No." If an AI system hallucinates incorrect or dangerous information, no amount of energy or cost is low enough to justify the results.
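
For context on that 10x figure, the per-query numbers usually cited (by the IEA and by de Vries, among others) are roughly 0.3 Wh for a conventional Google search and 2.9 Wh for a ChatGPT query. Those specific values are my addition here, and both should be treated as rough estimates rather than measured figures.

```python
# Per-query comparison behind the "about 10 times" figure.
# Assumptions (mine, commonly cited estimates): ~0.3 Wh per standard
# Google search and ~2.9 Wh per ChatGPT query.
GOOGLE_WH = 0.3
CHATGPT_WH = 2.9

print(f"Ratio: about {CHATGPT_WH / GOOGLE_WH:.0f}x energy per query")
print(f"1 billion queries: {1e9 * CHATGPT_WH / 1e9:.1f} GWh for AI "
      f"vs {1e9 * GOOGLE_WH / 1e9:.1f} GWh for search")
# Prints a ratio of about 10x, and 2.9 GWh vs 0.3 GWh at the billion-query scale.
```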

Or take AI image generation, where one estimate suggests creating 1,000 generative AI images takes as much energy as driving about four miles in a car. If those images are as good or better than the ones you'd get more slowly from a human artist, the economic case for that energy usage is obvious (setting the moral case for human artists aside for now). But if the results have too many fingers or accidental nudity or look like horrifying nightmare fuel, then that energy is just wasted.
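
That driving comparison works via carbon rather than raw energy. Assuming the roughly 400 grams of CO2 per mile that the EPA uses for an average gasoline car (my assumption, not a figure from the estimate being cited), the four-mile equivalence implies something like 1.6 kg of CO2 per 1,000 images.

```python
# Unpacking the "1,000 images ~ four miles of driving" comparison.
# Assumption (mine): ~400 g CO2 per mile for an average gasoline car,
# roughly the EPA's figure for passenger vehicles.
G_CO2_PER_MILE = 400
MILES = 4

kg_co2_per_1000_images = MILES * G_CO2_PER_MILE / 1000
print(f"Implied emissions: ~{kg_co2_per_1000_images:.1f} kg CO2 per 1,000 images")
# Prints ~1.6 kg, i.e., a bit over 1.5 grams of CO2 per generated image.
```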

If you cherry-pick the worst examples of AI screwups, it's easy to see the entire sector as a misguided use of limited money and energy resources. But if you're a programmer who's getting twice as much done with AI coding tools or a customer service manager seeing productivity gains from employees who can consult with AI, that may seem like electricity well spent.

Computation that takes a lot of electricity now may not be so energy-intensive in the future. Credit: Roger Ngo

Then there is the very real potential for efficiency gains across the AI sector. Moore's Law may not be quite what it used to be, but chips can still do way more with one watt of energy than they could just a few years ago. Already, we're seeing companies like Apple running many AI requests directly on relatively low-powered iPhones rather than shipping them off to energy-hogging data centers. Improvements in AI algorithms also have the potential to limit just how much energy is needed for many common generative AI tasks, which could lower overall costs in money and energy significantly.

How much AI is too much AI?

You can argue that tech companies will force AI into every facet of our lives whether we like it or not. It can be easy to assume that right now, when everyone is releasing low- or no-cost AI systems to the public in the hopes of becoming a future market leader. During the current novelty phase for generative AI, no one wants to miss out on what everyone seems to be saying could be the next big thing.

In the end, though, these companies are in the business of making a profit. If customers don't respond to the hype by actually spending significant money on generative AI at some point, the tech-marketing machine will largely move on, as it did very recently with the metaverse and NFTs. And consumer AI spending will have to be quite significant indeed to justify current investment—OpenAI projected losses of $1 billion in 2023, even as annualized revenues hit $1.6 billion by the end of the year.

Remember when people thought the metaverse would become a significant chunk of our economy? It wasn't that long ago! Credit: Getty Images

If you're an AI hater, it's frustrating to see what you consider a useless technology growing to take up more and more of our energy mix, eating into climate gains being made from the immense growth of renewable energy. If you're an AI maximalist, on the other hand, the significant energy use projected for AI is a small price to pay for a technology that you think will revolutionize our lives much more than technologies like air conditioning, refrigeration, or the automobile ever did.

The answer probably lies somewhere in the middle. In the long run, AI's energy use will likely level off at a significant but not grid-melting level that's roughly commensurate with the collective economic value we as a society get from it. Whether the trade-offs inherent in that shift are "worth it" involves a lot of value judgements that go well beyond how much electricity a bunch of servers are using.


Kyle Orland Senior Gaming Editor
Kyle Orland has been the Senior Gaming Editor at Ars Technica since 2012, writing primarily about the business, tech, and culture behind video games. He has journalism and computer science degrees from the University of Maryland. He once wrote a whole book about Minesweeper.