What Clean Energy Supporters Need to Know About AI

image credit: BalticServers.com
Jay Stein, Senior Fellow Emeritus, E Source

Jay Stein, a Senior Fellow Emeritus affiliated with E Source, is one of America's leading energy technologists. Over the course of his over 40-year career he has played numerous roles, including...

May 3, 2024

Like many groundbreaking technologies, artificial intelligence offers exhilarating opportunities and exasperating challenges. On the plus side of the ledger, AI is being used to treat cancer patients, model the impacts of climate mitigation strategies, and identify landmines for disarmament. As for challenges, not only is AI filling up our public square with low-quality information (or misinformation, as the case may be), but it also has a voracious appetite for energy. The potential consequences of AI’s energy consumption include increased carbon dioxide emissions, massive water depletion, and additional stress to the power grid.

Exactly how much electric energy AI consumes is currently unknown. AI developers aren’t eager to share such information, but early estimates are concerning. According to one researcher, if AI were a country, by the year 2027 it would rank as roughly the world’s 35th largest national electricity consumer. In the US, AI’s rapidly growing energy consumption is coinciding with increased electric load from other sources, including manufacturing and electrification. US utilities are already bracing themselves for nearly twice the load growth they were planning on just a few years ago.

Parts of the US government and the AI industry are well aware of the energy and environmental problems imposed by AI’s growth, and are taking action. In the Senate, a recently introduced bill would enable federal agencies to analyze AI-related environmental problems, and propose solutions. The AI industry is improving the energy efficiency of its hardware, cooling systems, algorithms, and virtually every other aspect of its operations.

In the end, AI energy consumption will probably follow the same trajectory as overall internet-related energy consumption did. Around the turn of the century, analysts predicted that internet energy consumption would grow to levels that would overwhelm the power grid. Instead, data center operators and hardware manufacturers improved efficiency, so that even though internet data usage grew immensely, data centers consume only about one percent of global electricity.

Even though AI’s energy consumption isn’t likely to hit catastrophic levels, that doesn’t mean it’s not a problem. Instead, it’s a problem that will likely remain manageable with assiduous efforts by industry, government, utilities, and environmental advocates. Indeed, much of that work is already underway.

An AI primer for energy enthusiasts

AI is a technology that enables computers to simulate human intelligence and perform tasks that, until recently, required human cognition. It works by taking in huge amounts of data, and looking for patterns in order to make inferences and predictions. Early versions of AI were used for spell checking, filtering out spam email messages, and offering product recommendations.

A major turning point for AI came with the release of ChatGPT in 2022, which was capable of engaging in human-like conversations and generating content. Its users interact with it by typing in prompts that describe their desired responses. Based on those prompts, ChatGPT can write poetry, compose music, generate computer code, and write essays. Since then, more advanced AIs have emerged that can generate images and video.

AI is essentially “manufactured” by equipment that can be sorted into three levels: chips, servers, and data centers. At the most basic level, AI’s data and algorithms are processed in specialized computer chips that perform trillions of operations per second. Those chips are incorporated into servers, essentially industrial-strength computers that are the workhorses of the internet. Servers combine data processing and storage, deliver the right voltage to those components, and network them with other servers and storage devices. Servers are housed in data centers, which contain backup power and electric infrastructure to provide virtually outage-proof electricity, cooling equipment to ensure that electronic equipment doesn’t overheat, and cabling to physically connect servers to the internet.

AI consumes a lot of energy

We can’t say with certainty how much energy AI consumes or how fast that consumption is growing. We do know that, according to the International Energy Agency, the combined electricity consumption of the major companies developing AI (Amazon, Microsoft, Google, and Meta) more than doubled between 2017 and 2021. It’s unknown how much of that energy went to AI.

Currently, the best estimate we have of global AI energy consumption comes to us from Alex de Vries, a PhD candidate at the VU Amsterdam School of Business and Economics, and the founder of Digiconomist, a research company dedicated to exposing the unintended consequences of digital trends. De Vries observed that about 95% of the servers used in AI applications come from a single company, Nvidia. This company, based in Santa Clara, CA, is best known for designing and supplying graphics processing unit chips, as well as servers incorporating them. Although GPUs were originally designed to process images for personal computers, cell phones, and game consoles, their ability to do parallel calculations made them the chip of choice for AI systems. On the strength of its GPUs, in just 30 years, Nvidia went from founding to being the world’s third-most valuable company, with a market capitalization over $2 trillion.

Nvidia publishes the energy specifications for its AI servers as well as sales projections. De Vries took this information and calculated that by the year 2027, the AI sector could consume anywhere from 85 to 134 terawatt-hours per year. By coincidence, that’s about the same amount of electricity de Vries’ home country, the Netherlands, consumes. That estimate assumes the Nvidia servers run at 100% capacity all the time, and servers running at partial capacity consume less electricity than fully loaded ones. On the other hand, the estimate also excludes several other avenues of AI energy consumption, including cooling, memory, and network equipment. That additional consumption probably more than offsets any error introduced by ignoring capacity.
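To see roughly how such a projection works, here’s a minimal back-of-envelope sketch in Python. The server count and per-server power draw below are illustrative assumptions of mine, not de Vries’ actual inputs, which came from Nvidia’s specifications and sales projections.

    # Back-of-envelope projection of AI electricity use, in the spirit of
    # de Vries' approach. Both inputs are illustrative assumptions.
    servers_2027 = 1_500_000    # hypothetical installed base of AI servers
    watts_per_server = 6_500    # hypothetical full-load draw per server, W
    hours_per_year = 8_760      # assumes servers run at full load nonstop

    energy_twh = servers_2027 * watts_per_server * hours_per_year / 1e12
    print(f"Projected AI electricity use: {energy_twh:.0f} TWh/year")
    # -> about 85 TWh/year, the low end of de Vries' 85-134 TWh range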

In the US, AI’s rising consumption is coinciding with several other growing electrical loads, including data center expansion driven by growing internet usage, new industrial facilities incentivized by the Inflation Reduction Act, and building and transportation electrification. Just a few years ago, US utilities planned for 2.6% load growth over the next five years. Now, they’re forecasting 4.7% growth over the same period. They’re struggling to plan and build to meet that near doubling of expected new load without ramping up carbon dioxide emissions. Already, several utilities in the Southeast are looking to build massive amounts of new natural gas generating capacity, and unmet demand for more capacity in the nation’s transmission system is reaching crisis proportions. To be clear, AI isn’t the sole cause of these problems, but it’s certainly contributing to them.

First comes training, then AI can do its work

AI consumes energy in two phases: training and inquiry. Training comes first, during which massive amounts of data are fed into the model. That data often comes from the internet, newspapers, books, and pretty much any other source developers can get their hands on. The AI model then analyzes training data, and identifies trends. GPT-3, the large language model underlying ChatGPT, was trained on 45 terabytes of text data. According to McKinsey, the multinational management consulting company, “that’s about one million feet of bookshelf space, or a quarter of the entire Library of Congress.”

Once a model is trained, it can then be used for inquiries (also known as inferences). That’s the phase most of us are familiar with. It’s when we submit a prompt to ChatGPT, like “draw a picture of cat astronauts piloting the lunar lander,” and get a response. While it’s widely believed that training consumes more energy than inquiry, that’s not necessarily true. The training process does consume far more energy than any single inquiry, but if a model gets enough inquiries, the entire inquiry phase can consume more.

Returning to GPT-3, training it consumed nearly 1300 megawatt-hours, which is enough energy to power 120 US households for a year. Consider that ChatGPT processes about 10 million inquiries per day, and it’s just one of hundreds of applications based on GPT. According to Alex de Vries, ChatGPT could consume as much as 564 megawatt-hours per day. At that rate, GPT-3’s inquiry energy consumption would exceed its training consumption in just a few days.
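A quick calculation with the two figures just cited shows how fast inquiry energy overtakes training energy:

    # Break-even between one-time training energy and ongoing inquiry
    # energy, using the figures cited above.
    training_mwh = 1300          # GPT-3 training energy, MWh (one-time)
    inquiry_mwh_per_day = 564    # de Vries' upper estimate for ChatGPT

    breakeven_days = training_mwh / inquiry_mwh_per_day
    print(f"Inquiries match training energy in {breakeven_days:.1f} days")
    # -> about 2.3 days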

How much energy does an individual inquiry consume?

AI inquiry energy consumption is not well understood, and developers release little information about it. Much more research has been done on training energy. Also, inquiry energy consumption depends on a wide variety of factors, including type of task, length of response, and whether text or images are being processed. Lastly, there are no standards for defining and measuring inquiry energy consumption, so users have no way to compare findings released by different researchers.

The only research study I’m aware of so far that measured the energy consumption of specific AI tasks was done by a team of researchers, most of whom are affiliated with Hugging Face, a company that enables AI experts and enthusiasts to develop, train, and deploy open source models. This research team defined a set of tasks, ran those tasks on a variety of AI models, and measured the energy consumed. By doing so, they came up with some valuable insights:

Generating images consumes more energy than generating text. The Hugging Face researchers found that generating text, on average, consumed enough electric energy to keep a 9W LED bulb lit for 19 seconds. They found that generating an image, on average, consumed enough energy to light that same bulb for 19 minutes, or 60 times as much as text. Presumably, generating video would consume much more energy.

Multipurpose models consume more energy than specific task models. The more things a model is capable of doing, the more energy it consumes. The best known multipurpose model is probably ChatGPT, whose extensive capabilities were described above. A good example of a single task model is Deepbeat, an AI rap lyrics generator. Users key in a few opening lyrics, and the model generates the rest.
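To translate the bulb analogy into more familiar units, here’s a short conversion sketch. The 9-watt bulb and the 19-second and 19-minute figures come straight from the findings above; the rest is arithmetic.

    # Convert the Hugging Face bulb analogy into energy per response.
    bulb_watts = 9
    text_seconds = 19           # bulb-on time equivalent to one text response
    image_seconds = 19 * 60     # bulb-on time equivalent to one image

    text_wh = bulb_watts * text_seconds / 3_600     # watt-seconds -> Wh
    image_wh = bulb_watts * image_seconds / 3_600

    print(f"Text:  {text_wh:.3f} Wh per response")   # ~0.048 Wh
    print(f"Image: {image_wh:.2f} Wh per response")  # ~2.85 Wh
    print(f"Ratio: {image_wh / text_wh:.0f}x")       # 60x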

While the work done by the Hugging Face researchers is an important step forward, it’s going to take a lot more research, and standards development, to enable AI users to have an accurate account of how much energy they’re consuming.

What the US federal government is doing

The AI Environmental Impacts Act of 2024 (S.3732) was recently introduced in the US Senate by Ed Markey of Massachusetts. It would require the US EPA and other federal agencies to study the environmental consequences associated with AI, including energy and water consumption, and to make recommendations for additional legislation. It would also require the National Institute of Standards and Technology to set standards for quantifying AI’s environmental impacts and to establish a voluntary reporting system for AI developers and operators. The data collected by the reporting system would then form the basis for a report to Congress, as well as recommendations for remedial legislation. If this bill passes, it would provide an important foundation for future efforts to mitigate AI’s contributions to climate change. Although passage seems unlikely in the current Congress, that may change after the election.

What the AI and data center industries are doing

Because the AI industry is so entwined with the data center industry, and energy is the single largest cost of running a data center, both industries have been attentive to efficiency and carbon emissions largely since their inception. Here are some of the actions they’re taking.

Developing more efficient hardware

Nvidia, the company that supplies most of the chips for AI models, has focused on energy efficiency since its founding. Indeed, it was in large part because Nvidia’s GPUs were so efficient at parallel processing that AI became feasible for widespread use. In March, Nvidia announced that its next generation of GPU chips would be faster and more energy efficient than its current ones. The company expects that its new chips will cut the energy consumption of training a large AI model by nearly 75%.

Several chip manufacturers are also developing new energy-efficient AI products. IBM and other chip researchers are focusing on overcoming the most energy-intensive process in AI chips: moving information between a chip’s computing and memory sections. To reduce that energy consumption, they’re finding ways to integrate computing and memory, so it’s not necessary to move information between them. IBM claims its new chips will be about five times as energy efficient as Nvidia’s current GPUs.

Implementing more efficient cooling systems

Virtually all the energy used to power AI chips, and other related hardware, ultimately turns into heat. Most data centers blow air over the chips and other electronic components to remove this heat, and have air conditioning systems to cool the air down and reject heat outdoors. Cooling systems account for about 40% of data center energy consumption. Also, because these systems frequently use evaporation to help reject heat, they consume a lot of water.
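One common metric for this overhead is power usage effectiveness (PUE): total facility energy divided by the energy consumed by the IT equipment alone. Here’s a rough sketch of the PUE implied by the 40% cooling figure, under my simplifying assumption that everything other than cooling is IT load; real facilities also lose energy in power distribution and lighting.

    # Rough PUE implied by "cooling is ~40% of data center consumption."
    # Simplifying assumption: everything that isn't cooling is IT load.
    cooling_share = 0.40
    it_share = 1.0 - cooling_share

    pue = 1.0 / it_share               # PUE = total energy / IT energy
    print(f"Implied PUE: {pue:.2f}")   # about 1.67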

One reason air-based cooling systems consume so much energy is that they use a lot of noisy fans. Data center operators can keep their chips cooler, with less energy and noise, by using liquid-based cooling. In one kind of liquid cooling system, the servers are completely submerged in a special liquid that doesn’t conduct electricity. The servers heat the liquid, which is then pumped to equipment that cools it and transfers the heat outdoors. Because it’s more efficient to move heat with liquids than with air, and cooler chips consume less energy than hotter ones, implementing liquid cooling can reduce overall data center energy consumption by about 10%.

Recovering data center waste heat

Instead of simply dissipating the heat removed by cooling systems into the atmosphere, some data centers circulate that heat to nearby buildings and industrial facilities. The Air, a data center located in Helsinki, uses heat absorbed from its electronic equipment to heat water to nearly 90℉, and heat pumps boost that temperature to nearly 200℉. That high-temperature heat is then channeled into Helsinki’s district heating and cooling system, which supplies both homes and commercial buildings. The Air provides 1.3 MW of thermal power to the system, which, over the course of a year, is enough to meet the heating requirements of over a thousand average US households.
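A sanity check on that household figure, assuming (my assumption, not the article’s) that a typical US household needs on the order of 10 MWh of heat per year:

    # Annual heat delivered by a 1.3 MW recovery system, assuming it runs
    # year-round at full output, versus an assumed household heat demand.
    thermal_mw = 1.3
    hours_per_year = 8_760
    household_heat_mwh = 10     # assumed annual heat demand per household

    annual_heat_mwh = thermal_mw * hours_per_year     # ~11,400 MWh
    households = annual_heat_mwh / household_heat_mwh
    print(f"Roughly {households:,.0f} households")    # a bit over 1,100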

Using AI to reduce data center energy consumption

Researchers working at Google’s DeepMind laboratory trained a machine learning model using data from thousands of sensors in Google’s data centers. The training data included temperatures, power meter readings, pump speeds, and setpoints. The researchers then used the model to predict future operating conditions and reduce cooling system energy consumption. By adjusting setpoints to anticipate actual server processing loads, the Google researchers found that they could reduce cooling system energy consumption by 40%. Meta and Microsoft have also announced that they’re using AI to improve data center energy efficiency, but they’ve provided little detail about their methods or results.
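Google hasn’t published its model, but the general recipe, learning a mapping from sensor readings and setpoints to cooling energy and then searching for setpoints that minimize predicted energy, can be sketched briefly. Everything below (the synthetic data, the variables, the model choice) is illustrative on my part, not DeepMind’s actual implementation.

    # Illustrative sketch of ML-driven cooling optimization, not
    # DeepMind's actual system: fit a model that predicts cooling power
    # from conditions and setpoints, then pick the candidate setpoint
    # with the lowest predicted power for current conditions.
    import numpy as np
    from sklearn.ensemble import GradientBoostingRegressor

    rng = np.random.default_rng(0)
    n = 5_000

    # Synthetic sensor history: server load (kW), outdoor temperature (C),
    # and chilled-water setpoint (C) -> measured cooling power (kW).
    load = rng.uniform(500, 2000, n)
    outdoor = rng.uniform(-5, 35, n)
    setpoint = rng.uniform(7, 18, n)
    cooling_kw = (0.25 * load + 8 * np.maximum(outdoor - setpoint, 0)
                  + rng.normal(0, 20, n))

    X = np.column_stack([load, outdoor, setpoint])
    model = GradientBoostingRegressor().fit(X, cooling_kw)

    # For current conditions, evaluate candidate setpoints and choose the
    # cheapest. A real system would add constraints, e.g. maximum safe
    # server temperature, before acting on the recommendation.
    current_load, current_outdoor = 1200.0, 28.0
    candidates = np.linspace(7, 18, 23)
    grid = np.column_stack([
        np.full_like(candidates, current_load),
        np.full_like(candidates, current_outdoor),
        candidates,
    ])
    best = candidates[np.argmin(model.predict(grid))]
    print(f"Recommended chilled-water setpoint: {best:.1f} C")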

Using carbon-free electricity

Apple, Google, Meta, Microsoft, Amazon, and numerous other less-recognizable data center companies all claim to already use 100% renewable energy, or expect to do so soon. These companies typically buy some combination of renewable energy contracts and credits to offset their consumption on an annual basis. As laudable as their actions are, it’s not clear they can claim their facilities don’t contribute to climate change at all. Because renewables are intermittent, there are numerous hours over the course of a year in which there isn’t enough renewable energy available for these companies to offset their real-time consumption. The power they draw from the grid during those hours is unlikely to be carbon-free.

In response to this concern, Google and Microsoft are upping their game and pledging to achieve 24/7 carbon-free energy use by the year 2030. The details of their commitments are spelled out in the 24/7 Carbon-Free Energy Compact, which says they will meet “every kilowatt-hour of electricity demand with Carbon-Free Energy sources, every hour, every day, everywhere.” To do so, they anticipate they’re going to need some breakthrough technologies that are only in their infancy today, including advanced nuclear, geothermal, clean hydrogen, and long-duration energy storage. Indeed, they’ve formed a partnership to develop such technologies.
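The gap between annual matching and 24/7 matching is easy to demonstrate with a toy calculation. In this sketch (all numbers made up), a buyer procures enough renewable energy over the year to equal 100% of its consumption, yet still draws grid power in hours when renewables run short:

    # Toy comparison of annual renewable matching vs. hourly (24/7)
    # carbon-free matching. All numbers are made up for illustration.
    import numpy as np

    rng = np.random.default_rng(1)
    hours = 8_760

    demand = np.full(hours, 100.0)            # flat 100 MW data center load
    renewables = rng.uniform(0, 200, hours)   # volatile renewable output, MW

    annual_match = renewables.sum() / demand.sum()   # annual energy basis
    hourly_cfe = np.minimum(renewables, demand).sum() / demand.sum()

    print(f"Annual matching score:   {annual_match:.0%}")  # ~100%
    print(f"Hourly (24/7) CFE score: {hourly_cfe:.0%}")    # ~75%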

The lessons of history

Will the actions taken by government and industry, some of which are described above, and many more which are not, be enough to tame the AI energy consumption beast? Recent history suggests that the AI industry and utilities will somehow manage to muddle through.

In 1999, two analysts working for the Greening Earth Society, a now-defunct public relations organization funded by coal companies, claimed that the Internet accounted for 8% of US electricity consumption. Furthermore, they predicted that by the end of the following decade, energy consumed to manufacture, use, and network computers would account for about half the output of the electric grid. These claims caused quite a ruckus, which led a group of scientists at Lawrence Berkeley National Laboratory to conduct a detailed study of US office, telecommunications, and network energy consumption. The LBNL scientists concluded that all this equipment combined was only consuming about 3% of US electricity.

Over the intervening years, LBNL updated this work a few times. The most recent update, published in 2016, concluded that in 2014 data centers accounted for 1.8% of total US electricity consumption. Although LBNL hasn’t published any subsequent updates, a recent report from the International Energy Agency concluded that over the period 2015 to 2022, worldwide data center workloads went up 340%, but data center energy use went up only 20% to 70%. The IEA pegged global data center electricity consumption at 1% to 1.3% of global supply.
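Those two growth figures imply a large efficiency gain per unit of work:

    # Implied change in energy used per unit of data center workload,
    # 2015-2022, using the IEA growth figures cited above.
    workload_growth = 4.4        # workloads up 340% means 4.4x the work
    energy_growth = (1.2, 1.7)   # energy up 20% to 70%

    for g in energy_growth:
        per_workload = g / workload_growth
        print(f"Energy per unit of workload fell to {per_workload:.0%} "
              f"of its 2015 level")
    # -> roughly 27% to 39%, a 61-73% drop in energy per workload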

Why didn’t data center electricity consumption rise in lockstep with internet workloads? The energy efficiency of electronic equipment improved, cooling systems became more effective, and the industry shifted from many small data centers to fewer huge, far more efficient facilities.

If you project that AI use will continue to grow at current rates, and assume that the equipment producing it will continue to operate at current efficiencies, any calculation based on those assumptions will predict catastrophe. We know from historical precedent that’s unlikely to happen. Instead, what’s more likely is that the industry will improve both its energy efficiency and its ability to procure carbon-free electricity, and will keep AI’s energy consumption at manageable levels.

That doesn’t mean that the AI energy consumption problem is solved and that the industry can just be left to handle things on its own. There will still be a need for public policies like the Markey bill, which would enable everyone to better understand the problem, and for industry to identify best practices. There will also still be a need for public-private partnerships, like the RE100, which succeeded at getting many data center companies to commit to using 100% renewable electricity. And lastly, we’ll still need public policies that simultaneously clean up the grid and expand its capacity, so that when the expected wave of rapid load growth arrives, driven by some combination of AI, industrial expansion, and building and transportation electrification, utilities will be able to serve those loads cleanly.

Disclaimer

No AIs were used, nor were any harmed, in the writing of this post.

And now, a word from our sponsor

This post originally appeared on the Energy Technology Revolution website. Did you learn something valuable from reading it? If so, you’re exactly the sort of person who would enjoy subscribing to Energy Technology Revolution. All you need do is click here and fill in a bit of information. We’ll send you an email every so often when there’s a new ETR post. It’s free, it’s easy, and it’s fun.


Discussions
Matt Chester on May 3, 2024

Would asking AI to help us with how to handle AI's power requirements break one of Asimov's laws of robotics? 

Jay Stein on May 6, 2024

Matt, I’m an energy geek, not a robotics geek, so I’m working outside my field of expertise here. That said, I don’t think asking AI to figure out how to reduce AI's power requirements would violate any of Asimov's 3 laws of robotics. Reducing AI’s energy consumption would help mitigate climate change, and that would protect human health. And reducing energy consumption would enable more AIs to process more inquiries, protecting robot existence. Then again, maybe you see something here that I don’t.

Matt Chester on May 6, 2024

I was speaking more tongue in cheek-- I'm always looking for the sci fi angle :) Thanks for responding in any event!
