Civitas Outlook
Topic
Economic Dynamism
Published on
Apr 23, 2025
Contributors
Rachel Lomasky
Lynne Kiesling
Large Language Model Training Cluster. (Shutterstock)

Why the AI Revolution Will Require Massive Energy Resources

Summary
Training large-scale models requires enormous computation, fueled by multimodal datasets comprising text, audio, video, images, and sensor data that often exceed a petabyte.

The rapid rise of generative AI has triggered a sharp escalation in data center electricity consumption, with profound implications for national energy use, system planning, and climate goals. Data centers have long been critical infrastructure for digital services, but their energy demand is now accelerating due to the emergence of compute-intensive AI workloads.

Data center electricity use began climbing after plateauing around 60 terawatt-hours (TWh) annually from 2014 to 2016—roughly 1.5 percent of total U.S. electricity consumption. By 2018, it had reached 76 TWh (1.9 percent of national demand), driven by growing server installations and an architectural shift toward AI-optimized hardware, particularly Graphics Processing Units (GPUs). This upward trend has since intensified. By 2023, U.S. data center electricity consumption had surged to an estimated 176 TWh, representing 4.4 percent of total U.S. demand, roughly equivalent to the annual electricity use of the entire state of New York.

This growth shows no signs of slowing. The U.S. Department of Energy projects that by 2028, annual electricity demand from data centers could reach between 325 TWh and 580 TWh, or 6.7 percent to 12 percent of projected national consumption. Forecasts from firms such as Boston Consulting Group and S&P Global similarly place 2030 data center electricity use between 5 percent and 10 percent of total U.S. demand. The range of estimates reflects uncertainty in how quickly AI technologies will be adopted and how widely compute-intensive applications will scale.
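As a rough check on how these figures fit together, the short Python sketch below backs out the total national consumption implied by each DOE projection endpoint, using only the TWh and percentage figures cited above; the arithmetic is purely illustrative.

```python
# Back-of-envelope check: infer the total U.S. consumption implied by the
# DOE projection endpoints cited above (illustrative arithmetic only).
projections = {
    "low (2028)":  {"data_center_twh": 325, "share_of_total": 0.067},
    "high (2028)": {"data_center_twh": 580, "share_of_total": 0.12},
}

for name, p in projections.items():
    implied_total_twh = p["data_center_twh"] / p["share_of_total"]
    print(f"{name}: {p['data_center_twh']} TWh at {p['share_of_total']:.1%} "
          f"of demand implies a national total of ~{implied_total_twh:,.0f} TWh")
```

Both endpoints imply a projected national total in the neighborhood of 4,800 TWh, so the TWh and percentage figures are mutually consistent.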

At the heart of this demand surge is generative AI. Training large-scale models requires enormous computation, fueled by multimodal datasets comprising text, audio, video, images, and sensor data that often exceed a petabyte. While data acquisition and storage carry energy costs, the training process itself is far more energy-intensive, with energy use scaling with the model's size, the complexity of its architecture, and the degree of refinement it undergoes. Training is a one-time event per model but demands vast amounts of power, time, and hardware resources.
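The energy cost of a training run can be reasoned about with simple arithmetic: the number of accelerators, their average power draw, and the length of the run. The sketch below shows the shape of that calculation using purely hypothetical inputs, not figures from any actual model.

```python
# Illustrative back-of-envelope for training energy.
# All inputs are hypothetical placeholders, not figures from the article.
num_gpus       = 10_000   # accelerators in the training cluster
gpu_power_kw   = 0.7      # assumed average draw per accelerator, in kW
training_hours = 24 * 60  # a 60-day training run

it_energy_mwh = num_gpus * gpu_power_kw * training_hours / 1_000
print(f"Compute energy for the run: ~{it_energy_mwh:,.0f} MWh")
```

Even with these modest assumptions the run consumes on the order of 10 GWh, before counting cooling and other facility overhead.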

After training, models are used for inference, generating outputs in response to user queries. Each inference consumes far less energy than training, but because these systems are queried millions of times daily, their cumulative energy use becomes substantial. More complex outputs, such as videos or high-resolution images, increase the burden.
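Because per-query energy is small, the cumulative total is driven almost entirely by volume. The sketch below multiplies an assumed per-query figure by an assumed daily query count, both hypothetical, to show how the total compounds over a year.

```python
# Illustrative arithmetic for cumulative inference energy.
# Both inputs are hypothetical placeholders, not measured values.
energy_per_query_wh = 0.3          # assumed energy per text response, in Wh
queries_per_day     = 100_000_000  # assumed daily query volume

annual_twh = energy_per_query_wh * queries_per_day * 365 / 1e12  # Wh -> TWh
print(f"Annual inference energy: ~{annual_twh:.3f} TWh")
```

Scaling the query volume or the output complexity (video, high-resolution images) moves the total up proportionally.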

Generative AI workloads depend heavily on specialized chip architectures: Graphics Processing Units (GPUs) and Tensor Processing Units (TPUs). These chips are optimized for the matrix operations at the core of AI computation. While they are more efficient than general-purpose CPUs for such tasks, they also draw significantly more power and generate more heat. As a result, they require constant and often intensive cooling, which in turn demands additional electricity and, in many cases, fresh water. Marginal improvements in chip design, such as more compact transistor layouts and power-aware software, have improved performance per watt. Similarly, advances in cooling that range from more efficient fans and heatsinks to liquid cooling and immersion systems help reduce waste heat. However, these innovations have not yet offset the exponential growth in demand.
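Cooling and other facility overhead is commonly summarized by power usage effectiveness (PUE), the ratio of total facility energy to the energy consumed by the IT equipment itself. A minimal sketch, with hypothetical values, makes the overhead explicit.

```python
# Power usage effectiveness: PUE = total facility energy / IT equipment energy.
# The values below are hypothetical, for illustration only.
it_load_mw = 50.0  # assumed power drawn by servers and accelerators
pue        = 1.3   # assumed facility-level PUE

total_facility_mw = it_load_mw * pue
overhead_mw       = total_facility_mw - it_load_mw  # cooling, power delivery, etc.
print(f"Facility draw: {total_facility_mw:.1f} MW, "
      f"of which {overhead_mw:.1f} MW is cooling and other overhead")
```

Better cooling lowers the PUE and shrinks the overhead term, but it does nothing to shrink the IT load itself, which is why efficiency gains alone have not kept pace with demand.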

One promising way to mitigate energy use is to reduce the computational intensity of the algorithms themselves. Smaller, specialized models can be trained with less data, lower numerical precision, and fewer iterations, making them faster and less costly. Techniques like transfer learning, where a pre-trained model is adapted for a new task, and federated learning, where training is distributed across edge devices rather than centralized, can also conserve energy and reduce data transfer loads.
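As one concrete example, transfer learning reuses a model already trained on a large general dataset and retrains only a small task-specific layer, so far fewer parameters are updated and far fewer iterations are needed. A minimal sketch in PyTorch follows; the backbone choice and class count are illustrative assumptions, not a prescription.

```python
import torch
import torch.nn as nn
from torchvision import models

# Load a backbone pretrained on ImageNet; its weights embody training
# compute that has already been spent and does not need to be repeated.
backbone = models.resnet18(weights="IMAGENET1K_V1")

# Freeze the backbone so no gradients (and far less compute) are spent on it.
for param in backbone.parameters():
    param.requires_grad = False

# Replace only the final classification layer for the new task
# (10 classes here, an arbitrary illustrative choice).
backbone.fc = nn.Linear(backbone.fc.in_features, 10)

# The optimizer sees only the new layer's parameters, so each training step
# updates thousands of weights instead of millions.
optimizer = torch.optim.Adam(backbone.fc.parameters(), lr=1e-3)
```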

Still, overall energy demand continues to rise—a textbook example of the Jevons Paradox, where efficiency gains lower costs but stimulate greater total consumption. Yet generative AI may also produce net energy savings in other sectors. For example, dynamic routing algorithms can optimize delivery truck routes based on real-time traffic and weather data, reducing fuel use. Similar gains are possible in building HVAC control, precision agriculture, and industrial automation. Thus, while AI’s direct energy footprint is growing rapidly, its broader potential to improve energy efficiency while increasing economic productivity may partially offset these impacts.

Lynne Kiesling is Director of the Institute for Regulatory Law & Economics at Northwestern Pritzker's Center on Law, Business, and Economics, and a Research Professor at the University of Colorado Denver.

Rachel Lomasky is Chief Data Scientist at Flux, a company that helps organizations do responsible AI.
