Goldman Sachs & IEA · March 12, 2026 · 4 min read

AI and Climate: The Energy Cost of Intelligence

Goldman Sachs and the IEA forecast that AI data centers will drive a 165% surge in power demand by 2030, challenging global energy transition goals.

Key Insights

  • Goldman Sachs forecasts a 165% surge in global data center power demand by 2030, driven almost entirely by AI.
  • The IEA estimates data centers will account for up to 3% of total global electricity demand by 2030.
  • Hyperscale AI data centers draw over 100 megawatts of power, roughly the combined consumption of 100,000 households.

The generative AI revolution is inherently constrained by physics. Training and running massive models like GPT-5.4 or Midjourney requires unprecedented computational power. As we move deeper into 2026, the tech industry’s insatiable demand for electricity is colliding head-on with global climate goals.

How Big is the AI Power Surge?

According to a comprehensive 2025-2026 forecast by Goldman Sachs, AI will drive a 165% surge in global data center power demand by 2030 compared to 2023 levels.

Historically, data centers accounted for roughly 1-1.5% of global electricity consumption. However, the International Energy Agency (IEA) estimates this will double to approximately 3% by 2030, reaching roughly 945 terawatt-hours (TWh) annually. To put this in perspective, that is roughly equivalent to the total current electricity consumption of Japan.

The architecture is fundamentally changing. A conventional data center draws between 10 and 25 megawatts (MW). In contrast, modern “hyperscale” AI data centers frequently exceed 100 MW, consuming as much power as roughly 100,000 households.
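The household comparison can be sanity-checked with a back-of-envelope calculation. This sketch assumes an average continuous household draw of about 1 kW (an illustrative round figure; real averages vary widely by country):

```python
# Back-of-envelope check: does a 100 MW data center really match ~100,000 households?
HOUSEHOLD_AVG_KW = 1.0      # assumed continuous average draw per household (illustrative)
DATA_CENTER_MW = 100        # hyperscale AI data center figure from the article

households = (DATA_CENTER_MW * 1000) / HOUSEHOLD_AVG_KW   # MW -> kW, divide by per-home draw
annual_twh = DATA_CENTER_MW * 8760 / 1e6                  # MW * hours/year -> TWh

print(f"Equivalent households: {households:,.0f}")   # 100,000
print(f"Annual consumption: {annual_twh:.3f} TWh")   # 0.876 TWh
```

At that assumed draw, a single 100 MW facility running around the clock consumes just under 1 TWh per year, which is why a fleet of them moves national-scale demand figures.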

Where Will the Power Come From?

Tech giants are scrambling to secure power purchase agreements. While companies have pledged to use 100% renewable energy, the reality of grid economics makes this difficult. Goldman Sachs projects that about 60% of new data center capacity will need to come from new power sources.

These sources are projected to be roughly:

  • 30% natural gas combined cycle turbines
  • 30% natural gas peakers
  • 27.5% solar
  • 12.5% wind
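The shares above can be grouped to show the fossil-versus-renewable split of the new supply. This is a minimal sketch using only the percentages quoted in the list:

```python
# Projected generation mix for new data-center power sources,
# using the shares quoted above (Goldman Sachs projection as summarized here).
mix = {
    "natural gas combined cycle": 0.30,
    "natural gas peakers": 0.30,
    "solar": 0.275,
    "wind": 0.125,
}

assert abs(sum(mix.values()) - 1.0) < 1e-9  # the four shares cover all new sources

gas_share = mix["natural gas combined cycle"] + mix["natural gas peakers"]
renewable_share = mix["solar"] + mix["wind"]
print(f"Gas: {gas_share:.0%}, renewables: {renewable_share:.0%}")  # Gas: 60%, renewables: 40%
```

In other words, roughly 60% of the new generation is natural gas and 40% is wind and solar, which is the tension the next paragraph describes.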

Because solar and wind are intermittent, and AI data centers require constant “baseload” power, the industry is increasingly leaning on natural gas to fill the gaps. Consequently, major tech companies are also aggressively exploring investments in nuclear energy to secure carbon-free baseload power for the 2030s.

Is AI Doing Anything to Help the Climate?

Yes. While the training cost is high, AI is uniquely suited to solve complex climate modeling and optimization problems. AI is actively being deployed to optimize smart grids, significantly accelerate the discovery of novel battery materials, and predict extreme weather events with unprecedented accuracy. The long-term question is whether the technological solutions AI discovers will offset the massive carbon footprint required to run it.

Frequently Asked Questions

Does asking ChatGPT a question use a lot of energy?

A single query to an AI model like ChatGPT uses roughly 10 times the electricity of a standard Google search. When multiplied by hundreds of millions of daily users, the aggregate energy consumption is massive.
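A rough aggregate makes the scale concrete. The sketch below assumes a web search costs ~0.3 Wh (a commonly cited ballpark, not a measured value), an AI query costs 10x that per the figure above, and a hypothetical volume of 500 million queries per day:

```python
# Rough aggregate of daily AI query energy, under stated assumptions.
AI_QUERY_WH = 3.0                # 10x a ~0.3 Wh web search (ballpark, not measured)
DAILY_QUERIES = 500_000_000      # hypothetical daily volume for illustration

daily_mwh = DAILY_QUERIES * AI_QUERY_WH / 1e6   # Wh -> MWh
print(f"{daily_mwh:,.0f} MWh per day")          # 1,500 MWh per day at these assumptions
```

1,500 MWh per day is on the order of the output of a mid-sized power plant running for an hour, every single day, for chat queries alone.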

Why do AI data centers need so much power?

Generative AI relies on specialized chips called GPUs (Graphics Processing Units) running in massive clusters. These chips draw significantly more power than traditional CPUs and require equally massive, energy-intensive liquid cooling systems to prevent overheating.

Are tech companies increasing carbon emissions?

Yes. Despite ambitious “Net Zero” pledges, several major tech companies reported sharp increases in their total carbon emissions over the last two years, driven almost entirely by the build-out of new AI data centers.

Will the energy grid be able to handle AI?

Grid operators in regions with heavy data center concentration (like Northern Virginia and Ireland) are already warning of potential supply constraints and transmission infrastructure limits over the next five years.

What is ‘sustainable AI’?

Sustainable AI refers to efforts to make algorithms more efficient (requiring less compute to achieve the same result), scheduling training runs during hours when renewable energy is plentiful on the grid, and utilizing advanced cooling techniques to lower a data center’s Power Usage Effectiveness (PUE) ratio.
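PUE is a simple ratio: total facility energy divided by the energy delivered to the IT equipment itself. A sketch with illustrative numbers (the facility figures below are made up for demonstration):

```python
# Power Usage Effectiveness (PUE) = total facility energy / IT equipment energy.
# A PUE of 1.0 would mean every watt reaches the servers; cooling and power
# distribution overhead push it higher. The inputs below are illustrative.
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Return the facility's PUE ratio."""
    return total_facility_kwh / it_equipment_kwh

legacy = pue(total_facility_kwh=1800, it_equipment_kwh=1000)   # e.g. air-cooled facility
modern = pue(total_facility_kwh=1150, it_equipment_kwh=1000)   # e.g. liquid-cooled facility

print(f"Legacy PUE: {legacy:.2f}, modern PUE: {modern:.2f}")   # 1.80 vs 1.15
```

Lowering PUE from 1.8 to 1.15 means cutting overhead energy from 80% of the compute load down to 15%, which is why cooling technology features so heavily in sustainable-AI efforts.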

Qaisar Roonjha

AI Education Specialist

Building AI literacy for 1M+ non-technical people. Founder of Urdu AI and Impact Glocal Inc.
