AI × Energy: What’s the Real Ledger?
If the AI conversation has shifted from model sizes to megawatts, you’re not imagining it.
Over the past year, compute has started colliding with electricity, water, steel, and policy.
This note lays out three things with a calm lens: where AI’s footprint really sits, how AI can actually help the grid it stresses, and why China is quietly building the world’s largest “green compute” testbed.
This isn’t a verdict. It’s a working ledger you can challenge, refine, and carry forward.
Where the footprint actually comes from
Let’s start small, then zoom out. Google’s latest figures for Gemini put a typical text prompt at about 0.24 Wh of electricity, roughly 0.03 g CO₂e, and around 0.26 mL of water. Helpful context: that’s roughly nine seconds of TV viewing and five drops of water. It’s also just inference.
Those figures do not include the weeks of training, the embodied carbon of chips and buildings, or the concrete, steel, and cooling that sit behind hyperscale data centers. That’s where a lot of the real cost hides.
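Per-prompt figures only matter at scale, so here is a back-of-the-envelope scaling of the numbers above. The per-prompt constants come from Google's published figures; the daily prompt volume is a made-up assumption for illustration only.

```python
# Scale Google's published per-prompt figures to a hypothetical daily volume.
WH_PER_PROMPT = 0.24        # Wh of electricity per typical text prompt
CO2E_G_PER_PROMPT = 0.03    # grams CO2-equivalent per prompt
WATER_ML_PER_PROMPT = 0.26  # milliliters of water per prompt

prompts_per_day = 1_000_000_000  # hypothetical: one billion prompts per day

daily_mwh = WH_PER_PROMPT * prompts_per_day / 1e6              # Wh  -> MWh
daily_tonnes_co2e = CO2E_G_PER_PROMPT * prompts_per_day / 1e6  # g   -> tonnes
daily_m3_water = WATER_ML_PER_PROMPT * prompts_per_day / 1e6   # mL  -> cubic meters

print(f"{daily_mwh:,.0f} MWh, {daily_tonnes_co2e:,.0f} t CO2e, "
      f"{daily_m3_water:,.0f} m^3 water per day")
```

Even a billion prompts a day comes out to a few hundred MWh, which is exactly why the training, embodied-carbon, and siting pieces dominate the real ledger.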
At the system level, AI data centers are pushing electricity demand in certain regions and reshaping local grids. The IEA expects about $3.3 trillion in energy investment this year, with $2.2 trillion going to clean energy. We will need that scale if AI demand keeps rising and we want it met by low-carbon supply.
Air and water impacts are not theoretical. A months-long investigation in the U.S. links the data-center boom to higher fossil backup, local pollution, and siting in water-stressed regions, which host roughly 40% of big sites by one analysis. Cities from Arizona to Virginia are feeling the pressure.
Bottom line: per-prompt numbers are getting smaller, which is good, but the big climate and community effects come from where AI runs, how it’s cooled, and how flexible it is when the grid is tight.
How AI can stabilize the grid, not just strain it
We’re heading into a world where wind and solar set the tempo. That makes forecasting, balancing, and flexibility the name of the game. Here, AI can be more than a burden.
AI for grid balancing. Better nowcasts for wind and solar, sharper demand forecasts, and smarter control reduce curtailment and help clean portfolios act like reliable portfolios. That unlocks higher renewable shares without sacrificing stability.
Make the “AI factory” flexible. One practical lever is to let data centers follow grid signals. Non-urgent training or inference can step down during peak hours and shift to off-peak or high-renewables windows. Field pilots show AI clusters can often shed about 25% of load for hours with little hit to service. Treating big compute as dispatchable, curtailable load turns it from a grid liability into a grid asset.
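What dispatchable, curtailable compute looks like can be sketched in a few lines. The 25% shed figure is the field-pilot estimate cited above; the 100 MW cluster size and the evening peak window are illustrative assumptions.

```python
# Sketch: an AI cluster that steps down during grid peaks and defers work.
BASE_LOAD_MW = 100.0             # hypothetical cluster size
SHEDDABLE_FRACTION = 0.25        # field pilots: ~25% of load can shed for hours
PEAK_HOURS = set(range(17, 21))  # hypothetical 5pm-9pm grid peak window

def cluster_load(hour: int, grid_stressed: bool) -> float:
    """Return the cluster's load in MW for a given hour of the day."""
    if grid_stressed and hour in PEAK_HOURS:
        return BASE_LOAD_MW * (1 - SHEDDABLE_FRACTION)  # step down to 75 MW
    return BASE_LOAD_MW  # deferred jobs run in off-peak / high-renewables hours

day = [cluster_load(h, grid_stressed=True) for h in range(24)]
energy_shifted_mwh = sum(BASE_LOAD_MW - mw for mw in day)
print(f"Energy shifted off-peak: {energy_shifted_mwh:.0f} MWh")
```

The design choice that matters is the signal interface: if the cluster can follow a simple stressed/unstressed flag (or a price), the grid operator can treat it like any other dispatchable resource.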
Cooling and water are a design problem, not just a cost line. Cooling is where electricity and water meet. Operators are moving toward liquid solutions that support GPU-dense racks without tearing down buildings, and toward designs that cut water use. Expect more direct-to-chip, microfluidic, and refrigerant-agnostic options to show up in real sites.
Heat can be a feature. In colder cities, waste heat is already feeding district-heating networks. Projects in the Nordics show this is commercial where infrastructure exists. In the right places, that “waste” becomes a steady local value stream.
Clean molecules for clean fabs. AI’s “steel” is the chip supply chain. Electrolyzers are starting to supply green hydrogen into semiconductor processes, which helps shrink the embodied footprint of the hardware that runs AI.
Takeaways in one breath: AI can forecast better, operate smarter, and flex demand; flex-ready data centers connect faster and fail safer; and whether AI helps or harms the transition depends on cooling, heat reuse, siting, and honest clean-power procurement, ideally 24/7 matching.
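The 24/7 matching point is worth making concrete: unlike annual matching, every hour's load must be covered by clean generation in that same hour, so surpluses in one hour cannot paper over deficits in another. All the hourly numbers below are illustrative assumptions.

```python
# Hourly (24/7) carbon-free matching vs. lumped annual-style matching.
load_mwh  = [80, 80, 90, 100, 100, 90]   # hypothetical hourly cluster load
clean_mwh = [100, 60, 90, 120, 50, 90]   # hypothetical hourly clean supply

# 24/7 score: only clean energy delivered in the same hour as the load counts.
hourly_matched = sum(min(load, clean) for load, clean in zip(load_mwh, clean_mwh))
cfe_score = hourly_matched / sum(load_mwh)

# Annual-style score: totals are compared, masking hour-by-hour shortfalls.
annual_style = min(sum(clean_mwh) / sum(load_mwh), 1.0)

print(f"24/7 CFE score: {cfe_score:.0%}  vs  annual-style: {annual_style:.0%}")
```

The annual-style number comes out higher than the hourly one on the same data, which is exactly why "honest clean-power procurement" means hourly accounting.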
A note of caution worth debating: heavy automation can blunt human judgment during extreme events. That argues for drills, guardrails, and “fail-safe by default” operations, not for slowing innovation.
China’s “green compute” experiment
If you care about AI and energy on a continental scale, you watch China, not because it’s perfect, but because it is building the biggest lab.
The power mix is shifting fast. In April 2025, wind + solar supplied about 26% of China’s electricity, the first month above one quarter. Clean power’s share has been climbing, and early-2025 data show fossil generation easing as renewables rise.
Policies matter for AI’s shape and footprint. National guidance is pressing average PUE toward 1.5 or below, with tighter targets for large or cold-climate builds. Siting rules under “Eastern Data, Western Computing” are steering new capacity toward cooler, renewables-rich western hubs. After a building boom, authorities are also knitting sites into a computing-power network so surplus capacity can be traded and balanced, much like a grid.
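PUE (power usage effectiveness) is simply total facility energy divided by IT equipment energy, so a 1.5 target means at most half a unit of overhead (cooling, power conversion, lighting) per unit of compute. A minimal check, with made-up meter readings:

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power Usage Effectiveness: total site energy over IT energy (>= 1.0)."""
    return total_facility_kwh / it_equipment_kwh

# Hypothetical annual meter readings for a site under the <=1.5 guidance.
total_kwh = 140_000_000  # everything: IT + cooling + power conversion + lighting
it_kwh = 100_000_000     # servers, storage, and networking only

print(f"PUE = {pue(total_kwh, it_kwh):.2f}")  # 1.40 -> within the 1.5 target
```

Note the metric's blind spot: PUE says nothing about where the electricity comes from or how much water the cooling consumes, which is why the siting and water rules travel alongside it.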
China’s carbon market is expanding beyond power into steel, cement, and aluminum, with first compliance covering 2024 emissions and deadlines at the end of 2025. Analysts expect coverage to rise toward about 60% of national emissions, with some sectors moving toward absolute caps in the next couple of years.
There are pinch points. Coal remains a comfort blanket for reliability in stress years, grid bottlenecks still show up, and hydrology is a real constraint: Sichuan’s 2022 drought is a reminder that siting and cooling choices must be water-smart.
A quick set of scenarios to keep in mind:
• Green-Compute Flywheel: strict PUE, cheap storage, and long-distance transmission mean AI clusters co-locate with renewables while the ETS starts to bite.
• Fragmented Plateau: trade frictions and chip constraints slow upgrades; curtailment and bottlenecks linger; coal stays as a backstop.
• Hydro-Stress Detour: drought cycles force emergency dispatch; policy doubles down on HVDC, siting, and dry or closed-loop cooling.
A regional note for ASEAN. Competitive advantage increasingly tracks how fast countries scale solar + batteries and how well they connect their grids. Without interconnection, the region risks falling behind in an AI and robotics world.
The paradox in one line
AI is both a solution and a stressor. It can make grids smarter and cleaner, yet it also drives new demand for electricity and water. Net impact depends on policy and markets, on siting and cooling, and on whether we demand honest accounting that includes training and embodied emissions, not just tidy per-prompt numbers.
What I’m watching next
- Monthly wind and solar shares, and curtailment rates. Is clean generation covering new demand?
- Data-center PUE distributions, the share of power from renewables, and water intensity, not just averages.
- UHV and HVDC buildout, and west-to-east clean-power flows actually delivered.
- The ETS rulebook in heavy industry: coverage, allocation, prices, and any movement toward absolute caps.
- AI load flexibility in the wild: how many megawatts can follow a five-to-fifteen-minute signal?
- Cooling shifts and heat-reuse hookups at real facilities.
Questions for you, my fellow Yingfluencer
If you ran an AI cluster, what share of load would you commit to real-time grid signals, and what SLAs would you need?
Should large AI sites face water-use standards or credits, similar in spirit to 24/7 clean-power matching?
Where should countries draw the line on coal as a “firming” crutch during the transition?
What’s the fairest way to allocate the embodied emissions of chips and buildings across training, inference, and end users?
What questions do you have regarding China's role in green leadership for AI development?
💡 If this piece resonated with you, don’t let it stop here. Share it with someone you care about, repost if you believe others could be inspired, and add your perspective in the comments. And if you haven’t yet subscribed to Yingtelligence, I’d be honored to have you with us on this journey. Together, let’s keep the ENERGY on.
References & Further Reading
Per-prompt energy/water numbers (Google + critiques)
- Measuring the environmental impact of AI inference (Google Cloud blog)
- Measuring the Environmental Impact of Delivering AI at Google Scale (Google technical paper, PDF)
- Google says each AI prompt uses ‘nine seconds of TV’ worth of energy (The Verge)
AI demand, electricity & water
- World Energy Investment 2025 (IEA report)
- Global energy investment to hit a record $3.3 trillion in 2025 (Reuters)
- AI runs on dirty power — and the public pays the price (Business Insider)
- How data centers are deepening the water crisis (Business Insider)
- Methodology: calculating data-center environmental impacts (Business Insider)
Flex-ready data centers (load flexibility)
- AI factories and flexible power use (NVIDIA blog)
- Emerald AI (company site)
- Nvidia and Oracle tapped this startup to flex a Phoenix data center (Latitude Media)
- DCFlex initiative (EPRI)
- Flexible Loads, Resilient Grids (EPRI Journal)
- Why data-center flexibility matters (IEEE Spectrum)
Cooling, heat reuse & siting
- AWS unveils liquid-cooling deployments for high-density racks (About Amazon)
- AWS IRHX liquid-cooling coverage (TechRadar)
- Odense Data Center heat recovery to district heating (Meta sustainability brief, PDF)
- Inside Meta’s Odense heat-recovery build (Meta Engineering)
- Stockholm Data Parks
- Stockholm Exergi — Heat Recovery
Clean molecules for chips (embodied footprint)
China watch: power mix, PUE & “Eastern Data, Western Computing”
- China’s clean-power records and trends (Ember)
- Coal output dips as clean energy surges in April 2025 (Reuters)
- Data-center energy targets and PUE guidance (State Council, EN)
- How China is managing rising data-center energy demand (Carbon Brief)
- China plans a network to sell surplus computing power (Reuters)
- State Grid sets record grid-investment outlays for 2025 (Reuters)
- Pioneering ±800 kV UHVDC clean-power corridor (Hitachi Energy)
Policy & markets: ETS expansion
- China National ETS — country profile and updates (ICAP)
- China to expand national carbon market to heavy industry (Reuters)