Sustainable AI: Energy, Water, and Efficiency

Lisa Ernst · 21.09.2025 · Technology · 5 min

The environmental impact of artificial intelligence (AI) is increasingly discussed. This article examines the energy and water consumption of AI operations, based on current studies and reports.

Introduction to Green AI

Green AI describes the measurable environmental footprint of operating AI, in particular the electricity (energy footprint) and water consumed during model training and inference. Data centers use two metrics for this: PUE (Power Usage Effectiveness), the ratio of total facility power to IT power, where values closer to 1 indicate higher efficiency, and WUE (Water Usage Effectiveness), annual water consumption divided by IT energy in kWh, where lower values mean higher water efficiency. Water consumption here usually refers to "consumptive" water that evaporates and is not returned, for example in adiabatic or evaporative cooling (airatwork.com).
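In code, both metrics are simple ratios. A minimal sketch, using assumed, illustrative annual figures for a single site (not real operator data):

```python
def pue(total_facility_kwh: float, it_kwh: float) -> float:
    """Power Usage Effectiveness: total facility energy over IT energy.
    Values closer to 1.0 mean less overhead for cooling and power delivery."""
    return total_facility_kwh / it_kwh


def wue(site_water_liters: float, it_kwh: float) -> float:
    """Water Usage Effectiveness: annual site water use (liters) per kWh
    of IT energy. Lower values mean higher water efficiency."""
    return site_water_liters / it_kwh


# Assumed, illustrative annual figures for one site:
print(pue(120_000, 100_000))  # -> 1.2
print(wue(180_000, 100_000))  # -> 1.8 (L/kWh)
```

A site that spends 20% of its energy on overhead and evaporates 1.8 liters per IT-kWh would report exactly these values; real disclosures vary widely by climate and cooling design.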

Current Landscape and Developments

The IEA projects that global electricity demand from data centers will rise to about 945 TWh by 2030, with AI as the main driver. This corresponds to more than a doubling compared to today. AI-specialized sites could even quadruple their share by 2030 (iea.org). The detailed IEA report "Energy and AI" analyzes methods, data quality, and regional differences. At the same time, Microsoft reports new data-center designs that claim "zero water for cooling" through closed-loop systems and chip-near cooling (microsoft.com). Google describes how efficiency gains in electricity often also reduce water demand, but emphasizes dependence on climate and cooling technology (cloud.google.com; google.com). Reports about the water needs of new hyperscaler sites in drought-prone regions raise debates about water rights (reuters.com). Research on water footprints of AI since 2023 has highlighted the scales and methodological uncertainties (arxiv.org; dl.acm.org).

A detailed examination of the emissions sources of data centers is crucial for developing effective reduction strategies.

Source: fiberopticom.com


Drivers of Increased Consumption

The rise in electricity and water consumption has several causes. First, AI workloads are moved to high-density data centers. This can lower PUE but raise WUE when evaporation contributes to efficiency (thegreengrid.org; google.com). Second, new accelerator generations improve performance per watt but increase absolute demand as workloads become faster and more frequent. Blackwell promises efficiency gains over Hopper without relieving the demand pull (theregister.com; nvidia.com). Third, site factors such as cooling technology, climate, power mix, water availability, and regulation influence whether operators save electricity or water — rarely both at once (google.com; iea.blob.core.windows.net).
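The electricity-versus-water tradeoff can be made concrete by comparing two hypothetical cooling designs for the same IT load. All figures below are illustrative assumptions, not vendor data:

```python
IT_KWH = 1_000_000  # assumed annual IT energy of the site

# (total facility kWh/year, consumptive water L/year) - illustrative assumptions
scenarios = {
    "dry air cooling":     (1_400_000, 50_000),     # fans cost electricity, little water
    "evaporative cooling": (1_150_000, 1_900_000),  # less electricity, far more water
}

for name, (total_kwh, water_l) in scenarios.items():
    # Lower PUE for the evaporative design, but a much higher WUE
    print(f"{name}: PUE={total_kwh / IT_KWH:.2f}, WUE={water_l / IT_KWH:.2f} L/kWh")
```

The pattern mirrors the text: the evaporative design wins on PUE and loses on WUE, which is why site selection and disclosure need both metrics.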

Source: YouTube

Google explains its use of treated wastewater as a measure to conserve drinking water.

Fact Check: Evidence vs. Claims

The electricity demand from data centers is rising strongly; AI is a major driver. PUE and WUE are defined metrics for energy and water efficiency (thegreengrid.org; thegreengrid.org). Training and inference can involve significant water usage, depending on location, season, and cooling technology (arxiv.org; google.com).

The promise of "zero water for cooling" is an operating claim: lifecycle water (e.g., from semiconductor fabrication) and non-cooling water use are not included, nor are site relocations during extreme heat (cloud.google.com).

The claim "liquid cooling automatically solves the water problem" is misleading: it can improve WUE but need not, since the effect depends on the design and make-up water of the downstream heat-rejection system. The assumption that "AI is virtual and has no physical impact" is equally false: energy and water balances are measurable and correlate with, among other things, token length and utilization (arxiv.org).

The energy hunger of data centers: A comparison to the electricity consumption of entire countries highlights the scale of the problem.

Source: device42.com


Industry Responses and Counterarguments

Industry points to efficiency pathways: improved PUE/WUE, water recycling, and design shifts toward water-free cooling in operation (microsoft.com; cloud.google.com). Researchers and NGOs call for transparency and robust disclosure by location, season, and technology, citing displacement effects and drought risks (arxiv.org; reuters.com). Energy economists acknowledge efficiency gains from accelerators but warn of the rebound effect: greater efficiency lowers costs and can raise utilization (theregister.com; iea.blob.core.windows.net).

Practical Implications and Recommendations

Pragmatic action requires five steps. First, measure: for inference, studies propose metrics such as energy per token, and energy correlates strongly with token length and latency (euromlsys.eu). Second, consolidate load, but not blindly: batch size and prompt design can reduce energy per token, while oversized batches can flip efficiency due to overhead (upm.es; techrxiv.org). Third, adjust models and precision: 4-bit quantization is practical for many tasks and saves compute and memory energy, though quality checks remain mandatory (openreview.net; arxiv.org). Fourth, choose hardware appropriately: at the edge and on client devices, NPUs help with performance per watt; in data centers, specialized accelerators deliver better efficiency than general-purpose CPUs (qualcomm.com; nvidia.com). Fifth, actively manage location and cooling: climate, power mix, WUE/PUE targets, water quality (e.g., non-potable instead of drinking water), and waste-heat reuse should enter sourcing and architecture decisions early (google.com; thegreengrid.org).
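The first recommendation can be sketched directly: energy per token is average power times wall-clock time divided by tokens produced, and batching amortizes the same power over more tokens. The figures below (a 400 W accelerator, the stated latencies and batch throughput) are hypothetical:

```python
def energy_per_token_wh(avg_power_w: float, wall_time_s: float, tokens: int) -> float:
    """Watt-hours per generated token: power x time / token count."""
    return (avg_power_w * wall_time_s / 3600.0) / tokens


# Assumed: one request yields 256 tokens in 8 s on a 400 W accelerator
single = energy_per_token_wh(400, 8.0, 256)

# Assumed: batching four requests yields 1024 tokens in the same 8 s
batched = energy_per_token_wh(400, 8.0, 1024)

print(f"single: {single * 1000:.2f} mWh/token, batched: {batched * 1000:.2f} mWh/token")
```

In practice the measured power draw and latency would come from tools such as accelerator telemetry; the point of the sketch is that energy per token falls as long as batching raises throughput faster than it raises power or latency.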

NPU chips are key technologies for energy-efficient AI applications, especially in the edge computing domain.

Source: ai-market.jp


Source: YouTube

A brief overview of the electricity question with concise visualizations.

Open Questions and Future Outlook

How reliable are today’s disclosures about water and electricity when climate extremes increase and load profiles change? Standardized, location- and season-resolved reports on WUE, PUE, energy/water sources, and load management would be needed (thegreengrid.org; iea.blob.core.windows.net). Which regulation sets sensible guardrails without stifling innovation – for example water usage limits in drought-prone regions? Ongoing debates and expert discussions show how dynamic this field is (iea.org).

Conclusion

Green AI starts with an honest balance sheet: measure, compare, and control. Those who tie PUE and WUE together, track energy per token, reduce precision where appropriate, choose batch sizes wisely, and plan locations and cooling with intention will lower costs and environmental impact – and gain robust arguments for technology and investment decisions (aclanthology.org; upm.es; openreview.net; google.com).
