Environment & Sustainability: Short-term (2026-2028)

Impacts already visible or imminent

Current State

The environmental footprint of artificial intelligence has become one of the defining tensions of the mid-2020s. AI systems demand enormous quantities of electricity, water, and rare materials -- and this demand is growing at a pace that is outstripping the clean energy transition in many regions. Simultaneously, AI is proving to be a powerful tool for climate modeling, grid optimization, and materials discovery. The question is not whether AI helps or harms the environment, but whether the net effect can be steered toward sustainability before critical climate thresholds are crossed.

Data center energy consumption is surging. The International Energy Agency reported in January 2024 that global data center electricity consumption was approximately 460 TWh in 2022, roughly 2% of global electricity demand. By 2026, IEA projections indicate this could reach 800-1,000 TWh -- approaching the total electricity consumption of Japan. Goldman Sachs estimated in 2024 that AI alone could drive a 160% increase in data center power demand by 2030. The energy intensity of AI workloads dwarfs traditional computing: training a single large language model (such as GPT-4 or Llama 3) consumes an estimated 50-100 GWh of electricity, equivalent to the annual consumption of roughly 5,000-10,000 US households. Inference -- running these models in production at scale -- is cumulatively even larger, with some estimates suggesting that inference accounts for 60-80% of total AI energy use.
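The household comparison above can be sanity-checked with a back-of-envelope calculation. The average annual electricity figure used here (~10.5 MWh per US household, an EIA estimate) is an assumption, not a number from this report:

```python
# Back-of-envelope check of the training-energy comparison above.
# Assumption: ~10.5 MWh average annual US household electricity use (EIA figure).
HOUSEHOLD_MWH_PER_YEAR = 10.5

def training_energy_in_households(training_gwh: float) -> float:
    """Convert a training run's energy (in GWh) into equivalent household-years."""
    return training_gwh * 1_000 / HOUSEHOLD_MWH_PER_YEAR  # GWh -> MWh, then divide

low = training_energy_in_households(50)    # ~4,800 households
high = training_energy_in_households(100)  # ~9,500 households
print(f"{low:,.0f} to {high:,.0f} US households")
```

The result lands in the "roughly 5,000-10,000 households" range cited above, confirming the comparison is internally consistent.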

The carbon footprint depends entirely on grid mix. Google's 2024 Environmental Report revealed that its total greenhouse gas emissions rose 48% compared to 2019, driven almost entirely by data center expansion for AI. Microsoft reported a 29% increase in emissions since 2020, despite its 2030 carbon-negative pledge. Meta's emissions followed a similar upward trajectory. These figures reflect the inconvenient reality that even companies with aggressive renewable procurement targets cannot fully match AI growth with clean energy in real time. When data centers operate on grids powered by natural gas or coal -- as many do in Virginia, Texas, and parts of Southeast Asia -- every AI query carries a meaningful carbon cost.

Water consumption is a growing but underappreciated concern. Large data centers use evaporative cooling systems that consume 1-5 million gallons of water per day. A 2023 study by Li, Yang, Islam, and Ren ("Making AI Less Thirsty") estimated that training a single large AI model such as GPT-3 could consume over 700,000 liters of fresh water for cooling. Google's US data centers consumed approximately 5.6 billion gallons of water in 2023, a 17% year-over-year increase. In drought-prone regions like the American Southwest, Chile, and parts of India, AI data center water draw competes directly with agricultural and residential needs.

Semiconductor manufacturing carries its own environmental burden. TSMC, which fabricates the vast majority of advanced AI chips, is Taiwan's single largest industrial consumer of water and electricity. The production of a single advanced AI chip (such as Nvidia's H100 or B200) involves hundreds of process steps using toxic chemicals, ultrapure water, and significant energy. As chip demand scales with AI adoption, semiconductor fabrication becomes an increasingly significant environmental pressure point.

Key Drivers

  1. Exponential growth in AI inference demand. As AI is embedded into search, productivity software, healthcare, and consumer applications, the cumulative energy cost of inference vastly exceeds training costs. Each ChatGPT-style query uses roughly 10x the energy of a traditional Google search, and query volumes are measured in billions per day.

  2. Hyperscaler infrastructure race. Microsoft, Google, Amazon, and Meta are each spending $40-60 billion annually on data center capital expenditure as of 2025-2026. This arms race is driven by competitive pressure to capture AI market share, not by efficiency considerations. New data center campuses are being built on timelines (2-4 years) that outpace utility-scale renewable energy projects (4-7 years for permitting and construction).

  3. Nuclear energy renaissance for AI. Microsoft signed a deal to restart Three Mile Island's Unit 1 reactor specifically to power AI data centers. Google and Amazon have invested in small modular reactor (SMR) companies. This signals that hyperscalers recognize renewables alone cannot meet their baseload AI power needs and are turning to nuclear as a zero-carbon alternative, despite 10-15 year construction timelines for new plants.

  4. AI-for-climate applications gaining traction. Google DeepMind's weather forecasting model (GraphCast) demonstrated in 2023 that it could predict weather 10 days out more accurately than the European Centre for Medium-Range Weather Forecasts' operational model, at a fraction of the computational cost. AI-driven grid optimization has been shown to reduce energy waste by 10-20% in pilot deployments. Precision agriculture AI systems can reduce fertilizer and water use by 15-30%.

  5. Regulatory and disclosure pressure. The EU's Corporate Sustainability Reporting Directive (CSRD) now requires large companies to disclose Scope 1, 2, and 3 emissions, including data center energy use. The SEC's 2024 climate disclosure rules (though legally contested) push US tech firms toward transparency. This regulatory pressure is making AI's environmental footprint politically salient for the first time.
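Driver 1's claim about cumulative inference cost can be illustrated with simple arithmetic. The specific figures below are assumptions for illustration, not from this report: ~0.3 Wh per traditional search query, ~10x that per LLM-style query, and a volume of 1 billion LLM queries per day:

```python
# Illustrative arithmetic for driver 1: cumulative inference energy at scale.
# Assumptions (not from the report): ~0.3 Wh per traditional search query,
# ~10x that (3 Wh) per LLM query, 1 billion LLM queries per day.
WH_PER_SEARCH = 0.3
WH_PER_LLM_QUERY = 10 * WH_PER_SEARCH
QUERIES_PER_DAY = 1_000_000_000

def annual_inference_twh(wh_per_query: float, queries_per_day: float) -> float:
    """Annual inference energy in TWh for a given per-query cost and volume."""
    wh_per_year = wh_per_query * queries_per_day * 365
    return wh_per_year / 1e12  # Wh -> TWh

print(f"{annual_inference_twh(WH_PER_LLM_QUERY, QUERIES_PER_DAY):.2f} TWh/year")
```

Under these assumptions a single billion-query-per-day service approaches the annual electricity use of a small country, before counting training, and the figure scales linearly with query volume.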

Projections

2026-2028 environmental trajectory:

  • Data center electricity consumption will likely reach 900-1,200 TWh globally by 2028, representing 3.5-4.5% of global electricity demand. The US alone may see data centers consuming 6-9% of national electricity, up from approximately 4% in 2023.
  • Emissions trajectory for major hyperscalers will continue rising through 2028 despite renewable procurement, because new data center capacity is coming online faster than dedicated clean energy supply. Google and Microsoft are unlikely to meet their 2030 net-zero operational targets without massive reliance on carbon offsets of questionable quality.
  • Water stress conflicts will emerge in at least 3-5 regions globally where AI data center expansion collides with drought conditions and competing water needs. Northern Virginia, central Texas, and parts of Spain and India are early flashpoints.
  • AI-for-climate tools will demonstrate clear value at pilot scale but remain marginal in their impact on global emissions. The net environmental effect of AI remains negative in this period -- the energy consumed by AI systems substantially exceeds the emissions avoided through AI-enabled climate solutions.
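The projected percentage share can be cross-checked against the projected TWh range. The denominator here (global electricity demand of roughly 26,500 TWh by 2028) is an assumption for illustration:

```python
# Consistency check on the 2028 projection: data center share of global demand.
# Assumption: global electricity demand of roughly 26,500 TWh by 2028.
GLOBAL_DEMAND_TWH = 26_500

def share_of_global(datacenter_twh: float, global_twh: float = GLOBAL_DEMAND_TWH) -> float:
    """Data center consumption as a percentage of global electricity demand."""
    return 100 * datacenter_twh / global_twh

print(f"{share_of_global(900):.1f}% to {share_of_global(1200):.1f}%")
```

The 900-1,200 TWh range maps to roughly 3.4-4.5% under this assumption, consistent with the 3.5-4.5% figure above.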

Impact Assessment

Who bears the environmental cost (2026-2028):

  • Communities near data centers face localized impacts: noise, heat island effects, water competition, and electricity price increases. These communities are disproportionately lower-income and rural, creating an environmental justice dimension.
  • Developing nations bear downstream climate costs from emissions generated to power AI systems that primarily serve wealthy-country users. The benefits of AI-for-climate tools accrue first to nations with the technical capacity to deploy them, widening the gap.
  • Water-stressed regions face the most immediate physical harm. Data center operators in drought-prone areas may have legal water rights but their consumption intensifies scarcity for agriculture and households.
  • Renewable energy supply chains are strained by AI demand, potentially slowing the broader clean energy transition. When a tech company signs a 20-year power purchase agreement for a new solar farm, that clean energy capacity is no longer available to decarbonize the rest of the grid.

Cross-Dimensional Effects

  • Geopolitics: Control of energy supply for AI becomes a strategic asset. Nations with abundant clean energy (Norway, Canada, Iceland) attract data center investment, while energy-poor nations fall further behind in AI capability. Semiconductor manufacturing concentration in Taiwan creates both environmental and geopolitical vulnerability.
  • Ethics & Regulation: The tension between AI innovation and environmental protection forces regulatory choices. Should governments cap data center energy use? Mandate renewable-only AI operations? These questions are entering policy debates in the EU and California.
  • Economic Models: The economic externality of AI's carbon footprint is largely unpriced. If carbon pricing tightened to levels consistent with Paris Agreement targets ($150-300/ton), the cost of AI inference would rise significantly, potentially altering business models.
  • Digital Divide: Energy-intensive AI infrastructure concentrates in wealthy nations with reliable grids, excluding developing nations from hosting competitive AI operations and widening the global AI capability gap.
  • Healthcare: AI-powered drug discovery and diagnostic tools have significant potential health benefits, but their energy footprint raises questions about whether healthcare AI deployment should be prioritized over other AI use cases in a carbon-constrained world.
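The scale of the unpriced externality noted under Economic Models can be sketched with rough numbers. All three inputs below are illustrative assumptions, not figures from this report: 3 Wh per LLM query, 1 billion queries per day, and a gas-heavy grid intensity of 0.4 kg CO2 per kWh:

```python
# Rough illustration of the unpriced carbon externality of AI inference.
# Assumptions (not from the report): 3 Wh per LLM query, 1 billion queries/day,
# grid intensity of 0.4 kg CO2 per kWh (roughly a gas-heavy grid).
WH_PER_QUERY = 3
QUERIES_PER_DAY = 1_000_000_000
KG_CO2_PER_KWH = 0.4

def annual_carbon_cost_usd(price_per_ton: float) -> float:
    """Annual carbon cost of an inference fleet at a given carbon price."""
    kwh_per_year = WH_PER_QUERY * QUERIES_PER_DAY * 365 / 1_000
    tons_co2 = kwh_per_year * KG_CO2_PER_KWH / 1_000  # kg -> metric tons
    return tons_co2 * price_per_ton

for price in (150, 300):
    print(f"${price}/ton -> ${annual_carbon_cost_usd(price) / 1e6:.0f}M per year")
```

Under these assumptions, Paris-consistent carbon pricing would add on the order of $65-130 million per year for a single billion-query-per-day service, a cost currently borne by no one on the balance sheet.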

Actionable Insights

For policymakers:

  • Require transparent, standardized reporting of AI energy consumption, water use, and carbon emissions from data center operators. Current voluntary disclosures are inconsistent and incomplete.
  • Link data center construction permits to clean energy commitments -- require operators to bring equivalent renewable capacity online within defined timelines rather than relying on distant offsets.
  • Invest in grid modernization to accommodate AI data center loads without displacing residential and industrial clean energy access.

For technology companies:

  • Prioritize inference efficiency research (model distillation, quantization, sparse architectures) as aggressively as capability research. A 50% reduction in inference energy per query would have greater environmental impact than any corporate renewable energy purchase.
  • Co-locate data centers with dedicated clean energy generation rather than drawing from existing grids and backfilling with renewable energy credits.
  • Publish detailed water consumption data and invest in closed-loop cooling systems that eliminate freshwater dependency.

For individuals and organizations:

  • Recognize that AI usage has an environmental cost. Choosing efficient models, reducing unnecessary API calls, and favoring providers with verifiable clean energy commitments are meaningful actions at scale.
  • Support policies that internalize AI's environmental costs rather than externalizing them to communities and the climate.

Sources & Evidence

  • IEA, "Electricity 2024" report -- global data center electricity consumption projections (460 TWh in 2022, 800-1,000 TWh projected)
  • Goldman Sachs, "AI Poised to Drive 160% Increase in Data Center Power Demand" (2024)
  • Google 2024 Environmental Report -- 48% emissions increase since 2019, 5.6 billion gallons US water consumption
  • Microsoft 2024 Sustainability Report -- 29% emissions increase since 2020
  • Meta 2024 Sustainability Report -- emissions trajectory data
  • Strubell, Ganesh & McCallum, "Energy and Policy Considerations for Deep Learning in NLP" (2019, updated analysis)
  • Li, Yang, Islam & Ren, "Making AI Less Thirsty" (2023) -- water consumption estimates for AI training
  • SemiAnalysis, "The Inference Cost of Search Disruption" -- inference vs. training energy analysis
  • BCG, "How AI Can Be a Powerful Tool in the Fight Against Climate Change" (2024)
  • World Resources Institute, "Artificial Intelligence and Climate Change" analysis
  • Google DeepMind, GraphCast weather prediction results (2023)
  • US Department of Energy, AI deployment actions for grid optimization
  • TSMC Environmental Reports -- semiconductor manufacturing water and energy data
  • EPRI, "Powering Intelligence: The Impact of AI on the Power Sector" (2024)
  • BloombergNEF New Energy Outlook -- energy transition forecasts