This article comes from our ‘AI Infrastructure as a Trojan Horse for Climate Infrastructure’ whitepaper, published October 2025.
TL;DR
Inference-first, modular, distributed AI infrastructure can be designed to accelerate decarbonisation by anchoring new clean power, providing grid flexibility, and turning waste heat into usable community energy.
Credibility comes from basin-level water stewardship and embodied-carbon cuts (low-carbon steel/concrete via standardisation), not vague efficiency claims or generic “WUE” headlines.
To scale responsibly, pair growth with durable carbon removal and enforceable community benefits, so impacts and value are transparently shared over time.
Why “AI infrastructure” should now be climate infrastructure
AI infrastructure is physical infrastructure: steel, concrete, power lines, water systems, grid constraints, and community permitting.
As AI grows, the default path is familiar: bigger, more centralised, more resource-intensive. But the shift toward inference-first workloads opens another path: smaller, modular, distributed systems that can be designed to accelerate the climate transition rather than collide with it.
This article condenses the seven principles outlined in Opna’s whitepaper (pages 20–46) into a blueprint for building AI infrastructure that behaves like climate infrastructure: anchoring clean power, stabilising grids, reusing heat, stewarding water, decarbonising materials, integrating carbon removal, and creating durable local benefits.
1) Anchor clean power (and make new renewables financeable)
If AI load growth lands on fossil-heavy grids, emissions rise with it. The first principle is to treat modular inference sites as anchor customers for clean generation: long-term offtake that helps projects pencil, permitting move, and capital commit.
This is already visible in the way corporate clean power procurement has shaped buildouts, with data centres acting as reliable demand that can underwrite new supply (see the data-centre sustainability and procurement framing from Deloitte’s analysis of GenAI power demand and data centre sustainability). A concrete signal of this direction is Google, Intersect Power, and TPG’s investment approach to clean energy for data centres—explicitly linking new compute capacity to new clean generation.
Google, Intersect Power and TPG Rise Climate | Source link
2) Become a flexible grid asset, not a rigid load
Renewables-heavy grids need flexibility: the ability to shift demand in time and respond to local conditions. The second principle reframes data centres—especially modular inference facilities—as grid-interactive resources.
Instead of acting like an always-on, non-negotiable load, facilities should be designed to participate in demand response, load shaping, and grid services—supported by infrastructure like UPS and batteries, and by software that schedules non-urgent compute when the grid is cleanest. The practical playbook is captured in Eaton & Microsoft’s grid-interactive data center whitepaper, which lays out how data centres can support both decarbonisation and stability.
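The scheduling idea above can be sketched in a few lines: deferrable jobs are shifted to the lowest-carbon hour within their deadline window, using a grid carbon-intensity forecast. This is an illustrative sketch, not any operator's actual scheduler; the `Job` structure, function names, and forecast values are all hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Job:
    name: str
    deadline_hour: int  # latest hour (0-23) the job may start

def schedule_deferrable(jobs, carbon_intensity):
    """Assign each deferrable job to the lowest-carbon hour before its deadline.

    carbon_intensity: 24 grid carbon-intensity values (gCO2/kWh), one per
    hour, e.g. from a day-ahead forecast. Urgent inference traffic would
    bypass this entirely; only shiftable batch work is scheduled this way.
    """
    schedule = {}
    for job in jobs:
        window = range(0, job.deadline_hour + 1)
        schedule[job.name] = min(window, key=lambda h: carbon_intensity[h])
    return schedule

# Hypothetical forecast: cleanest power at midday (solar peak, hours 10-13).
forecast = [300] * 10 + [120, 100, 90, 110] + [300] * 10
jobs = [Job("batch-embeddings", deadline_hour=18),
        Job("nightly-retrain", deadline_hour=8)]
print(schedule_deferrable(jobs, forecast))
```

In this toy forecast the embeddings job lands at hour 12 (the cleanest hour before its deadline), while the retrain job, whose deadline falls before the solar peak, simply runs at its earliest slot. Real schemes layer price signals, battery state, and grid-service commitments on top of this core idea.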
Inference-First Data Centres as Climate Infrastructure
If training demands centralisation, inference opens the door to distribution. The shift from training-first to inference-first architectures isn’t just a technical shift — it’s a generational opportunity. Inference unlocks a new model: smaller, distributed compute that can be designed as climate-aligned, community-aligned infrastructure.
With intentional design, such as embedding Opna's core pillars of climate-aligned infrastructure, these centres can underwrite new clean energy projects, such as the multi-billion-dollar clean energy partnerships for data centres; drive demand for low-carbon materials, evidenced by growing investment in low-carbon cement production; scale carbon removal and water replenishment through emerging solutions like mineralisation-based CO₂ storage and large-scale water stewardship projects; and embed circular heat reuse in industries and communities, demonstrated by initiatives that use data centre heat to warm greenhouses.
This is the real promise of the inference era: infrastructure that is not only technologically efficient but socially and ecologically productive.
UPS provides fast response to frequency deviation. A static 25 kW response is activated by an external signal once frequency drops below the threshold value (49.70 Hz). Power grid frequency (purple) and building load (orange) plotted during one event | Source link
3) Turn waste heat into local value
Almost all power going into a data centre comes out as heat. Today, most of that heat is thrown away.
The third principle is to build for heat reuse—especially where modular, closer-to-demand inference nodes can connect to real “heat sinks” (district heating loops, buildings, pools, industrial processes). A growing set of deployments illustrates what this can look like: Leafcloud’s building-integrated compute that returns usable hot water, Deep Green’s digital boiler approach for swimming pools, and broader industrial symbiosis concepts discussed across the bibliography.
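Because nearly all electrical input to IT equipment is dissipated as heat, a back-of-envelope estimate of the recoverable heat is straightforward. The capture fraction and site size below are assumed values for illustration; real recovery depends on cooling design and delivery temperatures.

```python
def recoverable_heat_mwh(it_load_kw: float, hours: float,
                         capture_fraction: float = 0.8) -> float:
    """Rough recoverable heat (MWh) from a compute node over a period.

    Nearly all electrical input to IT equipment leaves as heat;
    capture_fraction is the share a heat-reuse loop can actually
    deliver (an assumed figure, not a measured one).
    """
    return it_load_kw * hours * capture_fraction / 1000.0

# A hypothetical 500 kW modular inference node running year-round:
heat = recoverable_heat_mwh(500, hours=8760)
print(f"{heat:.0f} MWh of heat per year")  # roughly 3,500 MWh
```

At that scale a single node could supply a meaningful slice of a small district-heating loop's annual demand, which is why siting near a real heat sink matters more than raw efficiency.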
How heat energy reuse works, Microsoft, 2022 | Source link
4) Practice water stewardship with basin-level accountability
Water impact is local, political, and increasingly binding. The fourth principle is to move past generic “WUE good/bad” narratives and commit to basin-aware water stewardship: reduce freshwater withdrawals, increase reuse, disclose clearly, and replenish locally.
The strongest framing here is WRI’s Volumetric Water Benefit Accounting (VWBA) method, which enables credible, comparable water benefit claims. The whitepaper also points to operator-level transparency and basin risk work such as Google’s Water Stewardship project portfolio.
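The netting logic behind basin-level claims can be sketched as a simple ledger: freshwater withdrawals minus on-site reuse gives net consumptive use, which local replenishment projects then offset. This is an illustrative simplification, not the VWBA method itself, and all volumes below are hypothetical.

```python
def basin_water_balance(withdrawals_m3: float, reuse_m3: float,
                        replenishment_m3: float) -> dict:
    """Simplified basin-level water ledger (illustrative; not VWBA itself).

    Net consumptive use = freshwater withdrawn minus water reused on site;
    the balance shows how far local replenishment offsets it. A credible
    claim requires the replenishment to occur in the *same* basin.
    """
    net_consumption = withdrawals_m3 - reuse_m3
    return {
        "net_consumption_m3": net_consumption,
        "replenishment_m3": replenishment_m3,
        "balance_m3": replenishment_m3 - net_consumption,
    }

# Hypothetical site: 120,000 m3 withdrawn, 30,000 m3 reused on site,
# 100,000 m3 replenished through in-basin projects.
print(basin_water_balance(120_000, 30_000, 100_000))
```

A positive balance supports a “replenish more than we consume” claim for that basin; a negative one makes clear how much the gap is, which is the kind of transparency the principle calls for.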
An example of data centre’s operational water usage: on-site scope-1 water usage for data centre cooling (via cooling towers in the example), and off-site scope-2 water usage for electricity generation. The icons for AI models are only for illustration purposes. | Source link
An example of a mid-rise concrete and steel construction showing the embodied carbon reduction by material category, RMI 2021 | Source link
7) Anchor community benefits with enforceable agreements
Infrastructure projects succeed or fail on legitimacy. The seventh principle is to treat AI facilities as civic infrastructure—with visible local value, transparent reporting, and enforceable commitments.
This includes local and gender-balanced hiring pathways, procurement participation, shared resilience benefits, and real governance mechanisms. The strongest tool here is the Community Benefit Agreement (CBA) model, discussed directly in Good Jobs First’s guidance on CBAs for data centers.
What portion of your organization’s data center design, build, or operations staff is women? (n=694). Data centre teams employ around 10% women or fewer, and 20% of organizations employ no women at all. | Source link
AI infrastructure as climate infrastructure
AI infrastructure doesn’t have to be a climate liability. In an inference-first world—where compute can be modular, distributed, and responsive—we can design data centres to function as climate infrastructure: financing new clean power, acting as flexible grid assets, and turning waste heat into local energy.
But credibility depends on what we measure and what we commit to. That means basin-level water stewardship (not generic efficiency claims), embodied-carbon reductions through low-carbon materials and standardised procurement (not bespoke one-offs), and durable carbon removal integrated with transparent accounting (not paper offsets).
Finally, none of this scales without legitimacy. The projects that endure will be the ones that deliver enforceable community benefits—clear agreements, clear reporting, and clear value shared locally. The seven principles are a practical blueprint for building AI infrastructure that earns its place in the climate transition: not adjacent to it, and not extractive from it, but structurally aligned with it.