Opna Data Centres AI Inference

AI Data Centres: The 7 principles for building AI infrastructure as climate infrastructure

10th December 2025

This article comes from our ‘AI Infrastructure as a Trojan Horse for Climate Infrastructure’ whitepaper, published October 2025. 

TL;DR

  • Inference-first, modular, distributed AI infrastructure can be designed to accelerate decarbonisation by anchoring new clean power, providing grid flexibility, and turning waste heat into usable community energy.

  • Credibility comes from basin-level water stewardship and embodied-carbon cuts (low-carbon steel/concrete via standardisation), not vague efficiency claims or generic “WUE” headlines.

  • To scale responsibly, pair growth with durable carbon removal and enforceable community benefits, so impacts and value are transparently shared over time.

Why “AI infrastructure” should now be climate infrastructure


AI infrastructure is physical infrastructure: steel, concrete, power lines, water systems, grid constraints, and community permitting.


As AI grows, the default path is familiar: bigger, more centralised, more resource-intensive. But the shift toward inference-first workloads opens another path: smaller, modular, distributed systems that can be designed to accelerate the climate transition rather than collide with it.


This article condenses the seven principles outlined in Opna’s whitepaper (pages 20–46), a blueprint for building AI infrastructure that behaves like climate infrastructure: anchoring clean power, stabilising grids, reusing heat, stewarding water, decarbonising materials, integrating carbon removal, and creating durable local benefits.

1) Anchor clean power (and make new renewables financeable)


If AI load growth lands on fossil-heavy grids, emissions rise with it. The first principle is to treat modular inference sites as anchor customers for clean generation: long-term offtake that helps projects pencil, permitting move, and capital commit.


This is already visible in the way corporate clean power procurement has shaped buildouts, with data centres acting as reliable demand that can underwrite new supply (see the data-centre sustainability and procurement framing from Deloitte’s analysis of GenAI power demand and data centre sustainability). A concrete signal of this direction is Google, Intersect Power, and TPG’s investment approach to clean energy for data centres—explicitly linking new compute capacity to new clean generation.

Google, Intersect Power and TPG Rise Climate | Source link

2) Become a flexible grid asset, not a rigid load


Renewables-heavy grids need flexibility: the ability to shift demand in time and respond to local conditions. The second principle reframes data centres—especially modular inference facilities—as grid-interactive resources.


Instead of acting like an always-on, non-negotiable load, facilities should be designed to participate in demand response, load shaping, and grid services—supported by infrastructure like UPS and batteries, and by software that schedules non-urgent compute when the grid is cleanest. The practical playbook is captured in Eaton & Microsoft’s grid-interactive data center whitepaper, which lays out how data centres can support both decarbonisation and stability.
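The scheduling idea above can be sketched in a few lines: given a carbon-intensity forecast for the local grid, a scheduler shifts a deferrable batch job into the cleanest contiguous window. This is a minimal illustration with made-up numbers; a real grid-interactive facility would pull forecasts from a grid-signal API and weigh latency and contractual constraints alongside carbon.

```python
# Sketch: carbon-aware scheduling of deferrable compute.
# The forecast values are hypothetical (gCO2/kWh), not real grid data.

def pick_greenest_window(intensity_forecast, job_hours):
    """Return (start_hour, avg_intensity) of the contiguous window with
    the lowest average carbon intensity for a job of `job_hours` length."""
    best_start, best_avg = 0, float("inf")
    for start in range(len(intensity_forecast) - job_hours + 1):
        window = intensity_forecast[start:start + job_hours]
        avg = sum(window) / job_hours
        if avg < best_avg:
            best_start, best_avg = start, avg
    return best_start, best_avg

# Illustrative 24-hour forecast: cleanest around midday when solar peaks.
forecast = [420, 410, 400, 390, 380, 370, 300, 220,
            150, 110,  90,  80,  85, 100, 160, 240,
            320, 380, 430, 450, 460, 450, 440, 430]
start, avg = pick_greenest_window(forecast, job_hours=4)
print(f"Run 4h batch job starting at hour {start} (avg {avg:.0f} gCO2/kWh)")
```

The same greedy search generalises to weighted objectives (carbon plus price plus deadline penalties); the point is that deferrable inference and batch workloads give the operator a knob that an always-on load does not have.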

Inference-First Data Centres as Climate Infrastructure


If training demands centralisation, inference opens the door to distribution. The shift from training-first to inference-first architectures isn’t just a technical shift — it’s a generational opportunity. Inference unlocks a new model: smaller, distributed compute that can be designed as climate-aligned, community-aligned infrastructure.


With intentional design, such as embedding Opna's core pillars of climate-aligned infrastructure, these centres can underwrite new clean energy projects, such as the multi-billion-dollar clean energy partnerships for data centres; drive demand for low-carbon materials, evidenced by growing investment in low-carbon cement production; scale carbon removal and water replenishment through emerging solutions like mineralisation-based CO₂ storage and large-scale water stewardship projects; and embed circular heat reuse in industries and communities, demonstrated by initiatives that use data centre heat to warm greenhouses.


This is the real promise of the inference era: infrastructure that is not only technologically efficient but socially and ecologically productive.

UPS provides fast response to frequency deviation. Response (static 25 kW) activated through external signal once frequency drops below the threshold value (49.70 Hz). Power grid frequency (purple) and building load (orange) plotted during one event | Source link

3) Turn waste heat into local value


Almost all power going into a data centre comes out as heat. Today, most of that heat is thrown away.


The third principle is to build for heat reuse—especially where modular, closer-to-demand inference nodes can connect to real “heat sinks” (district heating loops, buildings, pools, industrial processes). A growing set of deployments illustrate what this can look like: Leafcloud’s building-integrated compute that returns usable hot water, Deep Green’s digital boiler approach for swimming pools, and broader industrial symbiosis concepts discussed across the bibliography.
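For a rough sense of scale, here is a back-of-envelope sketch. Both the capture fraction and the per-home heating demand are assumed placeholder values, not figures from the whitepaper or the deployments cited above.

```python
# Back-of-envelope: recoverable heat from a modular inference node and
# roughly how many homes a district-heating connection could serve.
# All numbers are illustrative assumptions.

def recoverable_heat_kw(it_load_kw, capture_fraction=0.8):
    # Nearly all electrical input to IT equipment leaves as heat;
    # liquid-cooled designs can capture a large share at useful temperature.
    return it_load_kw * capture_fraction

def homes_served(heat_kw, avg_home_demand_kw=1.5):
    # ~1.5 kW average space-heating demand per home is an assumed placeholder.
    return int(heat_kw / avg_home_demand_kw)

node_kw = 500  # a hypothetical 500 kW modular inference node
heat = recoverable_heat_kw(node_kw)
print(f"{heat:.0f} kW recoverable heat could serve ~{homes_served(heat)} homes")
```

Even with conservative assumptions, the arithmetic shows why siting modular nodes next to a real heat sink matters more than raw efficiency gains at the chip level.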


How heat energy reuse works, Microsoft, 2022 | Source link

4) Practice water stewardship with basin-level accountability


Water impact is local, political, and increasingly binding. The fourth principle is to move past generic “WUE good/bad” narratives and commit to basin-aware water stewardship: reduce freshwater withdrawals, increase reuse, disclose clearly, and replenish locally.


The strongest framing here is WRI’s Volumetric Water Benefit Accounting (VWBA) method, which enables credible, comparable water benefit claims. The whitepaper also points to operator-level transparency and basin risk work such as Google’s Water Stewardship project portfolio.
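A toy version of that accounting logic, heavily simplified relative to the actual VWBA method and using invented volumes, looks like this:

```python
# Sketch of a VWBA-style ledger (simplified; the WRI method defines
# per-activity accounting rules this toy version glosses over).
# All figures are illustrative, not from any real site.

site_withdrawal_m3 = 120_000   # annual freshwater withdrawal
site_consumption_m3 = 80_000   # evaporated / not returned to the basin

replenishment_projects_m3 = {
    "wetland restoration": 30_000,
    "leak-repair programme": 25_000,
    "rainwater capture": 20_000,
}

volumetric_benefit = sum(replenishment_projects_m3.values())
replenishment_rate = volumetric_benefit / site_consumption_m3
print(f"Volumetric water benefit: {volumetric_benefit} m3/yr "
      f"({replenishment_rate:.0%} of consumptive use)")
```

The key discipline the method enforces is basin-level matching: replenishment volumes only count against consumption in the same basin, which is exactly what generic "WUE" headlines obscure.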


An example of data centre’s operational water usage: on-site scope-1 water usage for data centre cooling (via cooling towers in the example), and off-site scope-2 water usage for electricity generation. The icons for AI models are only for illustration purposes. | Source link

5) Build with low-carbon materials (and standardise so it scales)


Even if the grid decarbonises, the embodied carbon in the buildout—steel, cement, site works—can dominate near-term climate impact. The fifth principle is to treat embodied carbon as a first-class constraint, and use modularity to make low-carbon choices replicable.


The challenge (and opportunity) is outlined well in Uptime Institute’s work on the long journey of concrete and steel decarbonisation and the wider building-sector push for embodied carbon reduction from WorldGBC. For data centre-specific supply chain strategies, the bibliography includes ERM’s circular construction approach for data centres.
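To see why embodied carbon is a first-class constraint, a back-of-envelope comparison helps. The material quantities and emission factors below are rough illustrative ballparks, not figures from the whitepaper or the cited reports.

```python
# Back-of-envelope embodied-carbon comparison for a single modular build.
# Quantities and emission factors are illustrative assumptions.

materials = {  # tonnes of material per module (hypothetical)
    "steel": 400,
    "concrete": 2_000,
}
ef_conventional = {"steel": 1.9, "concrete": 0.13}  # tCO2e per tonne
ef_low_carbon   = {"steel": 0.7, "concrete": 0.08}  # e.g. scrap-EAF steel, SCM-rich mixes

def embodied(quantities, factors):
    return sum(quantities[m] * factors[m] for m in quantities)

base = embodied(materials, ef_conventional)
low  = embodied(materials, ef_low_carbon)
print(f"Conventional: {base:.0f} tCO2e, low-carbon: {low:.0f} tCO2e "
      f"({1 - low/base:.0%} reduction per module)")
```

Standardisation is what makes this compound: a low-carbon specification proven once on a repeatable module design is reused on every subsequent build, instead of being re-negotiated per bespoke project.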


An example of a mid-rise concrete and steel construction showing the embodied carbon reduction by material category, RMI 2021 | Source link

6) Integrate durable carbon removal (without using it as a crutch)


Efficiency and clean power are necessary—but not always sufficient, especially given rebound risk (the Jevons paradox problem the bibliography flags). The sixth principle is to integrate durable carbon dioxide removal (CDR) as infrastructure—co-sited, contracted, and accountable—rather than as distant offsets stapled onto an emissions inventory.
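The "not a crutch" rule can be expressed as a simple ledger: durable removal is netted only against the residual left after reduction measures, never in place of them. Figures are invented for illustration.

```python
# Toy net-emissions ledger: durable CDR covers only the residual after
# efficiency and clean-power measures. All figures are illustrative.

gross_emissions_t = 10_000      # annual tCO2e before any measures
reductions_t = 7_500            # efficiency + clean power offtake
contracted_removal_t = 2_000    # durable CDR under long-term contract

residual_t = gross_emissions_t - reductions_t
# Removal cannot "overwrite" reductions; it only nets against residual.
net_t = residual_t - min(contracted_removal_t, residual_t)
print(f"Residual after reductions: {residual_t} tCO2e; net: {net_t} tCO2e")
```

Ordering the ledger this way keeps the incentive structure honest: buying more removal never substitutes for cutting gross emissions first.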


The whitepaper points to how the data centre buildout could catalyse removal markets, including analysis like Latitude Media on whether the data center boom boosts carbon removal, and to specific removal/material pathways such as Paebbl’s mineralisation approach to storing CO₂ in materials. It also includes siting-and-supply examples relevant to firm clean power, like Fervo’s enhanced geothermal + data center corridor framing.


7) Anchor community benefits with enforceable agreements


Infrastructure projects succeed or fail on legitimacy. The seventh principle is to treat AI facilities as civic infrastructure—with visible local value, transparent reporting, and enforceable commitments.


This includes local and gender-balanced hiring pathways, procurement participation, shared resilience benefits, and real governance mechanisms. The strongest tool here is the Community Benefit Agreement (CBA) model, discussed directly in Good Jobs First’s guidance on CBAs for data centers.


What portion of your organization’s data center design, build, or operations staff is women? (n=694). Data centre teams employ around 10% women or fewer, and 20% of organizations employ no women at all. | Source link

AI infrastructure as climate infrastructure


AI infrastructure doesn’t have to be a climate liability. In an inference-first world—where compute can be modular, distributed, and responsive—we can design data centres to function as climate infrastructure: financing new clean power, acting as flexible grid assets, and turning waste heat into local energy.


But credibility depends on what we measure and what we commit to. That means basin-level water stewardship (not generic efficiency claims), embodied-carbon reductions through low-carbon materials and standardised procurement (not bespoke one-offs), and durable carbon removal integrated with transparent accounting (not paper offsets).


Finally, none of this scales without legitimacy. The projects that endure will be the ones that deliver enforceable community benefits—clear agreements, clear reporting, and clear value shared locally. The seven principles are a practical blueprint for building AI infrastructure that earns its place in the climate transition: not adjacent to it, and not extractive from it, but structurally aligned with it.


Read more about creating AI infrastructure as climate infrastructure in our latest whitepaper, or contact us directly at hi@opna.earth.

Opna white paper AI infrastructure as climate infrastructure

Keep reading…

AI Infrastructure: Hyperscale Growth - And Its Limits

© 2025 Salt Global UK Limited. All rights reserved.
