AI’s complicated relationship with energy
Bad actor, good actor: the race to hyperscaling and the promise of AI for energy transition
Paolo Missier
Director, the Institute for Data and AI
p.missier@bham.ac.uk
Outline and key messages
The Institute for Data and AI
• supporting data-intensive interdisciplinary research, inclusively and proactively
Bad actor: AI driving energy demand
• Technical and business drivers call for indefinite scaling
• Can AI run on renewable energy?
Good actor: AI driving optimisations in the renewable energy space
• The many ways that AI models are being deployed to support the uptake of
renewable energy
The IDAI is here to:
Connect and engage both internally and externally
Instigate new interdisciplinary research collaborations
- Grounded in Data, Data Science, and AI
Encourage out-of-the-box thinking
Ensure Data, Data Science and AI are co-developed and deployed
competently
Help researchers understand and respond to the challenges
around Data and AI from the private, public, and third sector
Challenge-driven interdisciplinary research
[Diagram] The Institute for Data and AI sits between research challenges, funders' priorities, and research themes, within the academic research / education ecosystem.
- Structures: Emerging Research, Thematic University Centres, Thematic Collaborative Networks: clear identity, focus, plan; leadership; focus on grant readiness
- People: IDAI Affiliates, IDAI Fellows (theme leads), extended networks, research postdocs, data scientists, research support team
IDAI Affiliates Campaign 2025
Total number of affiliates: 125 (slowly increasing)
6 are from the School of Engineering
Top-level Research Themes

Health, Environment, and Resilience
- Us and our environment
- Improving healthcare for all
- Environmental sustainability
- Ecological management
- Personalised medicine
- Public health monitoring
- Environmental biosciences

Creative and Cultural Digital Futures
- Support and enhance human expression
- AI for cultural practices
- Creative industries
- Transform humanities research
- Critical cultural analysis
- Artistic innovation
- Creativity enriches technological progress

Digital Governance, Economy, and Social Impact
- Reshaping public institutions
- Reshaping market systems
- AI impact on society
- Modernise state services
- Enhance financial systems
- Strengthen national security

Foundational Data Science and AI Methodologies
- Advancing core AI techniques
- Data-centric AI
- Data Engineering for AI
- Develop robust, transparent AI
- AI Ethics
- Safe and Secure AI

Advanced Engineering, Materials Science, and Smart Manufacturing
- AI for engineering
- Materials research
- Optimising manufacturing
- Innovative design solutions
- Next generation materials
Energy research: intrinsically multidisciplinary
Energy topics cut across all five research themes above (Health, Environment, and Resilience; Creative and Cultural Digital Futures; Digital Governance, Economy, and Social Impact; Foundational Data Science and AI Methodologies; Advanced Engineering, Materials Science, and Smart Manufacturing):
- energy transition
- sustainable urban energy systems
- energy materials
- storage technologies
- smart grids
- energy efficiency
- energy policy
- fair transitions
- energy poverty
- demand / supply forecasting
- grid optimisation / stability
- smart demand response
- hybrid energy system (HES) optimization
Key messages
• The Institute for Data and AI
• supporting data-intensive interdisciplinary research, inclusively and proactively
• AI driving energy demand
• Technical and business drivers for indefinite scaling
• Can AI run on renewable energy?
• AI driving optimisations in the renewable energy space
• The many ways that AI models are being deployed to support the uptake of
renewable energy
How much energy does AI consume? / Training
[KCH+20] Kaplan, J., McCandlish, S., Henighan, T., Brown, T. B., Chess, B., Child, R., ... & Amodei, D. (2020).
Scaling laws for neural language models. arXiv preprint arXiv:2001.08361.
- Consistent trends spanning more than six orders of magnitude
- One PF-day = 10^15 × 24 × 3600 = 8.64 × 10^19 floating point operations (illustrated below)
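To make the PF-day unit concrete, here is a minimal sketch (my own illustration, using the approximation from [KCH+20] that training compute is roughly C ≈ 6·N·D, i.e. about 6 FLOPs per parameter per training token; the model and token counts below are hypothetical examples, not figures from the talk):

```python
# Rough training-compute estimate in PF-days, using C ~= 6 * N * D [KCH+20],
# where N = number of parameters and D = number of training tokens.
PF_DAY_FLOPS = 1e15 * 24 * 3600  # one PF-day = 8.64e19 floating point operations

def pf_days(num_params: float, num_tokens: float) -> float:
    """Approximate training compute (in PF-days) for a dense Transformer."""
    total_flops = 6 * num_params * num_tokens  # ~6 FLOPs per parameter per token
    return total_flops / PF_DAY_FLOPS

# Hypothetical example: a 1B-parameter model trained on 300B tokens
print(f"{pf_days(1e9, 300e9):.0f} PF-days")  # ~21 PF-days
```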
Large models vs data size and compute time
[KCH+20]
The case for large models
- S_min is the minimum number of training steps necessary to reach a given loss L
- Larger models converge better and faster
- Larger models make better use of a given budget of training samples
The incentives are all for scaling up
- AI as a Service → service-level competition → performance KPIs
- AI as a Service → subscription business model → scale up on users/requests (number and complexity of inference)
- Increase dataset size + increase model complexity + increase compute
“Taken together, these results show that language modeling performance improves smoothly and predictably as we appropriately scale up model size, data, and compute. We expect that larger language models will perform better and be more sample efficient than current models.” [KCH+20]
How much energy does AI consume? / Inference
Why focus on energy consumption during inference?
• Energy used to serve requests vastly exceeds the energy required for training
• ChatGPT: 13 million daily visitors during Jan. 2023 → ~400 million queries/month → at 3.96 Wh/request → ~1,500 MWh/month (see the sketch below)
• The energy consumed to serve requests for one month (2023) is higher than the energy required to train GPT-3 [AP24]
• Serving requests has a constantly high energy demand profile
• In contrast, training can be deferred to times when energy is generated from a
higher proportion of low-carbon or renewable sources
[AP24] M. F. Argerich and M. Patiño-Martínez, "Measuring and Improving the Energy Efficiency of Large Language Models
Inference," in IEEE Access, vol. 12, pp. 80194-80207, 2024, doi: 10.1109/ACCESS.2024.3409745.
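As a back-of-envelope check of the figures above, a minimal sketch using the slide's own numbers (≈400 million requests per month at 3.96 Wh per request, as reported via [AP24]):

```python
# Back-of-envelope monthly energy for serving ChatGPT requests, from the figures above.
requests_per_month = 400e6          # ~13M daily visitors in Jan. 2023
energy_per_request_wh = 3.96        # Wh per request [AP24]

monthly_energy_mwh = requests_per_month * energy_per_request_wh / 1e6  # Wh -> MWh
print(f"{monthly_energy_mwh:.0f} MWh/month")  # ~1584 MWh, i.e. roughly 1500 MWh per month
```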
Energy consumption / token
Source: [AP24]
And including embodied carbon emissions… [NZD+24]
• Operational carbon emissions (CO2eq)
• Serving one prompt in ChatGPT generates more than 4 grams of CO2eq [Won23] –
over 20 times the carbon emission of a web search query
• Embodied carbon emissions owing to the manufacturing process of the
hardware
• The race to newer GPUs has a hidden cost!
• Older GPUs may be more efficient for certain workloads (kinds of tasks)
• Also, sticking to older GPUs for longer helps amortize embodied carbon emissions
• There is often a trade-off between throughput and energy efficiency
[NZD+24] Sophia Nguyen, Beihao Zhou, Yi Ding, and Sihang Liu. 2025. Towards Sustainable Large Language Model Serving. SIGENERGY Energy Inform. Rev. 4, 5 (December 2024), 134–140. https://doi.org/10.1145/3727200.3727220
[Won23] Vinnie Wong. Gen AI’s environmental ledger: A closer look at the carbon footprint of ChatGPT. https://piktochart.com/blog/carbon-footprint-of-chatgpt/, 2023.
Carbon cost of AI inference
𝐶prompt,op = 𝐸prompt · 𝐶𝐼
Québec (QC), California Independent System Operator (CISO), PacifiCorp East (PACE).
A higher fraction of renewable energy sources leads to a lower CI.
Energy to respond to prompt
Carbon Intensity of the grid (region-specific)
Embodied Carbon
Operational Carbon
𝐶prompt,em = 𝑡prompt / LT · 𝐶em
LT: GPU lifetime (5 years)
𝑡prompt: time required to answer 1 prompt
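A minimal sketch of how the two formulas combine into a per-prompt carbon cost. The numeric values are placeholders chosen for illustration, not measurements from [NZD+24]:

```python
# Per-prompt carbon cost = operational + embodied, following the formulas above.
SECONDS_PER_YEAR = 365 * 24 * 3600

def carbon_per_prompt(e_prompt_kwh: float,   # E_prompt: energy to answer one prompt (kWh)
                      ci_g_per_kwh: float,   # CI: grid carbon intensity (gCO2eq/kWh), region-specific
                      t_prompt_s: float,     # t_prompt: time to answer one prompt (seconds)
                      c_em_g: float,         # C_em: embodied carbon of the GPU (gCO2eq)
                      lt_years: float = 5.0):  # LT: GPU lifetime (5 years)
    c_op = e_prompt_kwh * ci_g_per_kwh                             # C_prompt,op = E_prompt * CI
    c_em = (t_prompt_s / (lt_years * SECONDS_PER_YEAR)) * c_em_g   # C_prompt,em = t_prompt / LT * C_em
    return c_op, c_em

# Placeholder numbers, for illustration only: a low-CI grid vs. a high-CI grid.
for label, ci in [("low-CI grid", 30.0), ("high-CI grid", 400.0)]:
    op, em = carbon_per_prompt(e_prompt_kwh=0.004, ci_g_per_kwh=ci,
                               t_prompt_s=2.0, c_em_g=150_000.0)
    print(f"{label}: operational {op:.3f} g, embodied {em:.4f} g per prompt")
```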
Carbon emissions per region and GPU type
Per-prompt carbon emission under the QC, CISO, and PACE grids (1B-parameter LLaMA).
Google pushing back… (August 2025)
We estimate the median Gemini Apps text prompt uses 0.24
watt-hours (Wh) of energy, emits 0.03 grams of carbon dioxide
equivalent (gCO2e), and consumes 0.26 milliliters (or about five
drops) of water —
figures that are substantially lower than many public estimates.
The per-prompt energy impact is equivalent to watching TV for
less than nine seconds.
Over a recent 12 month period, the energy and total
carbon footprint of the median Gemini Apps text
prompt dropped by 33x and 44x, respectively, all while
delivering higher quality responses.
https://cloud.google.com/blog/products/infrastructure/measuring-the-environmental-impact-of-ai-inference/
Measuring the environmental impact of delivering AI at Google Scale, Cooper Elsworth, Keguo Huang, David Patterson,
Ian Schneider et al, August 2025 https://arxiv.org/abs/2508.15734
Energy-efficient neural network training
• Neural Architecture Search (NAS)
• Automatically explore a space of alternative architectures, optimising for efficiency
• Network Pruning
• Insight: neurons have small saliency when their removal minimally affects the model output/loss. Removing them results in a sparse computational graph
• Model distillation / teacher-student models / transfer learning
• A teacher model pre-trained on large data and a generic task facilitates learning on the target task
• Quantization
• Reduce the precision of the numerical representations used to store model weights and activations. Achieves compression, faster inference, and thus lower energy (see the sketch below)
Gholami, A., Kim, S., Dong, Z., Yao, Z., Mahoney, M. W., & Keutzer, K. (2022). A survey of quantization methods
for efficient neural network inference. In Low-power computer vision (pp. 291-326). Chapman and Hall/CRC.
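A minimal sketch of the quantization idea referenced above: my own illustration of symmetric 8-bit post-training weight quantization in NumPy. Real toolchains, such as those surveyed by Gholami et al., are considerably more sophisticated:

```python
import numpy as np

def quantize_int8(w: np.ndarray):
    """Symmetric per-tensor quantization of float32 weights to int8."""
    scale = np.abs(w).max() / 127.0                      # map the largest magnitude to 127
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    return q.astype(np.float32) * scale

# Illustration: 4x memory reduction (float32 -> int8) at a small reconstruction error.
w = np.random.randn(1024, 1024).astype(np.float32)
q, scale = quantize_int8(w)
err = np.abs(w - dequantize(q, scale)).mean()
print(f"memory: {w.nbytes / 2**20:.1f} MiB -> {q.nbytes / 2**20:.1f} MiB, mean abs error {err:.4f}")
```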
Quantization and energy / latency trade-off
Source: [AP24]
The race is on…
Nvidia said …Big Tech companies were adding tens of
thousands of its latest GPUs to their systems every week
NVIDIA’s “Rubin Ultra” systems will cram more than 500 GPUs
into a single rack that will consume 600 kilowatts of power
OpenAI envisages facilities that will go “way beyond” 10GW of
power, requiring “new technologies and new construction”.
Sasha Luccioni, AI and climate lead at open-source start-up
Hugging Face, said alternative techniques to train AI models,
such as distillation or the use of smaller models, were gaining
popularity and could allow developers to build powerful
models at a fraction of the cost.
Data Centres - hyperscalers
Small: 5,000–20,000 sq ft / 500-2,000 servers / 1–5MW.
Average: 20,000-100,000 sq ft / 2,000-5,000 servers / 100MW.
Hyperscale: > 5,000 servers / millions of sq ft / > 100MW
Top tier:
Amazon Web Services (AWS)
Microsoft Azure
Google Cloud Platform (GCP)
Next tier:
Alibaba Cloud
Apple (8 DCs in the US, Europe and China)
IBM Cloud
Meta: 21 DCs, total space > 50M sq ft
Oracle Cloud Infrastructure (OCI)
Baidu
https://www.ibm.com/think/topics/hyperscale-data-center (3/2024)
The race for AI capacity
Microsoft, Alphabet, Amazon and Meta: capital expenditure plans > $300bn in 2025
Estimated total expenditure in 2025: $475bn (Gartner), up 42% on 2024
https://www.mckinsey.com/industries/technology-media-and-telecommunications/our-insights/ai-power-expanding-data-center-capacity-to-meet-growing-demand
AI transforms data centre design
https://ig.ft.com/ai-data-centres/ (7/2025)
Reshaping the data centre footprints
Power capacity of the new data centres
Clustering → latency reduction
Crusoe is building eight data centre buildings, drawing a total of
1.2GW of capacity, in Abilene, Texas, for OpenAI.
Abilene is part of OpenAI’s proposed $100bn Stargate project
Meta is building a 2GW facility in Richland, Louisiana
“Twenty years ago, a big data centre would have been 20
megawatt”, said Crusoe’s Lochmiller, referring to the amount of
electricity required. “Today, a big data centre is a gigawatt or
more . . . We believe we can build multi-gigawatt campuses.”
xAI is targeting 1.2GW across several sites in Memphis, Tennessee.
Amazon is developing a 2.2GW site for Anthropic in New Carlisle, Indiana.
Breakdown by region
(*) excludes China
Energy usage forecasts
(*) 945 TWh is close to the electricity consumption of Japan
All hyperscalers have struck landmark deals for
nuclear power in the past year.
Microsoft is working with Constellation Energy
to restart the Three Mile Island nuclear plant in
Pennsylvania.
Plus
- Water consumption…
- Cooling
- …
(see full article)
What sources of energy are being used?
https://www.ft.com/content/0f6111a8-0249-4a28-aef4-1854fc8b46f1
Impact on national grids
Ireland is Europe’s capital for hyperscaler cloud data providers
• Data centres account for more than 20% of the country’s electricity demand (*)
• More than half of power production in the grid comes from fossil fuels
Hyperscalers report their energy use is on track to be “100 per cent matched” with renewable energy at a global scale in 2025.
However, these are clean energy credits, not directly linked to data centre energy supply
(*) Data centres will account for up to 30 per cent of power demand in
Malaysia, 14 per cent in the Philippines and 12 per cent in Singapore by
the end of the decade, according to energy think-tank Ember
The problem with using renewables for AI
2025 Renewable Energy Industry Outlook, Deloitte 2025
• Energy supply from traditional renewables tends to be intermittent
Battery storage accounted for the second-largest share of
total generating capacity additions, rising by 64% to 7.4 GW
(US Energy Information Administration, “Utilities report
batteries are most commonly used for arbitrage and grid
stability,” June 25, 2024)
• Energy demand profiles of AI are:
• high and bursty for learning
• high and constant for serving
• Key mismatch: intermittent renewable supply vs. AI’s constant, high demand (a carbon-aware scheduling sketch follows below)
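One way operators can work around this mismatch, building on the earlier observation that training can be deferred: a minimal sketch of carbon-aware scheduling that places a training job in the contiguous window with the lowest forecast carbon intensity. The forecast values are made-up placeholders:

```python
# Carbon-aware scheduling: defer a training job to the contiguous window with the
# lowest average forecast grid carbon intensity (gCO2eq/kWh). Values are placeholders.
def best_start_hour(ci_forecast: list[float], job_hours: int) -> int:
    windows = range(len(ci_forecast) - job_hours + 1)
    return min(windows, key=lambda s: sum(ci_forecast[s:s + job_hours]))

ci_forecast = [420, 410, 380, 250, 120, 90, 85, 110, 300, 390, 400, 430]  # next 12 hours
start = best_start_hour(ci_forecast, job_hours=4)
print(f"start training at hour {start}")  # hour 4, when low-carbon sources dominate the mix
```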
Key messages
• The Institute for Data and AI
• supporting data-intensive interdisciplinary research, inclusively and proactively
• AI driving energy demand
• Technical and business drivers for indefinite scaling
• Can AI run on renewable energy?
• AI driving optimisations in the renewable energy space
• The many ways that AI models are being deployed to support the uptake of
renewable energy
Can AI be part of the solution?
2025 Renewable Energy Industry Outlook, Deloitte 2025
AI for Energy Report, Argonne National Labs, April 2024
US DoE focus areas (2023-24!)
• Nuclear Power
• Power Grid
• Carbon Management
• Energy Storage
• Energy Materials
US: The energy community’s needs are largely aligned with the six areas identified in the AI@DOE roundtable, as follows:
energy efficient AI; intrinsically explainable AI; scientific
generative AI; safe, secure, and trustworthy AI; AI for
prevention, preparedness, and responding to national
emergencies; and AI for automation.
Horizon Europe reports on nearly €80 billion in major
projects funded by the EU from 2020 to 2027.
Where is AI playing a role?
Qiang Wang, Yuanfan Li, Rongrong Li, Integrating artificial intelligence in energy transition: A comprehensive review,
Energy Strategy Reviews, Vol 57, 2025, 101600, ISSN 2211-467X, https://doi.org/10.1016/j.esr.2024.101600
Managing the mismatch between the demand and supply sides:
- Supply side: intermittency
- Demand side: nonlinear
Demand dispatch technology based on decentralized learning successfully shifts up to 35% of energy demand to periods of high wind power availability, significantly saving electricity costs compared with a non-dispatch scenario (a toy sketch follows below)
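A toy illustration of the demand-dispatch idea described in the review, not the decentralized-learning method itself: shift a flexible fraction of hourly demand (capped here at 35%) towards the hours with the highest forecast wind output. All numbers are placeholders:

```python
# Toy demand dispatch: move the flexible share of each hour's demand (up to 35%)
# to the hours with the highest forecast wind generation. Numbers are placeholders.
def shift_flexible_demand(demand, wind, flexible_share=0.35):
    flexible_total = sum(d * flexible_share for d in demand)   # total shiftable load
    base = [d * (1 - flexible_share) for d in demand]          # load that cannot move
    wind_total = sum(wind)
    # Allocate the flexible load in proportion to wind availability.
    return [b + flexible_total * w / wind_total for b, w in zip(base, wind)]

demand = [100, 110, 120, 130, 120, 110]   # MWh per hour (placeholder)
wind   = [20,  15,  60,  90,  80,  25]    # forecast wind output (placeholder)
print([round(x) for x in shift_flexible_demand(demand, wind)])  # load moves to windy hours
```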
Are smart grids real?
Supply and demand matching and energy dispatch decision making
Do any of these solutions apply to the “running Data Centres on renewables” problem?
AI for energy transition: a mindmap
Qiang Wang, Yuanfan Li, Rongrong Li, Integrating artificial intelligence in energy transition: A comprehensive review,
Energy Strategy Reviews, Vol 57, 2025, 101600, ISSN 2211-467X, https://doi.org/10.1016/j.esr.2024.101600
https://mm.tt/map/3809457666?t=zYUWwca0Qb
Energy research: intrinsically multidisciplinary
Themes: Health, Environment, and Resilience; Creative and Cultural Digital Futures; Digital Governance, Economy, and Social Impact; Foundational Data Science and AI Methodologies; Advanced Engineering, Materials Science, and Smart Manufacturing
- energy transition
- sustainable urban energy systems
- energy materials
- storage technologies
- smart grids
- energy efficiency
- energy policy
- fair transitions
- energy poverty
- demand forecasting
- grid optimisation
- predictive maintenance
- …

Editor's Notes

  • #31 For a data centre in Dublin, for example, the credits claimed for its clean energy use can come from any generation at any time anywhere in Europe.