22 February 2023
Natural Sciences Group Seminar (Presentation): Using explainable machine learning for evaluating patterns of climate change, Washington State University Vancouver, USA. Remote Presentation.
References:
Labe, Z.M. and E.A. Barnes (2021), Detecting climate signals using explainable AI with single-forcing large ensembles. Journal of Advances in Modeling Earth Systems, DOI:10.1029/2021MS002464
Labe, Z.M. and E.A. Barnes (2022), Predicting slowdowns in decadal climate warming trends with explainable neural networks. Geophysical Research Letters, DOI:10.1029/2022GL098173
Using explainable machine learning for evaluating patterns of climate change
1. USING EXPLAINABLE MACHINE LEARNING FOR EVALUATING PATTERNS OF CLIMATE CHANGE
https://zacklabe.com/ @ZLabe
Zachary M. Labe
NOAA GFDL and Princeton University; Atmospheric and Oceanic Science
with Dr. Elizabeth A. Barnes at Colorado State University
22 February 2023
Washington State University Vancouver
Natural Sciences Group
7. WHY SHOULD WE CONSIDER MACHINE LEARNING?
• Do it better: e.g., parameterizations in climate models are not perfect; use ML to make them more accurate
• Do it faster: e.g., code in climate models is very slow (but we know the right answer); use ML methods to speed things up
• Do something new: e.g., go looking for nonlinear relationships you didn't know were there
Very relevant for research: may be slower and worse, but we can still learn something
10. Adapted from: Kotamarthi, R., Hayhoe, K., Mearns, L., Wuebbles, D., Jacobs, J., & Jurado, J.
(2021). Global Climate Models. In Downscaling Techniques for High-Resolution Climate
Projections: From Global Change to Local Impacts (pp. 19-39). Cambridge: Cambridge University
Press. doi:10.1017/9781108601269.003
CLIMATE MODELS
Horizontal Grid
Vertical Levels
Past/Present/Future
Fully-Coupled System
20-40 Petabytes of data
16. Machine learning for weather
IDENTIFYING SEVERE THUNDERSTORMS (Molina et al. 2021)
CLASSIFYING THE PHASE OF THE MADDEN-JULIAN OSCILLATION (Toms et al. 2021)
SATELLITE DETECTION (Lee et al. 2021)
DETECTING TORNADOES (McGovern et al. 2019)
17. Machine learning for oceanography
CLASSIFYING ARCTIC OCEAN ACIDIFICATION (Krasting et al. 2022)
TRACKING AND REVEALING DEEP WATER MASSES (Sonnewald and Lguensat, 2021)
ESTIMATING OCEAN SURFACE CURRENTS (Sinha and Abernathey, 2021)
18. Machine learning for climate
FINDING FORECASTS OF OPPORTUNITY (Mayer and Barnes, 2021)
PREDICTING CLIMATE MODES OF VARIABILITY (Gordon et al. 2021)
TIMING OF CLIMATE CHANGE (Barnes et al. 2019)
25. Artificial Neural Networks [ANN]
[Diagram: inputs X1 and X2, weighted by W1 and W2, are summed at a node]
A node is linear regression with a nonlinear mapping by an "activation function":
output = f_activation(X1·W1 + X2·W2 + b)
Training the network is merely determining the weights "w" and the bias/offset "b"
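A minimal Python sketch of a single node like this, assuming ReLU as the activation function (the input and weight values are made up for illustration):

```python
import numpy as np

def neuron(x, w, b):
    # One ANN node: weighted sum of the inputs plus a bias,
    # passed through a nonlinear activation (ReLU assumed here)
    z = np.dot(x, w) + b        # X1*W1 + X2*W2 + b
    return np.maximum(0.0, z)   # ReLU: zero out negative pre-activations

# Made-up inputs, weights, and bias for illustration
x = np.array([0.5, -1.2])
w = np.array([0.8, 0.3])
b = 0.1
out = neuron(x, w, b)
```

Training adjusts `w` and `b` so that outputs like `out` match the labels; the node itself never changes.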
26. Artificial Neural Networks [ANN]
output = f_activation(X1·W1 + X2·W2 + b)
Common choices of activation function: ReLU, Sigmoid, Linear
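Minimal NumPy sketches of these three activation functions:

```python
import numpy as np

def relu(z):
    # Rectified linear unit: zero for negative inputs, identity otherwise
    return np.maximum(0.0, z)

def sigmoid(z):
    # Squashes any input into (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

def linear(z):
    # Identity: the node stays purely linear
    return z

z = np.array([-2.0, 0.0, 2.0])
```

The choice matters: with `linear`, stacking layers collapses back into one big linear regression; the nonlinear activations are what let the network learn nonlinear relationships.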
28. Artificial Neural Networks [ANN]
The complexity and nonlinearities of the ANN allow it to learn many different pathways of predictable behavior.
Once trained, you have an array of weights and biases that can be used for prediction on new data: INPUT [DATA] → PREDICTION
34. TEMPERATURE
Neural network learns nonlinear
combinations of forced climate
patterns to identify the year
We know some metadata…
+ What year is it?
+ Where did it come from?
[Labe and Barnes, 2022; ESS]
35. ----ANN----
2 Hidden Layers, 10 Nodes each
Ridge Regularization, Early Stopping
Inputs: Surface Temperature Map + Precipitation Map
TIMING OF EMERGENCE (COMBINED VARIABLES) [e.g., Barnes et al. 2019, 2020]
RESPONSES TO EXTERNAL CLIMATE FORCINGS [e.g., Labe and Barnes, 2021]
PATTERNS OF CLIMATE INDICATORS [e.g., Rader et al. 2022]
TEMPERATURE
We know some metadata…
+ What year is it?
+ Where did it come from?
[Labe and Barnes, 2022; ESS]
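A rough plain-NumPy sketch of what an architecture like this looks like under the hood; the layer sizes and the ridge (L2) penalty coefficient here are illustrative assumptions, not the values used in the papers:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical layer sizes: flattened input map -> 10 -> 10 -> output classes
sizes = [100, 10, 10, 8]
weights = [rng.normal(0.0, 0.1, (m, n)) for m, n in zip(sizes[:-1], sizes[1:])]
biases = [np.zeros(n) for n in sizes[1:]]

def forward(x):
    # Two ReLU hidden layers of 10 nodes each, then a linear output layer
    for W, b in zip(weights[:-1], biases[:-1]):
        x = np.maximum(0.0, x @ W + b)
    return x @ weights[-1] + biases[-1]

def ridge_penalty(lam=0.01):
    # Ridge (L2) regularization: penalize large weights so the
    # network stays smooth and does not overfit the training maps
    return lam * sum(np.sum(W**2) for W in weights)

x = rng.normal(size=100)   # stand-in for a flattened input map
logits = forward(x)
```

During training the ridge penalty is simply added to the loss; early stopping then halts training once validation error stops improving.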
38. What is the annual mean temperature of Earth?
THE REAL WORLD
(Observations)
Anomaly is relative to 1951-1980
39. What is the annual mean temperature of Earth?
THE REAL WORLD
(Observations)
Let’s run a
climate model
40. What is the annual mean temperature of Earth?
THE REAL WORLD
(Observations)
Let’s run a
climate model
again
41. What is the annual mean temperature of Earth?
THE REAL WORLD
(Observations)
Let’s run a
climate model
again & again
42. What is the annual mean temperature of Earth?
THE REAL WORLD
(Observations)
CLIMATE MODEL
ENSEMBLES
43. What is the annual mean temperature of Earth?
THE REAL WORLD
(Observations)
CLIMATE MODEL
ENSEMBLES
Range of ensembles
= internal variability (noise)
Mean of ensembles
= forced response (climate change)
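This decomposition can be sketched with synthetic data in NumPy; the linear "forced" trend and the noise amplitude are made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic "large ensemble": every member shares one forced warming
# signal but has its own realization of internal variability
n_members, n_years = 20, 161                         # e.g., 1920-2080
forced = np.linspace(0.0, 2.0, n_years)              # assumed warming signal
noise = rng.normal(0.0, 0.3, (n_members, n_years))   # assumed internal variability
ensemble = forced + noise

forced_response = ensemble.mean(axis=0)   # mean of ensembles ~ climate change
internal = ensemble - forced_response     # residual ~ internal variability (noise)
spread = ensemble.max(axis=0) - ensemble.min(axis=0)  # range of ensembles
```

Averaging across members cancels the uncorrelated noise, leaving the forced response; subtracting that mean from each member leaves only the internal variability.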
44. What is the annual mean temperature of Earth?
Range of ensembles
= internal variability (noise)
Mean of ensembles
= forced response (climate change)
But let’s remove
climate change…
45. What is the annual mean temperature of Earth?
Range of ensembles
= internal variability (noise)
Mean of ensembles
= forced response (climate change)
After removing the
forced response…
anomalies/noise!
47. What is the annual mean temperature of Earth?
• Increasing greenhouse gases (CO2, CH4, N2O)
• Changes in industrial aerosols (SO4, BC, OC)
• Changes in biomass burning (aerosols)
• Changes in land-use & land-cover (albedo)
Plus everything else…
(Natural/internal variability)
49. Fully-coupled CESM1.1, 20 ensemble members, run from 1920-2080 [Deser et al. 2020, JCLI]
• All forcings (CESM-LE)
• Greenhouse gases fixed to 1920 levels
• Industrial aerosols fixed to 1920 levels
• Observations
50. So what?
Greenhouse gases = warming
Aerosols = ?? (though mostly cooling)
What are the relative responses between greenhouse gas and aerosol forcing?
51. ARTIFICIAL NEURAL NETWORK (ANN)
Input: a surface temperature map from one of the three single-forcing large ensemble simulations (AER+, GHG+, ALL)
53. ARTIFICIAL NEURAL NETWORK (ANN)
INPUT LAYER: Surface Temperature Map
HIDDEN LAYERS
OUTPUT LAYER: DECADE CLASS ("1920-1929", "2000-2009", "2070-2079")
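A hypothetical sketch of how an output layer like this turns raw values into decade-class probabilities via a softmax (the logit values here are invented):

```python
import numpy as np

def softmax(logits):
    z = logits - logits.max()   # shift for numerical stability
    e = np.exp(z)
    return e / e.sum()          # probabilities that sum to 1

# Hypothetical raw output-layer values for three decade classes
decades = ["1920-1929", "2000-2009", "2070-2079"]
logits = np.array([0.2, 1.5, 0.1])
probs = softmax(logits)
predicted = decades[int(np.argmax(probs))]
```

The predicted decade is simply the class with the highest probability; the full probability vector also carries the network's confidence.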
55. ARTIFICIAL NEURAL NETWORK (ANN)
INPUT LAYER: Surface Temperature Map
HIDDEN LAYERS
OUTPUT LAYER: DECADE CLASS ("1920-1929", "2000-2009", "2070-2079")
BACK-PROPAGATE THROUGH NETWORK = EXPLAINABLE AI (Layer-wise Relevance Propagation)
[Barnes et al. 2020, JAMES]
[Labe and Barnes 2021, JAMES]
56. LAYER-WISE RELEVANCE PROPAGATION (LRP)
LRP heatmaps show regions of "relevance" that contribute to the neural network's decision-making process for a sample belonging to a particular output category
[Figure: image classification LRP examples (Volcano, Great White Shark, Timber Wolf) from https://heatmapping.org/; "why?" is backpropagated through the neural network via LRP]
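A toy NumPy sketch of the basic LRP redistribution step for one dense layer (the simple LRP-0/epsilon rule for a layer with no bias; the weights here are invented for illustration). Relevance assigned to an output is split among the inputs in proportion to each input's contribution a_j·w_jk to that output:

```python
import numpy as np

def lrp_linear(a, W, relevance_out, eps=1e-9):
    # Redistribute each output unit's relevance back to the inputs,
    # proportionally to each input's contribution a_j * w_jk to the
    # pre-activation z_k (simple LRP-0/epsilon rule, no bias term)
    z = a @ W                                      # pre-activations, one per output
    denom = z + eps * np.sign(z) + (z == 0) * eps  # stabilized against division by 0
    s = relevance_out / denom
    return a * (W @ s)                             # relevance arriving at each input

# Toy layer: 3 inputs, 2 outputs (weights invented for illustration)
a = np.array([1.0, 2.0, 0.5])      # input activations
W = np.array([[ 0.3, -0.1],
              [ 0.2,  0.4],
              [-0.5,  0.6]])
R_out = np.array([1.0, 0.0])       # explain only the first output
R_in = lrp_linear(a, W, R_out)
```

Because there is no bias term, relevance is conserved: `R_in` sums to the relevance placed on the output. Applying this rule layer by layer, from the chosen output class back to the input map, produces heatmaps like those above.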
59. LAYER-WISE RELEVANCE PROPAGATION (LRP)
NOT PERFECT
[Figure: image classification LRP example ("Crock Pot") from https://heatmapping.org/]
60. EXPLAINABLE AI (XAI) IS NOT PERFECT
THERE ARE MANY METHODS
[Figure adapted from Adebayo et al., 2020: an XAI heatmap explaining the prediction "A bird!"]
63. Input a map of sea surface temperatures
Neural Network → [0] La Niña or [1] El Niño
[Toms et al. 2020, JAMES]
64. Visualizing something we already know…
Input maps of sea surface temperatures (SST) to identify El Niño or La Niña
Use LRP to see how the neural network is making its decision
[Figure: composite SST observations (SST anomaly, °C) alongside LRP relevance maps]
[Toms et al. 2020, JAMES]
66. 1960-1999: ANNUAL MEAN TEMPERATURE TRENDS
DATA (maps of warming/cooling trends):
• All forcings [STANDARD CESM-LE]
• Greenhouse gases fixed to 1920 levels [AEROSOLS PREVAIL]
• Industrial aerosols fixed to 1920 levels [GREENHOUSE GASES PREVAIL]
69. CLIMATE MODEL DATA PREDICT THE YEAR FROM MAPS OF TEMPERATURE
AEROSOLS
PREVAIL
GREENHOUSE GASES
PREVAIL
STANDARD
CLIMATE MODEL
[Labe and Barnes 2021, JAMES]
70. OBSERVATIONS PREDICT THE YEAR FROM MAPS OF TEMPERATURE
AEROSOLS
PREVAIL
GREENHOUSE GASES
PREVAIL
STANDARD
CLIMATE MODEL
[Labe and Barnes 2021, JAMES]
71. OBSERVATIONS
SLOPES
PREDICT THE YEAR FROM MAPS OF TEMPERATURE
AEROSOLS
PREVAIL
GREENHOUSE GASES
PREVAIL
STANDARD
CLIMATE MODEL
[Labe and Barnes 2021, JAMES]
76. Higher LRP values indicate greater relevance for the ANN's prediction
AVERAGED OVER 1960-2039
Aerosol-driven
Greenhouse gas-driven
All forcings
Low High
[Labe and Barnes 2021, JAMES]
78. DISTRIBUTIONS OF LRP
AVERAGED OVER 1960-2039
[Labe and Barnes 2021, JAMES]
Distribution shifted right = more relevant region
79. KEY POINTS FROM EXAMPLE #1
1. Using explainable AI methods with artificial neural networks (ANNs) reveals climate patterns in large ensemble simulations
2. A metric is proposed for quantifying the uncertainty of an ANN visualization method that extracts signals from different external forcings
3. Predictions from an ANN trained using a large ensemble without time-evolving aerosols show the highest correlation with actual observations
Labe, Z.M. and E.A. Barnes (2021), Detecting climate signals using explainable AI with single-forcing large ensembles. Journal of Advances in Modeling Earth Systems, DOI:10.1029/2021MS002464
90. Are slowdowns ("hiatus") in decadal warming predictable?
• Statistical construct?
• Lack of surface temperature observations in the Arctic?
• Phase transition of the Interdecadal Pacific Oscillation (IPO)?
• Influence of volcanoes and other aerosol forcing?
• Weaker solar forcing?
• Lower equilibrium climate sensitivity (ECS)?
• Other combinations of internal variability?
[Figure: future warming]
91. Select one ensemble member and calculate the annual-mean global mean surface temperature (GMST)
2-m TEMPERATURE ANOMALY
[Labe and Barnes, 2022; GRL]
98. OCEAN HEAT CONTENT – 100 M
Start with anomalous ocean heat…
[Labe and Barnes, 2022; GRL]
101. Will a slowdown begin?
INPUT LAYER: OCEAN HEAT CONTENT – 100 M (start with anomalous ocean heat)
HIDDEN LAYERS
OUTPUT LAYER: YES SLOWDOWN / NO SLOWDOWN
BACK-PROPAGATE THROUGH NETWORK = EXPLAINABLE AI (LAYER-WISE RELEVANCE PROPAGATION)
[Labe and Barnes, 2022; GRL]
102. So how well does the neural network do?
[Labe and Barnes, 2022; GRL]
109. KEY POINTS FROM EXAMPLE #2
1. An artificial neural network predicts the onset of slowdowns in decadal warming trends of global mean temperature
2. Explainable AI reveals that the neural network is leveraging tropical patterns of ocean heat content anomalies
3. Transitions in the phase of the Interdecadal Pacific Oscillation are frequently associated with warming slowdown trends in CESM2-LE
Labe, Z.M. and E.A. Barnes (2022), Predicting slowdowns in decadal climate warming trends with explainable neural networks. Geophysical Research Letters, DOI:10.1029/2022GL098173
113. 3) WE CAN LEARN NEW SCIENCE FROM EXPLAINABLE AI.
114. KEY POINTS
1. Machine learning is just another tool to consider for our scientific workflow
2. We can use explainable AI (XAI) methods to peer into the black box of machine learning
3. We can learn new science by using XAI methods in conjunction with existing statistical tools
Zachary Labe
zachary.labe@noaa.gov
Labe, Z.M. and E.A. Barnes (2021), Detecting climate signals using explainable AI with single-forcing large ensembles. Journal of Advances in Modeling Earth Systems, DOI:10.1029/2021MS002464
Labe, Z.M. and E.A. Barnes (2022), Predicting slowdowns in decadal climate warming trends with explainable neural networks. Geophysical Research Letters, DOI:10.1029/2022GL098173