EXPLORING EXPLAINABLE MACHINE
LEARNING FOR DETECTING
CHANGES IN CLIMATE
https://zacklabe.com/ @ZLabe
Zachary M. Labe
NOAA GFDL and Princeton University; Atmospheric and Oceanic Sciences
9 February 2023
Florida State University
Department of Earth, Ocean, and Atmospheric Science
Machine Learning
is not new!
But…
Computer Science
Artificial Intelligence
Machine Learning
Deep Learning
Supervised
Learning
Unsupervised
Learning
Labeled data
Classification
Regression
Unlabeled data
Clustering
Dimension reduction
• Do it better
• e.g., parameterizations in climate models are not
perfect; use ML to make them more accurate
• Do it faster
• e.g., code in climate models is very slow (but we
know the right answer) - use ML methods to speed
things up
• Do something new
• e.g., go looking for non-linear relationships you
didn’t know were there
Very relevant for
research: may be
slower and worse,
but can still learn
something
WHY SHOULD WE CONSIDER
MACHINE LEARNING?
Machine learning for weather
IDENTIFYING SEVERE THUNDERSTORMS
Molina et al. 2021
Toms et al. 2021
CLASSIFYING PHASE OF MADDEN-JULIAN OSCILLATION
SATELLITE DETECTION
Lee et al. 2021
DETECTING TORNADOES
McGovern et al. 2019
Machine learning for climate
FINDING FORECASTS OF OPPORTUNITY
Mayer and Barnes, 2021
PREDICTING CLIMATE MODES OF VARIABILITY
Gordon et al. 2021
TIMING OF CLIMATE CHANGE
Barnes et al. 2019
Machine learning for oceanography
CLASSIFYING ARCTIC OCEAN ACIDIFICATION
Krasting et al. 2022
TRACK AND REVEAL DEEP WATER MASSES
Sonnewald and Lguensat, 2021
ESTIMATING OCEAN SURFACE CURRENTS
Sinha and Abernathey, 2021
INPUT
[DATA]
PREDICTION
Machine
Learning
INPUT
[DATA]
PREDICTION
~Statistical
Algorithm~
INPUT
[DATA]
PREDICTION
Machine
Learning
Artificial Intelligence
Machine Learning
Deep Learning
X1
X2
INPUTS
Artificial Neural Networks [ANN]
Linear regression!
Artificial Neural Networks [ANN]
X1
X2
W1
W2
∑ = X1W1 + X2W2 + b
INPUTS
NODE
Artificial Neural Networks [ANN]
X1
X2
W1
W2
∑
INPUTS
NODE
Linear regression with non-linear
mapping by an “activation function”
Training the network is merely
determining the weights “w” and
bias/offset “b”
= f_activation(X1W1 + X2W2 + b)
Artificial Neural Networks [ANN]
X1
X2
W1
W2
∑
INPUTS
NODE
= f_activation(X1W1 + X2W2 + b)
ReLU Sigmoid Linear
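The node computation on these slides can be written out directly. A minimal sketch (the input, weight, and bias values are arbitrary, chosen only for illustration):

```python
import numpy as np

def neuron(x, w, b, activation):
    """One node: weighted sum of inputs plus bias, passed through an activation."""
    return activation(np.dot(x, w) + b)

relu = lambda z: np.maximum(0.0, z)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
linear = lambda z: z

x = np.array([0.5, -1.2])  # inputs X1, X2 (arbitrary values)
w = np.array([0.8, 0.3])   # weights W1, W2
b = 0.1                    # bias/offset

# With a linear activation this is exactly linear regression:
print(neuron(x, w, b, linear))   # X1*W1 + X2*W2 + b = 0.14
print(neuron(x, w, b, relu))
print(neuron(x, w, b, sigmoid))
```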
X1
X2
∑
inputs
HIDDEN LAYERS
X3
∑
∑
∑
OUTPUT
= predictions
Artificial Neural Networks [ANN]
INPUTS
Complexity and nonlinearities of the ANN allow it to learn many
different pathways of predictable behavior
Once trained, you have an array of weights and biases that can be
used for prediction on new data
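Prediction with a trained network is just the same sum-and-activate step repeated layer by layer. A sketch with random stand-in weights (a real network would use the weights and biases found during training):

```python
import numpy as np

def predict(x, weights, biases):
    """Forward pass: ReLU hidden layers, linear output layer."""
    for W, b in zip(weights[:-1], biases[:-1]):
        x = np.maximum(0.0, x @ W + b)
    return x @ weights[-1] + biases[-1]

rng = np.random.default_rng(0)
# Stand-ins for learned parameters: 3 inputs -> 10 -> 10 -> 1 output
weights = [rng.normal(size=(3, 10)), rng.normal(size=(10, 10)), rng.normal(size=(10, 1))]
biases = [np.zeros(10), np.zeros(10), np.zeros(1)]

x_new = rng.normal(size=(5, 3))               # five new samples
print(predict(x_new, weights, biases).shape)  # (5, 1): one prediction each
```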
INPUT
[DATA]
PREDICTION
Artificial Neural Networks [ANN]
TEMPERATURE
TEMPERATURE
We know some metadata…
+ What year is it?
+ Where did it come from?
[Labe and Barnes, 2022; ESS]
TEMPERATURE
TEMPERATURE
Neural network learns nonlinear
combinations of forced climate
patterns to identify the year
We know some metadata…
+ What year is it?
+ Where did it come from?
[Labe and Barnes, 2022; ESS]
----ANN----
2 Hidden Layers
10 Nodes each
Ridge Regularization
Early Stopping
[e.g., Barnes et al. 2019, 2020]
[e.g., Labe and Barnes, 2021]
TIMING OF EMERGENCE
(COMBINED VARIABLES)
RESPONSES TO
EXTERNAL CLIMATE
FORCINGS
PATTERNS OF
CLIMATE INDICATORS
[e.g., Rader et al. 2022]
Surface Temperature Map Precipitation Map
+
TEMPERATURE
We know some metadata…
+ What year is it?
+ Where did it come from?
[Labe and Barnes, 2022; ESS]
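A configuration like the one listed (two hidden layers of 10 nodes each, ridge/L2 regularization, early stopping) can be sketched with scikit-learn; the synthetic "maps" and target below are placeholders, not the actual data or code from the cited papers:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(42)
X = rng.normal(size=(200, 50))                   # placeholder flattened maps
y = 2.0 * X[:, 0] + 0.1 * rng.normal(size=200)   # placeholder target ("year")

ann = MLPRegressor(
    hidden_layer_sizes=(10, 10),  # 2 hidden layers, 10 nodes each
    alpha=0.01,                   # L2 (ridge) penalty on the weights
    early_stopping=True,          # hold out a validation split, stop when stalled
    max_iter=2000,
    random_state=0,
)
ann.fit(X, y)
print(ann.score(X, y))  # R^2 on the training data
```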
----ANN----
2 Hidden Layers
10 Nodes each
Ridge Regularization
Early Stopping
[e.g., Barnes et al. 2019, 2020]
[e.g., Labe and Barnes, 2021]
TIMING OF EMERGENCE
(COMBINED VARIABLES)
RESPONSES TO
EXTERNAL CLIMATE
FORCINGS
PATTERNS OF
CLIMATE INDICATORS
Surface Temperature Map Precipitation Map
+
TEMPERATURE
[e.g., Rader et al. 2022]
We know some metadata…
+ What year is it?
+ Where did it come from?
[Labe and Barnes, 2022; ESS]
THE REAL WORLD
(Observations)
What is the annual mean temperature of Earth?
What is the annual mean temperature of Earth?
THE REAL WORLD
(Observations)
Anomaly is relative to 1951-1980
What is the annual mean temperature of Earth?
THE REAL WORLD
(Observations)
Let’s run a
climate model
What is the annual mean temperature of Earth?
THE REAL WORLD
(Observations)
Let’s run a
climate model
again
What is the annual mean temperature of Earth?
THE REAL WORLD
(Observations)
Let’s run a
climate model
again & again
What is the annual mean temperature of Earth?
THE REAL WORLD
(Observations)
CLIMATE MODEL
ENSEMBLES
What is the annual mean temperature of Earth?
THE REAL WORLD
(Observations)
CLIMATE MODEL
ENSEMBLES
Range of ensembles
= internal variability (noise)
Mean of ensembles
= forced response (climate change)
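This decomposition is simple to compute once the ensemble is stacked along one axis: the ensemble mean estimates the forced response, and the spread estimates internal variability. A sketch on synthetic data (not CESM output):

```python
import numpy as np

rng = np.random.default_rng(1)
n_members, n_years = 20, 100
years = np.arange(n_years)

# Synthetic ensemble: shared warming trend plus member-specific noise
forced = 0.02 * years
ensemble = forced + rng.normal(0.0, 0.15, size=(n_members, n_years))

forced_response = ensemble.mean(axis=0)                        # mean of ensembles
internal_spread = ensemble.max(axis=0) - ensemble.min(axis=0)  # range of ensembles
anomalies = ensemble - forced_response                         # remove the forced response
```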
What is the annual mean temperature of Earth?
Range of ensembles
= internal variability (noise)
Mean of ensembles
= forced response (climate change)
But let’s remove
climate change…
What is the annual mean temperature of Earth?
Range of ensembles
= internal variability (noise)
Mean of ensembles
= forced response (climate change)
After removing the
forced response…
anomalies/noise!
What is the annual mean temperature of Earth?
• Increasing greenhouse gases (CO2, CH4, N2O)
• Changes in industrial aerosols (SO4, BC, OC)
• Changes in biomass burning (aerosols)
• Changes in land-use & land-cover (albedo)
What is the annual mean temperature of Earth?
• Increasing greenhouse gases (CO2, CH4, N2O)
• Changes in industrial aerosols (SO4, BC, OC)
• Changes in biomass burning (aerosols)
• Changes in land-use & land-cover (albedo)
Plus everything else…
(Natural/internal variability)
What is the annual mean temperature of Earth?
Greenhouse gases fixed to 1920 levels
All forcings (CESM-LE)
Industrial aerosols fixed to 1920 levels
[Deser et al. 2020, JCLI]
Fully-coupled CESM1.1
20 Ensemble Members
Run from 1920-2080
Observations
So what?
Greenhouse gases = warming
Aerosols = ?? (though mostly cooling)
What are the relative responses
between greenhouse gas
and aerosol forcing?
Surface Temperature Map
ARTIFICIAL NEURAL NETWORK (ANN)
INPUT LAYER
Surface Temperature Map
ARTIFICIAL NEURAL NETWORK (ANN)
INPUT LAYER
HIDDEN LAYERS
OUTPUT LAYER
Surface Temperature Map
“2000-2009”
DECADE CLASS
“2070-2079”
“1920-1929”
ARTIFICIAL NEURAL NETWORK (ANN)
INPUT LAYER
HIDDEN LAYERS
OUTPUT LAYER
Surface Temperature Map
“2000-2009”
DECADE CLASS
“2070-2079”
“1920-1929”
BACK-PROPAGATE THROUGH NETWORK = EXPLAINABLE AI
ARTIFICIAL NEURAL NETWORK (ANN)
INPUT LAYER
HIDDEN LAYERS
OUTPUT LAYER
Layer-wise Relevance Propagation
Surface Temperature Map
“2000-2009”
DECADE CLASS
“2070-2079”
“1920-1929”
BACK-PROPAGATE THROUGH NETWORK = EXPLAINABLE AI
ARTIFICIAL NEURAL NETWORK (ANN)
[Barnes et al. 2020, JAMES]
[Labe and Barnes 2021, JAMES]
LAYER-WISE RELEVANCE PROPAGATION (LRP)
Volcano
Great White
Shark
Timber
Wolf
Image Classification LRP
https://heatmapping.org/
LRP heatmaps show regions
of “relevance” that
contribute to the neural
network’s decision-making
process for a sample
belonging to a particular
output category
Neural Network
WHY
WHY
WHY
Backpropagation – LRP
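The idea can be made concrete with the ε-rule on a tiny dense network: relevance starts at the output and is redistributed backward in proportion to each connection's contribution to the forward pass. A minimal numpy sketch of the rule itself, not the implementation used in the cited studies (which handle biases and layer types more carefully):

```python
import numpy as np

def lrp_epsilon(x, weights, biases, eps=1e-6):
    """Layer-wise relevance propagation (epsilon rule) for a small ReLU MLP."""
    # Forward pass (ReLU hidden layers, linear output), storing activations
    activations = [x]
    for i, (W, b) in enumerate(zip(weights, biases)):
        z = x @ W + b
        x = z if i == len(weights) - 1 else np.maximum(0.0, z)
        activations.append(x)

    # Backward pass: redistribute the output relevance layer by layer
    R = activations[-1]
    for W, a in zip(reversed(weights), reversed(activations[:-1])):
        z = a @ W + eps        # each neuron's total input (stabilized by eps)
        s = R / z              # relevance per unit of contribution
        R = a * (s @ W.T)      # share relevance back to the layer below
    return R                   # relevance "heatmap" over the input features

rng = np.random.default_rng(0)
weights = [rng.normal(size=(4, 8)), rng.normal(size=(8, 1))]
biases = [np.zeros(8), np.zeros(1)]
relevance = lrp_epsilon(rng.normal(size=(1, 4)), weights, biases)
```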
LAYER-WISE RELEVANCE PROPAGATION (LRP)
Image Classification LRP
https://heatmapping.org/
NOT PERFECT
Crock
Pot
Neural Network
Backpropagation – LRP
WHY
[Adapted from Adebayo et al., 2020]
EXPLAINABLE AI IS
NOT PERFECT
THERE ARE MANY
METHODS
[Adapted from Adebayo et al., 2020]
Visualizing something we already know…
Neural
Network
[0] La Niña [1] El Niño
[Toms et al. 2020, JAMES]
Input a map of sea surface temperatures
Visualizing something we already know…
Input maps of sea surface
temperatures to identify
El Niño or La Niña
Use ‘LRP’ to see how the
neural network is making
its decision
[Toms et al. 2020, JAMES]
Layer-wise Relevance Propagation
Composite Observations
LRP [Relevance]: 0.00 to 0.75
SST Anomaly [°C]: -1.5 to 1.5
INPUT LAYER
HIDDEN LAYERS
OUTPUT LAYER
Layer-wise Relevance Propagation
Surface Temperature Map
“2000-2009”
DECADE CLASS
“2070-2079”
“1920-1929”
BACK-PROPAGATE THROUGH NETWORK = EXPLAINABLE AI
ARTIFICIAL NEURAL NETWORK (ANN)
[Barnes et al. 2020, JAMES]
[Labe and Barnes 2021, JAMES]
1960-1999: ANNUAL MEAN TEMPERATURE TRENDS
Greenhouse gases fixed
to 1920 levels
[AEROSOLS PREVAIL]
Industrial aerosols fixed
to 1920 levels
[GREENHOUSE GASES PREVAIL]
All forcings
[STANDARD CESM-LE]
DATA
CLIMATE MODEL DATA PREDICT THE YEAR FROM MAPS OF TEMPERATURE
AEROSOLS
PREVAIL
GREENHOUSE GASES
PREVAIL
STANDARD
CLIMATE MODEL
[Labe and Barnes 2021, JAMES]
OBSERVATIONS PREDICT THE YEAR FROM MAPS OF TEMPERATURE
AEROSOLS
PREVAIL
GREENHOUSE GASES
PREVAIL
STANDARD
CLIMATE MODEL
[Labe and Barnes 2021, JAMES]
OBSERVATIONS
SLOPES
PREDICT THE YEAR FROM MAPS OF TEMPERATURE
AEROSOLS
PREVAIL
GREENHOUSE GASES
PREVAIL
STANDARD
CLIMATE MODEL
[Labe and Barnes 2021, JAMES]
HOW DID THE ANN
MAKE ITS
PREDICTIONS?
WHY IS THERE
GREATER SKILL
FOR GHG+?
RESULTS FROM LRP
[Labe and Barnes 2021, JAMES]
Low High
Higher LRP values indicate greater relevance
for the ANN’s prediction
AVERAGED OVER 1960-2039
Aerosol-driven
Greenhouse gas-driven
All forcings
Low High
[Labe and Barnes 2021, JAMES]
KEY POINTS FROM EXAMPLE #1
1. Using explainable AI methods with artificial neural networks (ANN)
reveals climate patterns in large ensemble simulations
2. A metric is proposed for quantifying the uncertainty of an ANN
visualization method that extracts signals from different external
forcings
3. Predictions from an ANN trained using a large ensemble without
time-evolving aerosols show the highest correlation with actual
observations
Labe, Z.M. and E.A. Barnes (2021), Detecting climate signals using explainable AI with single-forcing
large ensembles. Journal of Advances in Modeling Earth Systems, DOI:10.1029/2021MS002464
NASA/GISS/GISTEMPv4
“Hiatus”
Global Warming
Hiatus?
…in research
Global Warming
Hiatus?
…in the media, etc.
Are slowdowns (“hiatus”) in decadal
warming predictable?
• Statistical construct?
• Lack of surface temperature observations in the Arctic?
• Phase transition of the Interdecadal Pacific Oscillation (IPO)?
• Influence of volcanoes and other aerosol forcing?
• Weaker solar forcing?
• Lower equilibrium climate sensitivity (ECS)?
• Other combinations of internal variability?
FUTURE
WARMING
Select one ensemble
member and calculate
the annual mean
global mean surface
temperature (GMST)
2-m TEMPERATURE
ANOMALY
[Labe and Barnes, 2022; GRL]
Calculate 10-year
moving (linear) trends
2-m TEMPERATURE
ANOMALY
[Labe and Barnes, 2022; GRL]
Plot the slope of the
linear trends
START OF 10-YEAR
TEMPERATURE TREND
2-m TEMPERATURE
ANOMALY
[Labe and Barnes, 2022; GRL]
Calculate a threshold
for defining a slowdown
in decadal warming
[Labe and Barnes, 2022; GRL]
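The steps above can be sketched directly on a synthetic GMST series; the slowdown definition here (decadal trends in the bottom 10 percent) is an illustrative threshold choice, not necessarily the one used in the paper:

```python
import numpy as np

rng = np.random.default_rng(7)
years = np.arange(1920, 2100)
gmst = 0.015 * (years - 1920) + rng.normal(0.0, 0.1, size=years.size)  # synthetic GMST

window = 10
# Slope of every 10-year moving linear trend (degrees per year)
trends = np.array([
    np.polyfit(np.arange(window), gmst[i:i + window], 1)[0]
    for i in range(years.size - window + 1)
])

# Flag a "slowdown" when a decadal trend falls below a low percentile
threshold = np.percentile(trends, 10)
slowdown_starts = years[:trends.size][trends <= threshold]
print(slowdown_starts[:5])  # start years of flagged decadal slowdowns
```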
Repeat this exercise for
each ensemble
member in CESM2-LE
[Labe and Barnes, 2022; GRL]
Compare warming
slowdowns with
reanalysis (ERA5)
[Labe and Barnes, 2022; GRL]
[Labe and Barnes, 2022; GRL]
OCEAN HEAT CONTENT – 100 M
Start with anomalous ocean heat…
[Labe and Barnes, 2022; GRL]
OCEAN HEAT CONTENT – 100 M
INPUT LAYER
Start with anomalous ocean heat…
[Labe and Barnes, 2022; GRL]
OCEAN HEAT CONTENT – 100 M
INPUT LAYER
HIDDEN LAYERS
OUTPUT LAYER
YES
SLOWDOWN
NO
SLOWDOWN
Will a slowdown begin?
[Labe and Barnes, 2022; GRL]
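Framed this way, the problem is binary classification: an ocean heat content map in, a yes/no slowdown class out. A scikit-learn sketch on synthetic stand-in data (the real study's inputs, labels, and architecture differ):

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(5)
X = rng.normal(size=(400, 60))               # stand-in flattened OHC maps
y = (X[:, :5].mean(axis=1) > 0).astype(int)  # stand-in slowdown labels

clf = MLPClassifier(hidden_layer_sizes=(10, 10), max_iter=2000, random_state=0)
clf.fit(X, y)
print(clf.predict_proba(X[:1]))  # [P(no slowdown), P(slowdown)] for one sample
```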
OCEAN HEAT CONTENT – 100 M
INPUT LAYER
HIDDEN LAYERS
OUTPUT LAYER
YES
SLOWDOWN
NO
SLOWDOWN
BACK-PROPAGATE THROUGH NETWORK = EXPLAINABLE AI
LAYER-WISE RELEVANCE PROPAGATION
Will a slowdown begin?
[Labe and Barnes, 2022; GRL]
So how well does the neural network do?
[Labe and Barnes, 2022; GRL]
Low High Colder Warmer
[Labe and Barnes, 2022; GRL]
What about observations?
Future (2012-)
so-called “hiatus”
Comparing
observations to
the IPO
[Labe and Barnes, 2022; GRL]
What about observations?
Future (2012-)
so-called “hiatus”
2021
Looking ahead
to the near-
future…
?
What about observations?
Colder Warmer
[2003, 2004] [2016, 2017]
[Labe and Barnes, 2022; GRL]
KEY POINTS FROM EXAMPLE #2
1. Artificial neural network predicts the onset of slowdowns in
decadal warming trends of global mean temperature
2. Explainable AI reveals the neural network is leveraging
tropical patterns of ocean heat content anomalies
3. Transitions in the phase of the Interdecadal Pacific Oscillation
are frequently associated with warming slowdown trends in
CESM2-LE
Labe, Z.M. and E.A. Barnes (2022), Predicting slowdowns in decadal climate warming trends with
explainable neural networks. Geophysical Research Letters, DOI:10.1029/2022GL098173
Earth is also warming
in the vertical!
Po-Chedley, S., J.T. Fasullo, N. Siler, Z.M. Labe, E.A. Barnes, C.J.W. Bonfils, and B.D. Santer (2022). Internal
variability and forcing influence model-satellite differences in the rate of tropical tropospheric
warming. Proceedings of the National Academy of Sciences, DOI:10.1073/pnas.2209431119
Adapted from Peings et al. 2018, ERL
AA = Arctic amplification; UTW = upper-tropospheric warming
LENS zonal-mean change (stratosphere and troposphere), 2070-2100 minus 1981-2010
Antarctic Equator Arctic
CLIMATE MODEL PROJECTION
OBSERVATIONS
https://www.realclimate.org/index.php/climate-model-projections-compared-to-observations/
Climate models exhibit 2x as much
warming as observations…
TMT = tropical mid-troposphere
[Po-Chedley et al. 2022, PNAS]
PREDICT:
INTERNAL + EXTERNAL
COMPONENTS
[Po-Chedley et al. 2022, PNAS]
Partial least squares regression with CMIP6 large ensembles (test observations)
[Po-Chedley et al. 2022, PNAS]
UNDERSTANDING OUR
PREDICTIONS
–
Patterns of internal variability
(e.g., Interdecadal Pacific
Oscillation)
[Po-Chedley et al. 2022, PNAS]
INPUT
[DATA]
PREDICTION
Machine
Learning
Explainable AI
Learn new
science!
1) MACHINE LEARNING IS JUST
ANOTHER TOOL TO ADD TO OUR
WORKFLOW.
2) MACHINE LEARNING IS
NO LONGER A BLACK BOX.
3) WE CAN LEARN NEW SCIENCE
FROM EXPLAINABLE AI.
KEY POINTS
1. Machine learning is just another tool to consider for our scientific workflow
2. We can use explainable AI (XAI) methods to peer into the black box of machine learning
3. We can learn new science by using XAI methods in conjunction with existing statistical tools
Zachary Labe
zachary.labe@noaa.gov
Labe, Z.M. and E.A. Barnes (2021), Detecting climate signals using explainable AI with single-forcing
large ensembles. Journal of Advances in Modeling Earth Systems, DOI:10.1029/2021MS002464
Labe, Z.M. and E.A. Barnes (2022), Predicting slowdowns in decadal climate warming trends with
explainable neural networks. Geophysical Research Letters, DOI:10.1029/2022GL098173
Po-Chedley, S., J.T. Fasullo, N. Siler, Z.M. Labe, E.A. Barnes, C.J.W. Bonfils, and B.D. Santer (2022). Internal
variability and forcing influence model-satellite differences in the rate of tropical tropospheric
warming. Proceedings of the National Academy of Sciences, DOI:10.1073/pnas.2209431119
