Evaluating global climate models using simple, explainable neural networks

  1. Evaluating global climate models using simple, explainable neural networks. Zachary M. Labe with Elizabeth A. Barnes, Colorado State University, Department of Atmospheric Science. 17 December 2021, NG51A-06 – AGU Fall Meeting, Climate Variability Across Scales and Climate States and Neural Earth System Modeling [Oral Session I]. @ZLabe
  2. THE REAL WORLD (Observations) – Map of temperature
  3. THE REAL WORLD (Observations) – Anomaly is relative to 1951–1980
  4. THE REAL WORLD (Observations) and CLIMATE MODEL ENSEMBLES – Range of ensembles = internal variability (noise); mean of ensembles = forced response (climate change)
  5. Range of ensembles = internal variability (noise); mean of ensembles = forced response (climate change). But let's remove climate change…
  6. Range of ensembles = internal variability (noise); mean of ensembles = forced response (climate change). After removing the forced response… anomalies/noise!
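The forced-response/internal-variability split on slides 4–6 can be written down directly. A minimal sketch, assuming a hypothetical array for one model's large ensemble; the array name, shape, and placeholder values are illustrative, not the authors' code:

```python
import numpy as np

# Hypothetical large ensemble of annual-mean 2-m temperature for one model:
# 16 members x 70 years x (lat x lon) grid (placeholder values)
t2m = np.random.randn(16, 70, 96, 144)

# Averaging over the ensemble members approximates the forced response,
# because internal variability largely cancels out across members
forced = t2m.mean(axis=0)

# Subtracting the forced response from every member leaves the
# internal variability ("anomalies/noise" on slide 6)
internal = t2m - forced[np.newaxis, ...]
```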
  7. THERE ARE MANY CLIMATE MODEL LARGE ENSEMBLES… Annual-mean 2-m temperature (°C): 7 global climate models, 16 ensemble members each, plus ERA5-BE (observations)
  8. STANDARD EVALUATION OF CLIMATE MODELS – Pattern correlation, RMSE, EOFs, trends, anomalies, mean state, climate modes of variability
  9. STANDARD EVALUATION OF CLIMATE MODELS – Pattern correlation, RMSE, EOFs, trends, anomalies, mean state, climate modes of variability. CORRELATION [R]
  10. STANDARD EVALUATION OF CLIMATE MODELS – Pattern correlation, RMSE, EOFs, trends, anomalies, mean state, climate modes of variability. CORRELATION [R]
  11. STANDARD EVALUATION OF CLIMATE MODELS – Pattern correlation, RMSE, EOFs, trends, anomalies, mean state, climate modes of variability. PATTERN CORRELATION – T2M (negative to positive correlation)
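Two of the standard metrics listed on slides 8–11 are easy to sketch. A minimal, area-weighted version; the function and variable names are assumptions, and the maps are 2-D latitude-by-longitude arrays:

```python
import numpy as np

def _area_weights(lat, shape):
    """Cosine-latitude weights broadcast to the shape of a lat x lon map."""
    return np.cos(np.deg2rad(lat))[:, np.newaxis] * np.ones(shape)

def pattern_correlation(model_map, obs_map, lat):
    """Centered, area-weighted pattern correlation between two maps."""
    w = _area_weights(lat, model_map.shape)
    m = model_map - np.average(model_map, weights=w)
    o = obs_map - np.average(obs_map, weights=w)
    cov = np.average(m * o, weights=w)
    return cov / np.sqrt(np.average(m**2, weights=w) *
                         np.average(o**2, weights=w))

def rmse(model_map, obs_map, lat):
    """Area-weighted root-mean-square error between two maps."""
    w = _area_weights(lat, model_map.shape)
    return np.sqrt(np.average((model_map - obs_map)**2, weights=w))
```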
  12. Machine Learning – INPUT [DATA], PREDICTION
  13. TEMPERATURE – ANN: 2 hidden layers, 10 nodes each, ridge regularization, early stopping. We know some metadata… + What year is it? + Where did it come from?
  14. TEMPERATURE – We know some metadata… + What year is it? + Where did it come from? Train on data from the Multi-Model Large Ensemble Archive
  15. TEMPERATURE – We know some metadata… + What year is it? + Where did it come from? NEURAL NETWORK CLASSIFICATION TASK: input layer, hidden layers
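The network on slide 13 is small enough to write out. A minimal Keras sketch of that architecture (2 hidden layers of 10 nodes, ridge/L2 regularization, early stopping); the grid size, regularization strength, activation, optimizer, and training settings are placeholder assumptions, not the exact values used in the talk:

```python
import tensorflow as tf
from tensorflow.keras import layers, regularizers, callbacks

n_grid = 96 * 144   # flattened lat x lon temperature map (assumed resolution)
n_models = 7        # one output class per climate model large ensemble

model = tf.keras.Sequential([
    layers.Input(shape=(n_grid,)),
    layers.Dense(10, activation='relu',
                 kernel_regularizer=regularizers.l2(0.01)),   # ridge penalty
    layers.Dense(10, activation='relu',
                 kernel_regularizer=regularizers.l2(0.01)),
    layers.Dense(n_models, activation='softmax'),             # "which model?"
])

model.compile(optimizer='adam',
              loss='categorical_crossentropy',
              metrics=['accuracy'])

# Early stopping: quit training when the validation loss stops improving
early_stop = callbacks.EarlyStopping(monitor='val_loss', patience=10,
                                     restore_best_weights=True)

# x_train: (samples, n_grid) maps from the Multi-Model Large Ensemble Archive;
# y_train: one-hot labels of the climate model each map came from
# model.fit(x_train, y_train, validation_data=(x_val, y_val),
#           epochs=500, batch_size=32, callbacks=[early_stop])
```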
  16. Machine Learning – CLIMATE MODEL MAP [DATA], CLASSIFICATION
  17. Machine Learning – CLIMATE MODEL MAP [DATA], CLASSIFICATION
  18. Machine Learning – CLIMATE MODEL MAP [DATA], CLASSIFICATION, Explainable AI – learn new science!
  19. LAYER-WISE RELEVANCE PROPAGATION (LRP) – Image-classification examples (volcano, great white shark, timber wolf) from https://heatmapping.org/. LRP heatmaps show regions of "relevance" that contribute to the neural network's decision-making process for a sample belonging to a particular output category (the "why" is backpropagated through the network).
  20. LAYER-WISE RELEVANCE PROPAGATION (LRP) – Image-classification examples (volcano, great white shark, timber wolf) from https://heatmapping.org/. LRP heatmaps show regions of "relevance" that contribute to the neural network's decision-making process for a sample belonging to a particular output category (the "why" is backpropagated through the network).
  21. LAYER-WISE RELEVANCE PROPAGATION (LRP) – Image-classification examples (volcano, great white shark, timber wolf) from https://heatmapping.org/. LRP heatmaps show regions of "relevance" that contribute to the neural network's decision-making process for a sample belonging to a particular output category (the "why" is backpropagated through the network).
  22. LAYER-WISE RELEVANCE PROPAGATION (LRP) – NOT PERFECT: image-classification example ("crock pot") from https://heatmapping.org/.
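For a fully connected network this small, the LRP idea on slides 19–22 can be illustrated in plain NumPy. A self-contained sketch of the LRP-epsilon rule for a dense ReLU network; in practice one would typically apply a dedicated toolbox (such as the one behind https://heatmapping.org/) to the trained Keras model, so the function below is only an illustration and not the authors' implementation:

```python
import numpy as np

def lrp_epsilon(weights, biases, x, eps=1e-6):
    """LRP-epsilon for a dense ReLU network; weights[i] has shape (n_in, n_out).
    Returns one relevance value per input feature (here, per grid point)."""
    # Forward pass, keeping the activations of every layer
    activations = [x]
    for i, (W, b) in enumerate(zip(weights, biases)):
        z = activations[-1] @ W + b
        a = z if i == len(weights) - 1 else np.maximum(z, 0.0)  # ReLU on hidden layers
        activations.append(a)

    # Relevance starts at the winning output neuron (the predicted climate model)
    R = np.zeros_like(activations[-1])
    R[np.argmax(activations[-1])] = activations[-1].max()

    # Redistribute relevance backwards through the layers
    for W, b, a in zip(weights[::-1], biases[::-1], activations[-2::-1]):
        z = a @ W + b
        z = z + eps * np.where(z >= 0, 1.0, -1.0)   # stabilizer
        s = R / z                                   # share of relevance per neuron
        R = a * (s @ W.T)                           # pass it to the previous layer
    return R
```

The returned vector can be reshaped back onto the latitude-longitude grid to draw relevance heatmaps like the composites on the following slides.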
  23. EXPLAINABLE AI IS NOT PERFECT – THERE ARE MANY METHODS [Adapted from Adebayo et al., 2020]
  24. EXPLAINABLE AI IS NOT PERFECT – THERE ARE MANY METHODS [Adapted from Adebayo et al., 2020]
  25. COMPARING CLIMATE MODELS – LRP (explainable AI, low to high relevance) vs. raw data (difference from multi-model mean, colder to warmer)
  26. COMPARING CLIMATE MODELS – LRP (explainable AI, low to high relevance) vs. raw data (difference from multi-model mean, colder to warmer)
  27. COMPARING CLIMATE MODELS – LRP (explainable AI, low to high relevance) vs. raw data (difference from multi-model mean, colder to warmer)
  28. COMPARING CLIMATE MODELS – LRP (explainable AI, low to high relevance) vs. raw data (difference from multi-model mean, colder to warmer)
  29. COMPARING CLIMATE MODELS – LRP (explainable AI, low to high relevance) vs. raw data (difference from multi-model mean, colder to warmer)
  30. COMPARING CLIMATE MODELS – LRP (explainable AI, low to high relevance) vs. raw data (difference from multi-model mean, colder to warmer)
  31. COMPARING CLIMATE MODELS – LRP (explainable AI, low to high relevance) vs. raw data (difference from multi-model mean, colder to warmer)
  32. COMPARING CLIMATE MODELS – LRP (explainable AI, low to high relevance) vs. raw data (difference from multi-model mean, colder to warmer). EXPLAINABLE AI
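The "raw data" panels on slides 25–32 are each model's mean map expressed relative to the multi-model mean, while the LRP panels composite the relevance maps of samples assigned to each model. A rough sketch of both steps; the array names, shapes, and placeholder values are assumptions:

```python
import numpy as np

# Hypothetical stacked archive: (models, members, years, lat, lon)
t2m_all = np.random.randn(7, 16, 70, 96, 144)

# Raw-data panel: each model's time/ensemble mean minus the multi-model mean
model_means = t2m_all.mean(axis=(1, 2))          # (models, lat, lon)
raw_difference = model_means - model_means.mean(axis=0)

# LRP panel (sketch): composite the relevance maps of all samples the ANN
# associated with climate model k; relevance_maps and predicted_labels are
# assumed outputs of the LRP and classification steps above
# lrp_composite_k = relevance_maps[predicted_labels == k].mean(axis=0)
```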
  33. What climate model does the neural network predict for each year of observations?
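Answering the question on slide 33 amounts to pushing each year of observations through the trained classifier. A minimal sketch; obs_maps, model, and model_names are assumed to exist from the earlier steps, and the 1950 start year follows slide 44:

```python
import numpy as np

# obs_maps: (n_years, n_grid) flattened annual-mean maps from ERA5-BE,
# preprocessed the same way as the training data
probs = model.predict(obs_maps)        # (n_years, n_models) softmax confidences
best = np.argmax(probs, axis=1)        # most ANN-like climate model per year

for i, k in enumerate(best):
    print(1950 + i, model_names[k], f"(confidence {probs[i, k]:.2f})")
```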
  34. APPLYING METHODOLOGY TO REGIONS – Prediction for each year in observations; pattern correlations [R] for each year in the Arctic
  35. APPLYING METHODOLOGY TO REGIONS – Prediction for each year in observations; trends in 2-m temperature from 2005 to 2019 (°C, colder to warmer)
  36. APPLYING METHODOLOGY TO REGIONS – Prediction for each year in observations; raw data (difference from multi-model mean, °C, colder to warmer)
  37. APPLYING METHODOLOGY TO REGIONS – Recent Arctic amplification (LRP relevance, low to high)
  38. APPLYING METHODOLOGY TO REGIONS – Historical period (LRP relevance, low to high)
  39. APPLYING METHODOLOGY TO REGIONS – Difference in layer-wise relevance propagation (low to high)
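Restricting the framework to a region such as the Arctic (slides 34–39) only changes the preprocessing. A sketch of the regional subset and the 2005–2019 trend map referenced on slide 35; the 65°N cutoff and the array names are assumptions:

```python
import numpy as np

# t2m: (years, lat, lon) annual-mean observations; lat, years: 1-D coordinates
arctic = lat >= 65.0                        # assumed definition of "the Arctic"
t2m_arctic = t2m[:, arctic, :]

# Linear 2-m temperature trend at every Arctic grid point over 2005-2019
sel = (years >= 2005) & (years <= 2019)
ny, nlat, nlon = t2m_arctic[sel].shape
slope = np.polyfit(years[sel], t2m_arctic[sel].reshape(ny, -1), deg=1)[0]
trend = slope.reshape(nlat, nlon) * 10.0    # degrees C per decade
```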
  40. APPLY SOFTMAX OPERATOR IN THE OUTPUT LAYER – RANK
  41. APPLY SOFTMAX OPERATOR IN THE OUTPUT LAYER – RANK: [ 0.71 ] [ 0.05 ] [ 0.01 ] [ 0.01 ] [ 0.03 ] [ 0.11 ] [ 0.08 ]
  42. APPLY SOFTMAX OPERATOR IN THE OUTPUT LAYER – [ 0.71 ] [ 0.05 ] [ 0.01 ] [ 0.01 ] [ 0.03 ] [ 0.11 ] [ 0.08 ]; RANK: [ 1 ] [ 4 ] [ 7 ] [ 6 ] [ 5 ] [ 2 ] [ 3 ]
  43. APPLY SOFTMAX OPERATOR IN THE OUTPUT LAYER – Confidence/probability [ 0.71 ] [ 0.05 ] [ 0.01 ] [ 0.01 ] [ 0.03 ] [ 0.11 ] [ 0.08 ]; RANK: [ 1 ] [ 4 ] [ 7 ] [ 6 ] [ 5 ] [ 2 ] [ 3 ]
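The confidence-to-rank step on slides 40–43 is just the softmax output sorted per year. A small sketch with made-up logits chosen only so that they roughly reproduce the numbers on the slide:

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())                  # shift by the max for stability
    return e / e.sum()

# Hypothetical raw output-layer values for one year of observations
logits = np.array([2.5, -0.2, -1.8, -1.8, -0.7, 0.6, 0.3])
confidence = softmax(logits)   # ~[0.71, 0.05, 0.01, 0.01, 0.03, 0.11, 0.08]

# Rank the climate models: 1 = highest confidence (ties break arbitrarily)
order = np.argsort(-confidence)
rank = np.empty_like(order)
rank[order] = np.arange(1, confidence.size + 1)
```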
  44. EVALUATING THE ANN'S CONFIDENCE – Confidence for a single ANN from 1950 to 2019
  45. EVALUATING THE ANN'S CONFIDENCE – 100 ANNs: combinations of training/testing splits and random seeds
  46. EVALUATING THE ANN'S CONFIDENCE
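The spread on slide 45 comes from retraining the same small network many times. A sketch of that loop; build_ann and split_ensembles are hypothetical helpers wrapping the earlier Keras sketch, and obs_maps and early_stop are assumed to exist from the previous steps:

```python
import numpy as np
import tensorflow as tf

all_probs = []
for seed in range(100):                       # 100 ANNs, as on the slide
    np.random.seed(seed)
    tf.random.set_seed(seed)

    # New random split of ensemble members into training/validation data
    # (split_ensembles is an assumed helper, not the authors' code)
    x_train, y_train, x_val, y_val = split_ensembles(seed)

    ann = build_ann()                         # same 2x10 architecture as before
    ann.fit(x_train, y_train, validation_data=(x_val, y_val),
            epochs=500, batch_size=32, callbacks=[early_stop], verbose=0)

    all_probs.append(ann.predict(obs_maps))   # (n_years, n_models) per network

# Spread of predicted confidences across the 100 networks
spread = np.percentile(np.stack(all_probs), [5, 50, 95], axis=0)
```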
  47. RANKING CLIMATE MODEL PREDICTIONS FOR EACH YEAR IN OBSERVATIONS
  48. RANKING CLIMATE MODEL PREDICTIONS FOR EACH YEAR IN OBSERVATIONS
  49. KEY POINTS – 1. Explainable neural networks can be used to identify unique differences in simulated temperature between global climate model large ensembles. 2. As a method of climate model evaluation, we input maps from observations into the neural network in order to classify each year with a climate model. 3. The neural network architecture can be used in regions with known large biases, such as over the Arctic, or with different methods of preprocessing the climate data. Zachary Labe, zmlabe@rams.colostate.edu, @ZLabe
