
Disentangling Climate Forcing in Multi-Model Large Ensembles Using Neural Networks


The relative roles of individual forcings on large-scale climate variability remain difficult to disentangle within fully-coupled global climate model simulations. Here, we train an artificial neural network (ANN) to classify the climate forcings of a new set of CESM1 initial-condition large ensembles that are forced by different combinations of aerosol (industrial and biomass burning), greenhouse gas, and land-use/land-cover forcings. As a result of learning the regional responses of internal variability to the different external forcings, the ANN is able to successfully classify the dominant forcing for each model simulation. Using recently developed explainable AI methods, such as layerwise relevance propagation, we then compare the patterns of climate variability identified by the ANN between different external climate forcings that are learned by the neural network. Further, we apply this ANN architecture on additional climate simulations from the multi-model large ensemble archive, which include all anthropogenic and natural radiative forcings. From this collection of initial-condition ensembles, the ANN is also able to detect changes in atmospheric internal variability between the 20th and 21st centuries by training on climate fields after the mean forced signal has already been removed. This ANN framework and its associated visualization tools provide a novel approach to extract complex patterns of observable and projected climate variability and trends in Earth system models. (from https://ams.confex.com/ams/101ANNUAL/meetingapp.cgi/Paper/379553)
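The classification setup described in the abstract, maps of surface temperature in, a forcing (or decade) class out, can be sketched with a minimal softmax classifier on synthetic data. This is an illustrative sketch only, not the authors' code: the study's ANN has hidden layers and is trained on CESM1 output, and all sizes and names below are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the training data: each sample is a flattened
# "temperature map", and each class is a forcing scenario whose forced
# response is a distinct fixed spatial pattern plus noise standing in
# for internal variability.
n_grid, n_classes, n_per_class = 200, 3, 120
patterns = rng.normal(size=(n_classes, n_grid))        # forced responses
X = np.vstack([p + 0.5 * rng.normal(size=(n_per_class, n_grid))
               for p in patterns])
y = np.repeat(np.arange(n_classes), n_per_class)

# One-layer softmax classifier trained by gradient descent; the paper's
# ANN inserts hidden layers, but the input/output framing is the same.
W = np.zeros((n_grid, n_classes))
b = np.zeros(n_classes)
onehot = np.eye(n_classes)[y]
for _ in range(300):
    logits = X @ W + b
    logits -= logits.max(axis=1, keepdims=True)        # numerical stability
    probs = np.exp(logits)
    probs /= probs.sum(axis=1, keepdims=True)
    grad = (probs - onehot) / len(X)
    W -= 0.1 * (X.T @ grad)
    b -= 0.1 * grad.sum(axis=0)

train_acc = ((X @ W + b).argmax(axis=1) == y).mean()
print(f"training accuracy: {train_acc:.2f}")
```

With the CESM single-forcing ensembles, `X` would instead hold flattened annual-mean temperature maps from each ensemble member and `y` the forcing scenario or decade of each map; the "fuzzy classification" confidence in `probs` is what the explainability step later interrogates.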


  1. DISENTANGLING CLIMATE FORCING IN SINGLE-FORCING LARGE ENSEMBLES USING NEURAL NETWORKS. Zachary M. Labe & Elizabeth A. Barnes (@ZLabe), Department of Atmospheric Science at Colorado State University. 14 January 2021, 20th Conference on Artificial Intelligence for Environmental Science, 101st AMS Annual Meeting
  2. THE REAL WORLD (Observations): What is the annual mean temperature of Earth?
  3. THE REAL WORLD (Observations): What is the annual mean temperature of Earth? Anomaly is relative to 1951-1980
  4. THE REAL WORLD (Observations): What is the annual mean temperature of Earth? Let’s run a climate model
  5. THE REAL WORLD (Observations): What is the annual mean temperature of Earth? Let’s run a climate model again
  6. THE REAL WORLD (Observations): What is the annual mean temperature of Earth? Let’s run a climate model again & again
  7. THE REAL WORLD (Observations): What is the annual mean temperature of Earth? CLIMATE MODEL ENSEMBLES
  8. THE REAL WORLD (Observations): What is the annual mean temperature of Earth? CLIMATE MODEL ENSEMBLES. Range of ensembles = internal variability (noise); mean of ensembles = forced response (climate change)
  9. What is the annual mean temperature of Earth? • Increasing greenhouse gases (CO2, CH4, N2O) • Changes in industrial aerosols (SO4, BC, OC) • Changes in biomass burning (aerosols) • Changes in land-use & land-cover (albedo)
  10. What is the annual mean temperature of Earth? • Increasing greenhouse gases (CO2, CH4, N2O) • Changes in industrial aerosols (SO4, BC, OC) • Changes in biomass burning (aerosols) • Changes in land-use & land-cover (albedo) • Plus everything else… (natural/internal variability)
  11. What is the annual mean temperature of Earth? ALL of: • Increasing greenhouse gases (CO2, CH4, N2O) • Changes in industrial aerosols (SO4, BC, OC) • Changes in biomass burning (aerosols) • Changes in land-use & land-cover (albedo)
  12. What is the annual mean temperature of Earth?
  13. Fully-coupled CESM1.1, 20 ensemble members, run from 1920-2080 [Deser et al. 2020, JCLI]: All forcings (CESM-LE); Greenhouse gases fixed to 1920 levels; Industrial aerosols fixed to 1920 levels. Plus reanalysis
  14. So what? Greenhouse gases = warming. Aerosols = ?? (though mostly cooling). What are the relative responses between greenhouse gas and aerosol forcing?
  15. ARTIFICIAL NEURAL NETWORK (ANN): Surface Temperature Map
  16. ARTIFICIAL NEURAL NETWORK (ANN): Surface Temperature Map → INPUT LAYER
  17. ARTIFICIAL NEURAL NETWORK (ANN): Surface Temperature Map → INPUT LAYER → HIDDEN LAYERS → OUTPUT LAYER (DECADE CLASS: “1920-1929”, “2000-2009”, “2070-2079”)
  18. ARTIFICIAL NEURAL NETWORK (ANN): Surface Temperature Map → INPUT LAYER → HIDDEN LAYERS → OUTPUT LAYER (DECADE CLASS). BACK-PROPAGATE THROUGH NETWORK = EXPLAINABLE AI
  19. ARTIFICIAL NEURAL NETWORK (ANN): Surface Temperature Map → INPUT LAYER → HIDDEN LAYERS → OUTPUT LAYER (DECADE CLASS). BACK-PROPAGATE THROUGH NETWORK = EXPLAINABLE AI, via Layer-wise Relevance Propagation [Barnes et al. 2020, JAMES; Labe and Barnes 2021, submitted]
  20. LAYER-WISE RELEVANCE PROPAGATION (LRP). Image classification examples (Volcano, Great White Shark, Timber Wolf) from https://heatmapping.org/ [Geoscience examples in Toms et al. 2020, JAMES]. LRP heatmaps show regions of “relevance” that contribute to the neural network’s decision-making process for a sample belonging to a particular output category. Neural Network → Backpropagation (LRP) → WHY?
  23. LAYER-WISE RELEVANCE PROPAGATION (LRP) is NOT PERFECT: “Crock Pot” example from https://heatmapping.org/ [Geoscience examples in Toms et al. 2020, JAMES]
  24. OUTPUT LAYER → Layer-wise Relevance Propagation. DECADE CLASS (“1920-1929”, “2000-2009”, “2070-2079”). BACK-PROPAGATE THROUGH NETWORK = EXPLAINABLE AI. WHY? = LRP HEAT MAPS [Labe and Barnes 2021, submitted]
  25. DATA, 1960-1999 ANNUAL MEAN TEMPERATURE TRENDS: Greenhouse gases fixed to 1920 levels [AEROSOLS PREVAIL]; Industrial aerosols fixed to 1920 levels [GREENHOUSE GASES PREVAIL]; All forcings [STANDARD CESM-LE]
  29. CLIMATE MODEL DATA: PREDICT THE YEAR FROM MAPS OF TEMPERATURE [Labe and Barnes 2021, submitted]
  30. OBSERVATIONS: PREDICT THE YEAR FROM MAPS OF TEMPERATURE [Labe and Barnes 2021, submitted]
  31. OBSERVATIONS, SLOPES: PREDICT THE YEAR FROM MAPS OF TEMPERATURE [Labe and Barnes 2021, submitted]
  32. ARE THE RESULTS ROBUST? YES! Combinations of training/testing data [Labe and Barnes 2021, submitted]
  33. HOW DID THE ANN MAKE ITS PREDICTIONS?
  34. HOW DID THE ANN MAKE ITS PREDICTIONS? WHY IS THERE GREATER SKILL FOR GHG+?
  35. RESULTS FROM LRP [Labe and Barnes 2021, submitted]
  36. RESULTS FROM LRP [Labe and Barnes 2021, submitted]. WHAT IS SIGNIFICANT?
  37. Uncertainty in LRP: 1. Shuffle ensemble member and year dimensions (bootstrap-like method) 2. Apply true labels (unshuffled years) 3. Apply same ANN architecture and LRP 4. Repeat 500x by using different combinations of training/testing data and initialization seeds 5. Compute 95th percentile of the distribution of LRP at all grid points [Labe and Barnes 2021, submitted]
  38. Uncertainty in LRP: Ultimately, we are trying to mask noise in the LRP output and identify robust climate pattern indicators! [Labe and Barnes 2021, submitted]
  39. RESULTS FROM LRP [Labe and Barnes 2021, submitted]
  41. AVERAGED OVER 1960-2039: Higher LRP values indicate greater relevance for the ANN’s prediction [Labe and Barnes 2021, submitted]
  42. AVERAGED OVER 1960-2039 [Labe and Barnes 2021, submitted]
  43. DISTRIBUTIONS OF LRP, AVERAGED OVER 1960-2039 [Labe and Barnes 2021, submitted]
  44. KEY POINTS: 1. Using explainable AI methods with artificial neural networks (ANNs) reveals climate patterns in large ensemble simulations 2. Metric proposed for quantifying the uncertainty of an ANN visualization method that extracts signals from different external forcings 3. ANN trained using a large ensemble simulation without time-evolving aerosols makes more accurate predictions of real world data. Zachary Labe, zmlabe@rams.colostate.edu, @ZLabe
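Layer-wise relevance propagation (slides 20-24) redistributes a network's output score backward, layer by layer, onto the input grid points. Below is a minimal NumPy sketch of the generic epsilon rule on a tiny random dense network; it is an illustration of the general technique only, since the study's actual rule and implementation (following Toms et al. 2020) may differ, and the final "null distribution" step is a crude stand-in for slide 37's full retrain-and-shuffle procedure.

```python
import numpy as np

rng = np.random.default_rng(1)

# Tiny dense ReLU network with random weights, standing in for a trained ANN.
W1, b1 = rng.normal(size=(10, 6)), rng.normal(size=6)
W2, b2 = rng.normal(size=(6, 3)), rng.normal(size=3)

x = rng.normal(size=10)            # one flattened "temperature map"
h = np.maximum(x @ W1 + b1, 0.0)   # hidden activations (ReLU)
out = h @ W2 + b2                  # class scores

def lrp_dense(a, W, R_out, eps=1e-9):
    """Epsilon-rule LRP through one dense layer: redistribute the output
    relevance R_out onto the inputs a in proportion to each input's
    contribution a_i * W_ij to the pre-activation z_j (biases ignored,
    so total relevance is approximately conserved)."""
    z = a @ W
    s = R_out / (z + eps * np.sign(z))
    return a * (W @ s)

# Put all relevance on the predicted class, then propagate backward.
R_out = np.zeros(3)
R_out[out.argmax()] = 1.0
R_hidden = lrp_dense(h, W2, R_out)
R_input = lrp_dense(x, W1, R_hidden)   # a "heat map" over grid points

# Crude analog of slide 37's significance masking: build a null
# distribution of relevance maps (here simply by shuffling the input;
# the study instead retrains 500x on shuffled data) and keep only grid
# points exceeding its 95th percentile.
null = np.array([np.abs(lrp_dense(rng.permutation(x), W1, R_hidden))
                 for _ in range(500)])
mask = np.abs(R_input) > np.percentile(null, 95, axis=0)
```

Because biases are ignored in `lrp_dense`, the input relevances sum back to (approximately) the output score, which is the conservation property that makes LRP heat maps interpretable as a decomposition of the ANN's decision.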