Program on Mathematical and Statistical Methods for Climate and the Earth System Opening Workshop, The Effects of Climate Change on Human Health through Changes in Air Quality - Jason West, Aug 23, 2017
Climate change affects human health in several different ways, and one important pathway is through changes in air pollution. Here I will discuss the current state of the science on how climate change affects air pollution, and the resulting effects on human health, drawing from the broad literature and highlighting studies from my lab. In particular, I will discuss: 1) the effects of air pollution on health globally, 2) how climate change affects air pollution, 3) the effects of climate change on air pollution and health globally, and 4) the co-benefits of greenhouse gas mitigation for air pollution and health, globally and in the US.
Global climate change affects human health most notably by increasing the frequency and intensity of dangerous heat waves, wildfires and hurricanes. In addition to extreme weather events, climate change can also lead to a myriad of persistent environmental changes that impact public health. Health impact assessment refers to the analytic framework for evaluating how a policy or program affects population health. It is frequently applied in climate and public health research to quantify future health and economic burdens attributable to various consequences of climate change.
Performing health impact assessment entails the integration of various data. For projecting future climate-related health impacts, analyses require three sources of information: (1) health effects of environmental exposures, (2) projections of future exposures, and (3) distributions of exposures and effects in the future population. Each information source is subject to uncertainty because of data availability and assumptions made about the future. Climate research is highly interdisciplinary, bringing together a tremendous amount of data, theory, and modeling effort to provide timely knowledge for one of the most pressing issues of our time. Statistical modeling techniques and probabilistic reasoning can play an important role in ensuring these findings are informative, accurate, and reproducible.
This presentation will discuss recent developments in statistical methods for quantifying health impacts of climate change, as well as related open problems in environmental epidemiology and exposure assessment.
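The three information sources above feed a standard health impact function. A minimal numerical sketch follows; the log-linear concentration-response form is the common choice in this literature, and all input numbers here are hypothetical, not from the talk:

```python
import math

def attributable_deaths(baseline_deaths, beta, delta_exposure):
    """Deaths attributable to an exposure change, using the common
    log-linear concentration-response form: AF = 1 - exp(-beta * dX)."""
    attributable_fraction = 1.0 - math.exp(-beta * delta_exposure)
    return baseline_deaths * attributable_fraction

# Illustrative (hypothetical) inputs: 10,000 baseline deaths in the
# population, a concentration-response coefficient beta from an
# epidemiological study, and a 5 ug/m3 increase in PM2.5.
deaths = attributable_deaths(baseline_deaths=10_000, beta=0.006, delta_exposure=5.0)
```

Uncertainty in each of the three inputs (the coefficient, the exposure projection, and the baseline health data) propagates directly into this estimate, which is why the probabilistic treatment discussed in the talk matters.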
Evidence suggests that exposure to elevated concentrations of air pollution during pregnancy may increase risks of birth defects and other adverse birth outcomes. While current regulations put limits on total PM2.5 concentrations, there are many speciated pollutants within this size class that likely have distinct effects on perinatal health. However, due to correlations between these speciated pollutants, it can be difficult to decipher their effects in a model for birth outcomes. To combat this difficulty, we develop a new multivariate spatio-temporal Bayesian model for speciated particulate matter using dynamic spatial factors. These spatial factors can then be interpolated to the pregnant women’s homes to be used in a birth outcomes model. The model for birth outcomes allows the impact of pollutants to vary across different weeks of the pregnancy in order to identify susceptible periods. The proposed methodology is implemented using pollutant monitoring data from the Environmental Protection Agency and birth records from the National Birth Defects Prevention Study.
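The week-specific (susceptible-period) idea can be illustrated with a small simulation. This sketch uses a crude least-squares fit as a stand-in for the talk's Bayesian spatio-temporal factor model; the sample sizes, exposure distribution, and effect values are all hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)
n_women, n_weeks = 2000, 12  # hypothetical cohort size and exposure weeks

# Simulated weekly PM2.5 exposures, as if interpolated to each woman's home
exposure = rng.gamma(shape=2.0, scale=4.0, size=(n_women, n_weeks))

# True week-specific effects: only weeks 3-6 are "susceptible"
beta_true = np.zeros(n_weeks)
beta_true[3:7] = 0.004

# Simulate a binary birth outcome with a linear-probability model
p = np.clip(0.05 + exposure @ beta_true, 0.0, 1.0)
y = rng.binomial(1, p)

# Fit week-specific coefficients by least squares (a simplification of
# the talk's model, which would also share information across weeks)
X = np.column_stack([np.ones(n_women), exposure])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
beta_hat = coef[1:]
```

With enough data, the estimated coefficients for the susceptible weeks stand out from the rest, which is the pattern the birth outcomes model is designed to recover.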
Work in collaboration with Kimberly Kaufeld, Brian Reich, Amy Herring, Gary Shaw and Maria Terres.
A presentation made at the 2015 NC BREATHE Conference by Jason West, PhD of University of North Carolina - Chapel Hill. Sponsored by Clean Air Carolina and partners, the 2015 NC BREATHE Conference was held on March 27, 2015 in Raleigh, NC to bring together air quality researchers, medical and public health professionals, and policymakers to share the latest research on the health impacts of air pollution, the positive health outcomes related to clean air policy-making, and the resulting economic benefits.
Climate change could have far-reaching consequences for human health across the 21st century. At the same time, development choices will alter underlying vulnerability to these risks, affecting the magnitude and pattern of impacts. The current and projected human health risks of climate change are diverse and wide-ranging, potentially altering the burden of any health outcome sensitive to weather or climate. Climate variability and change can affect morbidity and mortality from extreme weather and climate events, and from changes in air quality arising from changing concentrations of ozone, particulate matter, or aeroallergens. Altering weather patterns and sea level rise also may facilitate changes in the geographic range, seasonality, and incidence of selected infectious diseases in some regions, such as malaria moving into highland areas in parts of sub-Saharan Africa. Changes in water availability and agricultural productivity could affect undernutrition, particularly in parts of Asia and Africa. These risks are not independent, but will interact in complex ways with risks in other sectors. Policies and programs need to explicitly take climate change into account to facilitate sustainable and resilient societies that effectively prepare for, manage, and recover from climate-related hazards.
Estimating the Probability of Earthquake Occurrence and Return Period Using G... (sajjalp)
In this paper, the relationship between earthquake occurrence frequency and magnitude is modeled with generalized linear models for a set of earthquake data from Nepal. Goodness of fit is assessed for the generalized linear models, and, based on the model selection information criteria (the Akaike information criterion and the Bayesian information criterion), the generalized Poisson regression model is selected as the most suitable model for the study. The objective of this study is to determine the parameters (a and b values), estimate the probability of an earthquake occurrence and its return period using the Poisson regression model, and compare the results with the Gutenberg-Richter model. The study suggests that the probabilities of earthquake occurrence and the return periods estimated by the two models are relatively close to each other, although the return periods from the generalized Poisson regression model are comparatively smaller than those from the Gutenberg-Richter model.
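As a rough illustration of the quantities involved, the Gutenberg-Richter relation log10 N(M) = a - b*M can be fit to annual exceedance counts and then used to compute return periods and Poisson occurrence probabilities. This is a sketch with hypothetical counts, not the paper's generalized Poisson regression or its Nepal data:

```python
import numpy as np

# Hypothetical annual counts of earthquakes at or above each magnitude
magnitudes = np.array([4.0, 4.5, 5.0, 5.5, 6.0, 6.5])
annual_counts = np.array([60.0, 22.0, 8.0, 3.0, 1.1, 0.4])

# Gutenberg-Richter fit: log10 N(M) = a - b*M, by least squares
A = np.column_stack([np.ones_like(magnitudes), -magnitudes])
(a, b), *_ = np.linalg.lstsq(A, np.log10(annual_counts), rcond=None)

def return_period(m):
    """Mean return period (years) for events of magnitude >= m."""
    return 1.0 / (10 ** (a - b * m))

def prob_occurrence(m, t):
    """Probability of at least one event >= m within t years, assuming
    occurrences follow a Poisson process at the Gutenberg-Richter rate."""
    return 1.0 - np.exp(-t / return_period(m))
```

The Poisson-process assumption is what links the fitted rate to an occurrence probability; the paper's comparison amounts to estimating that rate two different ways.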
Particulate matter from open lot dairies and cattle feeding: recent developments (LPE Learning Center)
The full proceedings paper is at: http://www.extension.org/72731
The research community is making good progress in understanding the mechanical, biochemical, and atmospheric processes that are responsible for airborne emissions of particulate matter (PM, or dust) from open-lot livestock production, especially dairies and cattle feedyards. Recent studies in Texas, Kansas, Nebraska, Colorado, California, and Australia have expanded the available data on both emission rates and abatement measures. Although the uncertainties associated with our estimates of fugitive emissions are still unacceptably high, we have learned from our recent experience with ammonia that using a wide variety of credible measurement techniques, rather than focusing on one so-called “standard” technique, may be the better way to improve confidence in our estimates. Whereas the most promising control measures for gaseous emissions continue to be dietary strategies, with management of corral-surface moisture a close second, for particulate matter corral-surface management and moisture management play comparable roles, depending on the mechanical strength of soils and the availability of water, respectively. The cost per unit reduction of emitted mass attributable to these abatement measures varies as widely as the emissions estimates themselves, so we need to intensify our emphasis on process-based emissions research to (a) reduce the variances in our emissions estimates and (b) reduce our reliance on prior, empirically based estimates. As a general rule, although cattle feedyard emission factors may be thought a reasonable starting point for estimating emissions from open-lot dairies, such estimates should be viewed with suspicion.
Air Pollution and Multiple Sclerosis: An Ecological Study (DataNB)
The objective of this research is to explore the link between air pollution and multiple sclerosis (MS). MS is a chronic progressive neurological disease of young adulthood that results in significant physical and cognitive disability. Rates of MS in NB are among the highest in the world. Greater exposure to air pollution has previously been implicated as a risk factor, and basic science studies demonstrate that pollutants can cross the blood-brain barrier. We previously conducted a prevalence study and identified regional variation in MS prevalence in NB. To explore this geographic variability, we compared air pollution levels and MS prevalence using data housed at NB-IRDT. We stratified MS cases by geography into one of the thirty-three Health Council Communities (HCCs), and assigned long-term air pollution levels (i.e., particulate matter <2.5 µm (PM2.5), nitrogen dioxide (NO2), sulphur dioxide (SO2) and ozone (O3)) from the Canadian Urban Environmental Health Research Consortium (CANUE) to each HCC. Average pollutant levels were all below established Canadian air quality standards. We found PM2.5 was positively associated with MS prevalence. Our results offer additional evidence for a link between ambient PM2.5 and MS, even in areas with low air pollution levels such as NB.
Air is an essential element of human life. The mission of this blog is to meet the need for research into, and knowledge of, technologies that can allow people to breathe cleaner, healthier air every day, improving the quality and length of their lives.
Species distribution models (SDMs) are increasingly used to address numerous questions in ecology, biogeography, conservation biology and evolution. Surprisingly, the crucial step of selecting the most relevant variables has received little attention, despite its direct implications for model transferability and uncertainty. Here, we aim to address this with a continent-wide evaluation of which climate predictors provide the most accurate SDMs for bird distributions.
A typical problem in environmental epidemiological studies concerns environmental exposure assessment. In this talk, we will discuss challenges to environmental exposure assessment, and we will showcase and discuss statistical methods that have been developed to obtain estimates of environmental exposure (e.g., air pollution, temperature). Further, we will discuss whether and how uncertainty in the environmental exposure has been, and can be, incorporated in health analyses.
Active travel: Benefits and trade-offs - Audrey de Nazelle (IES / IAQM)
Walking and cycling as means of transportation offer convenient and low-cost opportunities to integrate healthy physical activity behaviour in daily patterns of activity. It is thus seen as an essential component of tackling physical inactivity, one of the world’s greatest public health challenges of today. However in urban environments, there may be trade-offs associated with active travel. Pedestrians and cyclists will typically inhale greater amounts of pollutants compared to other mode users in particular. This presentation will review some of the latest research on combined effects of physical activity and pollution, and discuss other potential benefits and trade-offs associated with active travel policies.
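The inhaled-pollutant trade-off described above can be expressed with the standard dose = concentration x ventilation rate x duration calculation. The commute numbers below are purely illustrative, not from the presentation:

```python
def inhaled_dose(concentration_ugm3, ventilation_m3_per_hr, duration_hr):
    """Inhaled pollutant dose (ug) = concentration x ventilation rate x time."""
    return concentration_ugm3 * ventilation_m3_per_hr * duration_hr

# Illustrative (hypothetical) commute comparison: a cyclist ventilates at a
# much higher rate than a car passenger, so even at similar roadside
# concentrations and trip times the cyclist's inhaled dose can be larger.
cyclist = inhaled_dose(concentration_ugm3=30.0, ventilation_m3_per_hr=2.0, duration_hr=0.5)
driver = inhaled_dose(concentration_ugm3=25.0, ventilation_m3_per_hr=0.6, duration_hr=0.5)
```

Whether the higher inhaled dose outweighs the physical-activity benefit is exactly the combined-effects question the talk addresses.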
Impact on Air Quality and Climate Change: Where the Dairy Industry Stands - Fr... (DAIReXNET)
Dr. Frank Mitloehner presented these materials as part of DAIReXNET's April 4, 2011 webinar entitled "Impact on Air Quality and Climate Change- Where the Dairy Industry Stands."
Civic Exchange - 2009 The Air We Breathe Conference - Science to Policy - pre... (Civic Exchange)
Civic Exchange 2009 The Air We Breathe Conference - Experts Symposium 9 January 2009
Science to Policy
presented by Ross Anderson (St George's, University of London)
http://air.dialogue.org.hk
The health implications associated with short- and long-term exposure to particulate matter measuring less than 2.5 microns (PM2.5) continue to raise concern. Certain health effects, such as asthma and chronic obstructive pulmonary disease (COPD), have long been associated with PM2.5 exposure. Research into the association between respiratory conditions and PM2.5 has been the basis for air quality regulations; however, recent literature suggests that PM2.5 exposure may lead to far more adverse health effects such as cardiovascular disease, hypertension, and low birth weight. Additionally, it now appears that PM2.5 may follow a non-threshold linear dose-response model, meaning there may be no safe level of PM2.5. If this is the case, even stricter regulations will follow, putting more pressure on industry to lower the output of PM2.5. It will also pave the way for unlimited litigation for personal harm and liability. As research involving PM2.5 exposure and human health continues, businesses must be prepared for the coming onslaught of lawsuits and ever-increasing demands to remain in compliance with stricter regulations.
Presentation by Vlatka Matkovic at the OpenDataDay event 'Towards Clean Air with Open Data'. The event took place at BeCentral in Brussels on Saturday 3 March 2018.
First appearing on the blog of Donna LaFramboise, this draft was confirmed as authentic by an IPCC spokesman, according to Justin Gillis of The New York Times. Here's the blog post: http://nofrakkingconsensus.com/2013/11/01/new-ipcc-leak-working-group-2s-summary-for-policymakers/
Here's Gillis's news story, which focuses on the draft's conclusions about agriculture: Climate Change Seen Posing Risk to Food Supplies http://nyti.ms/1iBa1tR
Modelling Grasslands GHG Balances - Dr Gary J Lanigan
LiveM_Macsur_Bilbao_2014
Recently, the machine learning community has expressed strong interest in applying latent variable modeling strategies to causal inference problems with unobserved confounding. Here, I discuss one of the big debates that occurred over the past year, and how we can move forward. I will focus specifically on the failure of point identification in this setting, and discuss how this can be used to design flexible sensitivity analyses that cleanly separate identified and unidentified components of the causal model.
I will discuss paradigmatic statistical models of inference and learning from high dimensional data, such as sparse PCA and the perceptron neural network, in the sub-linear sparsity regime. In this limit the underlying hidden signal, i.e., the low-rank matrix in PCA or the neural network weights, has a number of non-zero components that scales sub-linearly with the total dimension of the vector. I will provide explicit low-dimensional variational formulas for the asymptotic mutual information between the signal and the data in suitable sparse limits. In the setting of support recovery these formulas imply sharp 0-1 phase transitions for the asymptotic minimum mean-square-error (or generalization error in the neural network setting). A similar phase transition was analyzed recently in the context of sparse high-dimensional linear regression by Reeves et al.
Many different measurement techniques are used to record neural activity in the brains of different organisms, including fMRI, EEG, MEG, lightsheet microscopy and direct recordings with electrodes. Each of these measurement modes have their advantages and disadvantages concerning the resolution of the data in space and time, the directness of measurement of the neural activity and which organisms they can be applied to. For some of these modes and for some organisms, significant amounts of data are now available in large standardized open-source datasets. I will report on our efforts to apply causal discovery algorithms to, among others, fMRI data from the Human Connectome Project, and to lightsheet microscopy data from zebrafish larvae. In particular, I will focus on the challenges we have faced both in terms of the nature of the data and the computational features of the discovery algorithms, as well as the modeling of experimental interventions.
Bayesian Additive Regression Trees (BART) has been shown to be an effective framework for modeling nonlinear regression functions, with strong predictive performance in a variety of contexts. The BART prior over a regression function is defined by independent prior distributions on tree structure and leaf or end-node parameters. In observational data settings, Bayesian Causal Forests (BCF) has successfully adapted BART for estimating heterogeneous treatment effects, particularly in cases where standard methods yield biased estimates due to strong confounding.
We introduce BART with Targeted Smoothing, an extension which induces smoothness over a single covariate by replacing independent Gaussian leaf priors with smooth functions. We then introduce a new version of the Bayesian Causal Forest prior, which incorporates targeted smoothing for modeling heterogeneous treatment effects which vary smoothly over a target covariate. We demonstrate the utility of this approach by applying our model to a timely women's health and policy problem: comparing two dosing regimens for an early medical abortion protocol, where the outcome of interest is the probability of a successful early medical abortion procedure at varying gestational ages, conditional on patient covariates. We discuss the benefits of this approach in other women’s health and obstetrics modeling problems where gestational age is a typical covariate.
Difference-in-differences is a widely used evaluation strategy that draws causal inference from observational panel data. Its causal identification relies on the assumption of parallel trends, which is scale-dependent and may be questionable in some applications. A common alternative is a regression model that adjusts for the lagged dependent variable, which rests on the assumption of ignorability conditional on past outcomes. In the context of linear models, Angrist and Pischke (2009) show that the difference-in-differences and lagged-dependent-variable regression estimates have a bracketing relationship. Namely, for a true positive effect, if ignorability is correct, then mistakenly assuming parallel trends will overestimate the effect; in contrast, if the parallel trends assumption is correct, then mistakenly assuming ignorability will underestimate the effect. We show that the same bracketing relationship holds in general nonparametric (model-free) settings. We also extend the result to semiparametric estimation based on inverse probability weighting.
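The bracketing relationship can be demonstrated in simulation. In the sketch below (all parameters hypothetical), ignorability given the lagged outcome holds by construction, the true effect is positive, and units with low pre-period outcomes are more likely to be treated; the difference-in-differences estimate then overshoots the true effect while the lagged-dependent-variable regression recovers it:

```python
import numpy as np

rng = np.random.default_rng(1)
n, tau, rho = 50_000, 2.0, 0.5  # true effect tau; mean reversion rho < 1

y_pre = rng.normal(10.0, 3.0, n)
# Units with low pre-period outcomes are more likely treated (an
# "Ashenfelter dip"), so ignorability given y_pre holds by construction
d = rng.binomial(1, 1.0 / (1.0 + np.exp(0.5 * (y_pre - 10.0))))
y_post = rho * y_pre + tau * d + rng.normal(0.0, 1.0, n)

# Difference-in-differences: compare outcome changes across groups
did = (y_post - y_pre)[d == 1].mean() - (y_post - y_pre)[d == 0].mean()

# Lagged-dependent-variable regression: y_post ~ 1 + d + y_pre
X = np.column_stack([np.ones(n), d, y_pre])
ldv = np.linalg.lstsq(X, y_post, rcond=None)[0][1]
```

Reversing the data-generating process (parallel trends true, ignorability false) flips the direction of the bias, which is the other half of the bracketing result.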
We develop sensitivity analyses for weak nulls in matched observational studies while allowing unit-level treatment effects to vary. In contrast to randomized experiments and paired observational studies, we show for general matched designs that over a large class of test statistics, any valid sensitivity analysis for the weak null must be unnecessarily conservative if Fisher's sharp null of no treatment effect for any individual also holds. We present a sensitivity analysis valid for the weak null, and illustrate why it is conservative if the sharp null holds through connections to inverse probability weighted estimators. An alternative procedure is presented that is asymptotically sharp if treatment effects are constant, and is valid for the weak null under additional assumptions which may be deemed reasonable by practitioners. The methods may be applied to matched observational studies constructed using any optimal without-replacement matching algorithm, allowing practitioners to assess robustness to hidden bias while allowing for treatment effect heterogeneity.
The world of health care is full of policy interventions: a state expands eligibility rules for its Medicaid program, a medical society changes its recommendations for screening frequency, a hospital implements a new care coordination program. After a policy change, we often want to know, “Did it work?” This is a causal question; we want to know whether the policy CAUSED outcomes to change. One popular way of estimating causal effects of policy interventions is a difference-in-differences study. In this controlled pre-post design, we measure the change in outcomes of people who are exposed to the new policy, comparing average outcomes before and after the policy is implemented. We contrast that change to the change over the same time period in people who were not exposed to the new policy. The differential change in the treated group’s outcomes, compared to the change in the comparison group’s outcomes, may be interpreted as the causal effect of the policy. To do so, we must assume that the comparison group’s outcome change is a good proxy for the treated group’s (counterfactual) outcome change in the absence of the policy. This conceptual simplicity and wide applicability in policy settings makes difference-in-differences an appealing study design. However, the apparent simplicity belies a thicket of conceptual, causal, and statistical complexity. In this talk, I will introduce the fundamentals of difference-in-differences studies and discuss recent innovations including key assumptions and ways to assess their plausibility, estimation, inference, and robustness checks.
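In the simplest two-group, two-period case, the controlled pre-post comparison described above reduces to arithmetic on four means. The numbers here are hypothetical:

```python
# Hypothetical mean outcomes by group and period
treated_pre, treated_post = 12.0, 18.0
control_pre, control_post = 11.0, 14.0

treated_change = treated_post - treated_pre   # change among the exposed
control_change = control_post - control_pre   # proxy for the counterfactual trend
did_estimate = treated_change - control_change
```

Under the parallel-trends assumption, `did_estimate` is the causal effect of the policy; everything else in the talk (assessing plausibility, inference, robustness checks) is about when that interpretation can be trusted.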
We present recent advances and statistical developments for evaluating Dynamic Treatment Regimes (DTR), which allow the treatment to be dynamically tailored according to evolving subject-level data. Identification of an optimal DTR is a key component for precision medicine and personalized health care. This talk covers several recent projects that develop robust and flexible methods for this research area. We will first introduce a dynamic statistical learning method, adaptive contrast weighted learning (ACWL), which combines doubly robust semiparametric regression estimators with flexible machine learning methods. We will further develop a tree-based reinforcement learning (T-RL) method, which builds an unsupervised decision tree that maintains the nature of batch-mode reinforcement learning. Unlike ACWL, T-RL handles the optimization problem with multiple treatment comparisons directly through a purity measure constructed with augmented inverse probability weighted estimators. T-RL is robust, efficient, and easy to interpret for the identification of optimal DTRs. However, ACWL seems more robust against tree-type misspecification than T-RL when the true optimal DTR is non-tree-type. At the end of this talk, we will also present a new Stochastic Tree Search method called ST-RL for evaluating optimal DTRs.
A fundamental feature of evaluating causal health effects of air quality regulations is that air pollution moves through space, rendering health outcomes at a particular population location dependent upon regulatory actions taken at multiple, possibly distant, pollution sources. Motivated by studies of the public-health impacts of power plant regulations in the U.S., this talk introduces the novel setting of bipartite causal inference with interference, which arises when 1) treatments are defined on observational units that are distinct from those at which outcomes are measured and 2) there is interference between units in the sense that outcomes for some units depend on the treatments assigned to many other units. Interference in this setting arises due to complex exposure patterns dictated by physical-chemical atmospheric processes of pollution transport, with intervention effects framed as propagating across a bipartite network of power plants and residential zip codes. New causal estimands are introduced for the bipartite setting, along with an estimation approach based on generalized propensity scores for treatments on a network. The new methods are deployed to estimate how emission-reduction technologies implemented at coal-fired power plants causally affect health outcomes among Medicare beneficiaries in the U.S.
Laine Thomas presented information about how causal inference is being used to determine the cost/benefit of the two most common surgical treatments for women: hysterectomy and myomectomy.
We provide an overview of some recent developments in machine learning tools for dynamic treatment regime discovery in precision medicine. The first development is a new off-policy reinforcement learning tool for continual learning in mobile health to enable patients with type 1 diabetes to exercise safely. The second development is a new inverse reinforcement learning tool that enables use of observational data to learn how clinicians balance competing priorities for treating depression and mania in patients with bipolar disorder. Both practical and technical challenges are discussed.
The method of differences-in-differences (DID) is widely used to estimate causal effects. The primary advantage of DID is that it can account for time-invariant bias from unobserved confounders. However, the standard DID estimator will be biased if there is an interaction between history in the after period and the groups. That is, bias will be present if an event besides the treatment occurs at the same time and affects the treated group in a differential fashion. We present a method of bounds based on DID that accounts for an unmeasured confounder that has a differential effect in the post-treatment time period. These DID bracketing bounds are simple to implement and only require partitioning the controls into two separate groups. We also develop two key extensions for DID bracketing bounds. First, we develop a new falsification test to probe the key assumption that is necessary for the bounds estimator to provide consistent estimates of the treatment effect. Next, we develop a method of sensitivity analysis that adjusts the bounds for possible bias based on differences between the treated and control units from the pretreatment period. We apply these DID bracketing bounds and the new methods we develop to an application on the effect of voter identification laws on turnout. Specifically, we focus on estimating whether the enactment of voter identification laws in Georgia and Indiana had an effect on voter turnout.
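A minimal sketch of the bracketing idea, assuming (as the abstract suggests) that partitioning the controls into two groups yields two DID estimates that bound the effect; the turnout numbers and group names are hypothetical, not data from the voter-identification application:

```python
# Sketch of DID bracketing bounds: compute a standard DID estimate against
# each of two control partitions and report the min/max as bounds.
# Hypothetical turnout fractions; not data from the voter-ID application.

def did(treated_pre, treated_post, control_pre, control_post):
    return (treated_post - treated_pre) - (control_post - control_pre)

def bracketing_bounds(treated_pre, treated_post, control_groups):
    """control_groups: dict mapping group name -> (pre_mean, post_mean)."""
    estimates = [did(treated_pre, treated_post, pre, post)
                 for pre, post in control_groups.values()]
    return min(estimates), max(estimates)

lo, hi = bracketing_bounds(0.52, 0.58,
                           {"controls_A": (0.50, 0.53),
                            "controls_B": (0.51, 0.56)})
print(round(lo, 2), round(hi, 2))  # 0.01 0.03
```

Each control partition gives a different counterfactual trend, and the treatment effect is bracketed between the two resulting estimates under the assumptions the abstract describes.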
We study experimental design in large-scale stochastic systems with substantial uncertainty and structured cross-unit interference. We consider the problem of a platform that seeks to optimize supply-side payments p in a centralized marketplace where different suppliers interact via their effects on the overall supply-demand equilibrium, and propose a class of local experimentation schemes that can be used to optimize these payments without perturbing the overall market equilibrium. We show that, as the system size grows, our scheme can estimate the gradient of the platform’s utility with respect to p while perturbing the overall market equilibrium by only a vanishingly small amount. We can then use these gradient estimates to optimize p via any stochastic first-order optimization method. These results stem from the insight that, while the system involves a large number of interacting units, any interference can only be channeled through a small number of key statistics, and this structure allows us to accurately predict feedback effects that arise from global system changes using only information collected while remaining in equilibrium.
We discuss a general roadmap for generating causal inference based on observational studies used to generate real-world evidence. We review targeted minimum loss estimation (TMLE), which provides a general template for the construction of asymptotically efficient plug-in estimators of a target estimand for realistic (i.e., infinite-dimensional) statistical models. TMLE is a two-stage procedure that first involves using ensemble machine learning, termed super-learning, to estimate the relevant stochastic relations between the treatment, censoring, covariates, and outcome of interest. The super-learner allows one to fully utilize all the advances in machine learning (in addition to more conventional parametric model based estimators) to build a single most powerful ensemble machine learning algorithm. We present the Highly Adaptive Lasso as an important machine learning algorithm to include.
In the second step, the TMLE involves maximizing a parametric likelihood along a so-called least favorable parametric model through the super-learner fit of the relevant stochastic relations in the observed data. This second step bridges the state of the art in machine learning to estimators of target estimands for which statistical inference is available (i.e., confidence intervals, p-values, etc.). We also review recent advances in collaborative TMLE, in which the fit of the treatment and censoring mechanism is tailored w.r.t. performance of the TMLE, and discuss asymptotically valid bootstrap-based inference. Simulations and data analyses are provided as demonstrations.
We describe different approaches for specifying models and prior distributions for estimating heterogeneous treatment effects using Bayesian nonparametric models. We make an affirmative case for direct, informative (or partially informative) prior distributions on heterogeneous treatment effects, especially when treatment effect size and treatment effect variation are small relative to other sources of variability. We also consider how to provide scientifically meaningful summaries of complicated, high-dimensional posterior distributions over heterogeneous treatment effects with appropriate measures of uncertainty.
Climate change mitigation has traditionally been analyzed as some version of a public goods game (PGG) in which a group is most successful if everybody contributes, but players are best off individually by not contributing anything (i.e., “free-riding”)—thereby creating a social dilemma. Analysis of climate change using the PGG and its variants has helped explain why global cooperation on GHG reductions is so difficult, as nations have an incentive to free-ride on the reductions of others. Rather than inspire collective action, it seems that the lack of progress in addressing the climate crisis is driving the search for a “quick fix” technological solution that circumvents the need for cooperation.
This seminar discussed ways to produce professional academic writing, from papers and research proposals to technical writing in general.
Machine learning (including deep and reinforcement learning) and blockchain are two of the most notable technologies in recent years. The first is the foundation of artificial intelligence and big data, and the second has significantly disrupted the financial industry. Both technologies are data-driven, and thus there is rapidly growing interest in integrating them for more secure and efficient data sharing and analysis. In this paper, we review the research on combining blockchain and machine learning technologies and demonstrate that they can collaborate efficiently and effectively. In the end, we point out some future directions and expect more research on deeper integration of the two promising technologies.
In this talk, we discuss QuTrack, a Blockchain-based approach to track experiment and model changes primarily for AI and ML models. In addition, we discuss how change analytics can be used for process improvement and to enhance the model development and deployment processes.
1. The Effects of Climate Change on Human Health Through Changes in Air Quality
J. Jason West
Department of Environmental Sciences & Engineering
University of North Carolina, Chapel Hill
2. Beijing – January 14, 2013
Some stations reported >500 µg m-3 24-hr avg. PM2.5 (>20 times the WHO guideline)
3.
4. • Air pollution is underappreciated for global health.
• Air pollution and its health impacts are changing globally, and will change in ways interrelated with climate change.
• Air pollution science offers new possibilities: new measurement methods measuring more chemical components, cheap sensors that can be widely deployed, satellites, and models.
• There is a need for the communities of air pollution science and air pollution health effects science to work together better.
5. Plan for today
1) How many people die each year due to exposure to ambient air pollution?
2) How will climate change affect air pollution and air pollution-related deaths?
3) If we slow down climate change, what are the benefits for air pollution and health?
6. Modeling Global Air Pollution & Health
[Schematic: Emissions and Meteorology feed a Chemical Mechanism to produce Concentrations; a Health Function combined with Incidence Data yields Health Impacts.]
Typical horizontal resolutions: global model, 5 to 0.5 degrees; regional (continental) model, 50 to 10 km.
7.
8. Health impact function
∆Mort = y0 x AF x Pop = y0 x (1 − exp(−β∆X)) x Pop
where y0 = baseline mortality rate, β = concentration-response factor, ∆X = change in concentration, and Pop = exposed population.
Causes of death:
- Respiratory diseases (RESP), inc. COPD (chronic obstructive pulmonary disease)
- Cardiovascular diseases, inc. IHD (ischemic heart disease) and STROKE (cerebrovascular disease)
- Lung cancer (LC)
(RESP plus cardiovascular ≈ CPD)
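The health impact function on this slide can be sketched directly in code; the input values below are illustrative placeholders, not estimates from the talk:

```python
import math

def delta_mortality(y0, beta, delta_x, pop):
    """Excess deaths: dMort = y0 x AF x Pop, with AF = 1 - exp(-beta * dX).

    y0      -- baseline mortality rate (deaths per person per year)
    beta    -- concentration-response factor
    delta_x -- change in pollutant concentration (e.g., ug/m3 of PM2.5)
    pop     -- exposed population
    """
    attributable_fraction = 1.0 - math.exp(-beta * delta_x)
    return y0 * attributable_fraction * pop

# Illustrative inputs: baseline rate 0.005/yr, beta = 0.006 per ug/m3,
# a 10 ug/m3 increase in PM2.5, and 1 million exposed people.
print(round(delta_mortality(0.005, 0.006, 10.0, 1_000_000)))  # 291
```

In practice, as the surrounding slides show, y0 and Pop are gridded fields and the calculation is repeated per grid cell and cause of death, then summed.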
9. Population and Baseline Mortality Rates
Total population: 6,946 million; population age 25+: 3,839 million (LandScan 2011 at 30"x30", gridded to 0.67°x0.5°).
Baseline mortality rates (deaths per year per 100,000) for IHD, COPD, stroke, and LC (GBD 2010, country level, all ages, gridded to 0.67°x0.5°).
11. Bias in US Deaths from PM2.5
[Chart: deaths/year (0–80,000) vs. grid resolution (km, 0–400), from CMAQ model output at 12-km grid. Coarse-resolution estimates are 30–40% lower than the 12-km grid estimate.]
Punger and West (AQAH, 2013)
12. US Bias is different for Satellite PM2.5!
[Chart: PWC (µg/m3, 0–12) vs. grid resolution (km, 0–500), comparing CMAQ and satellite-derived PM2.5.]
Akita et al., in prep.
13. −20% Global Anthrop. Methane Emissions: 30,200 avoided premature deaths in 2030 due to reduced ozone
West et al., PNAS, 2006
14. Ozone from N. American and European emissions causes more deaths outside of those regions than within
Avoided deaths (hundreds) from 20% regional ozone precursor reductions, based on HTAP simulations, Anenberg et al. (EST, 2009)
15. Global mortality burden – ACCMIP ensemble
Ozone-related mortality: 470,000 (95% CI: 140,000–900,000); PM2.5-related mortality(*): 2.1 million (95% CI: 1.3–3.0 million)
(*) PM2.5 calculated as a sum of species (dark blue) or as reported by 4 models (dark green); light-colored bars use a low-concentration threshold (5.8 µg m-3)
Silva et al. (ERL, 2013)
16. Global Burden: Ozone-related mortality
Respiratory mortality, deaths yr-1 (1000 km2)-1, multi-model mean in each grid cell, 14 models.
Global and regional mortality per year (total deaths; deaths per million people*):
North America: 34,400; 121
Europe: 32,800; 96
Former Soviet Union: 10,600; 66
Middle East: 16,200; 68
India: 118,000; 212
East Asia: 203,000; 230
Southeast Asia: 33,300; 119
South America: 6,970; 38
Africa: 17,300; 73
Australia: 469; 29
Global: 472,000; 149
(*) Exposed population (age 30 and older)
Silva et al. (ERL 2013)
17. Global Burden: PM2.5-related mortality
CPD+LC mortality, deaths yr-1 (1000 km2)-1, multi-model mean in each grid cell, 6 models.
Global and regional mortality per year (total deaths; deaths per million people*):
North America: 43,000; 152
Europe: 154,000; 448
Former Soviet Union: 128,000; 793
Middle East: 88,700; 371
India: 397,000; 715
East Asia: 1,049,000; 1,191
Southeast Asia: 158,000; 564
South America: 16,800; 92
Africa: 77,500; 327
Australia: 1,250; 78
Global: 2,110,000; 665
(*) Exposed population (age 30 and older)
Silva et al. (ERL 2013)
18. Global burden of disease of air pollution (2015)
Cohen et al., Lancet, 2017
Global deaths per year:
Ambient PM2.5 pollution: 4.2 (3.7–4.8) million
Ambient ozone pollution: 0.25 (0.10–0.42) million
Household air pollution from solid fuels: 2.8 (2.2–3.6) million
1 in 12 deaths globally!
19.
20. GBD: US Deaths per Year (2015)
Ambient PM2.5 pollution: 88,000
Ambient ozone pollution: 12,000
Household air pollution from solid fuels: 0
Other US estimates:
Ambient PM2.5 pollution: 66,000 (39,000–85,000) (Punger & West, 2013); 130,000 (Fann et al., 2012)
Ambient ozone pollution: 21,000 (6,000–34,000) (Punger & West, 2013); 4,700 (Fann et al., 2012)
1 in 26 US deaths!
21. Ozone-related mortality (sectors zeroed-out)
Contributions of each sector to total ozone respiratory mortality, fraction of total burden in each cell.
Sectors shown: Land Transportation; Energy; Industry; Residential & Commercial.
Panel totals (as shown in figure): 45,600; 65,200; 80,900; and 53,700 deaths/year.
Silva et al. (EHP, 2016)
22. PM2.5-related mortality (sectors zeroed-out)
Contributions of each sector to total PM2.5 mortality (IHD+Stroke+COPD+LC), fraction of total burden in each cell.
Sectors shown: Land Transportation; Energy; Industry; Residential & Commercial.
Panel totals (as shown in figure): 290,000; 323,000; 212,000; and 675,000 deaths/year.
Silva et al. (EHP, 2016)
23. Future ozone-related mortality – ACCMIP
Global respiratory premature ozone mortality: 2030, 2050 and 2100 vs. 2000 concentrations.
Uncertainty for the ensemble mean is a 95% CI including uncertainty in RR and across models.
Silva et al. (ACP, 2016)
24. Future PM2.5-related mortality – ACCMIP
Global CPD+LC premature PM2.5 mortality: 2030, 2050 and 2100 vs. 2000 concentrations.
Uncertainty for the ensemble mean is a 95% CI including uncertainty in the RR and across models.
Silva et al. (ACP 2016)
26. Summary schematic of air quality–climate connections (Climate ↔ Air Pollution)
Figure 2, Fiore, Naik, Leibensperger, JAWMA, 2015
27. Impact of RCP8.5 Climate Change on Global Air Pollution Mortality: ACCMIP Models
Ozone, mean (95% CI), thousand deaths yr-1: 2030: 3 (−30, 47); 2100: 44 (−195, 237)
PM2.5, mean (95% CI), thousand deaths yr-1: 2030: 56 (−34, 164); 2100: 215 (−76, 595)
Silva et al. Nat Clim Ch, 2017
28. Impact of RCP8.5 Climate Change on Global Air Pollution Mortality: ACCMIP Models
Silva et al. Nat Clim Ch, 2017
29. Impact of RCP8.5 Climate Change on Global Air Pollution Mortality: ACCMIP Models
Silva et al. Nat Clim Ch, 2017
30. EPA project with Columbia Univ (Arlene Fiore, PI): Large multi-chemistry-climate-model ensembles and high-res downscaling, evaluated with observations
2 CMIP5/IPCC AR5 chemistry-climate models:
CM3 model [e.g., Donner et al., 2011]
• 3-member ensembles for 2006–2100: RCP4.5, RCP8.5, and RCP8.5_WMGG (climate change from WMGHGs but PM + precursor emissions held at 2005)
• 400 years from “2000 Control” to examine climate variability
[e.g., Neale et al., 2012]
• 40-member large ensemble with monthly PM2.5 components archived
• ~15-member medium ensembles with daily PM2.5 components archived
• 400 years from “2000 Control” to examine climate variability
Targeted high-resolution downscaling with CMAQ:
• Select targeted years in ~6 U.S. regions (CONUS, multiple selected years, mid-century)
• Provide statistical context for GFDL-CMAQ simulations + climate variability
• Examine sensitivities (vulnerability) to meteorologically-sensitive emissions
Combine fine-scale information with statistical power of large ensemble
31. Co-benefits of GHG Mitigation for Air Quality
[Schematic linking Sources & Policies, air pollutants, GHGs, air pollution, climate change, and human health.]
Objective: Analyze global co-benefits for air quality and human health to 2100 via both mechanisms:
1) Immediate and Local
2) Long-Term and Global
32. Approach
• Use the GCAM reference for emissions rather than RCP8.5, for consistency with RCP4.5.
• Simulations conducted in MOZART-4.
- 2° x 2.5° horizontal resolution.
- 5 meteorology years for each case.
- Fixed methane concentrations.
- Compares well with ACCMIP RCP4.5.
Simulation cases (Years | Emissions, GCAM | Meteorology, GFDL AM3 | Name):
2000 | 2000 | 2000 | 2000
2030, 2050, 2100 | GCAM Reference | RCP8.5 | REF
2030, 2050, 2100 | RCP4.5 | RCP4.5 | RCP4.5
2030, 2050, 2100 | GCAM Reference | RCP4.5 | eREFm45
33. Co-benefits – PM2.5 Concentration
Global population-weighted, annual average PM2.5
West et al. NCC 2013
34. Co-benefits – Global Premature Mortality
PM2.5 co-benefits (CPD + lung cancer mortality): 2030: 0.4±0.2 million yr-1; 2050: 1.1±0.5; 2100: 1.5±0.6
Ozone co-benefits (respiratory mortality): 2030: 0.09±0.06; 2050: 0.2±0.1; 2100: 0.7±0.5
Projection of global population and baseline mortality rates from International Futures.
West et al. NCC 2013
35. Co-benefits – Valuation of Avoided Mortality
Red: high valuation (2030 global mean $3.6 million); Blue: low valuation (2030 global mean $1.2 million); Green: median and range of global C price (13 models)
West et al. NCC 2013
36. Downscaling Co-benefits to USA (2050), RCP4.5 − REF
(a) PM2.5 (annual avg.): US mean = 0.47 µg/m3; (b) Ozone (May–Oct MDA8): US mean = 3.55 ppbv
Zhang et al. ACP, 2016
37. Downscaling Co-benefits to USA (2050)
PM2.5: domestic 0.35 µg/m3, foreign 0.12 µg/m3; most PM2.5 co-benefits come from domestic reductions.
Ozone: domestic 0.86 ppb, foreign 2.69 ppb; most ozone co-benefits come from foreign and methane reductions.
Zhang et al. ACP, 2016
38. Domestic vs. Foreign Co-benefits: PM2.5 (Domestic and Foreign maps)
Domestic GHG mitigation accounts for 85% of the total avoided PM2.5 mortality.
Zhang et al. ERL, submitted
39. Domestic vs. Foreign Co-benefits: O3 (Domestic and Foreign maps)
Foreign countries’ GHG mitigation accounts for 62% of the total avoided O3-related deaths.
Zhang et al. ERL, submitted
40. US Co-benefits in 2050
• Avoided premature deaths from GHG mitigation: 16,000 (CI: 11,700–20,300) from PM2.5, and 8,000 (CI: 3,600–12,400) from O3.
• Avoided heat stress mortality from RCP4.5 relative to RCP8.5: 2,340 (CI: 1,370–3,320) (Ying Li).
• Monetized co-benefits in 2050 are $49 (32–67) per ton CO2 reduced at low VSL, $148 (96–201) at high VSL.
• Foreign GHG mitigation accounts for 62% of the total avoided deaths from O3, and 15% for PM2.5.
• Previous regional or national co-benefits studies may underestimate the full co-benefits of coordinated global actions.
• The U.S. can gain significantly greater co-benefits, especially for ozone, by collaborating with other countries to combat climate change.
Zhang et al. ERL, submitted; Li et al., in preparation
41. Thank you
Contributions from
Students: Raquel Silva, Yuqiang Zhang, Susan Anenberg, Zac Adelman, Meridith Fry
Others: Steve Smith, Vaishali Naik, Larry Horowitz, Jared Bowden, Jean-Francois Lamarque, Drew Shindell, ACCMIP modelers
Funding Sources:
• EPA STAR Grant #834285
• NIEHS Grant #1 R21 ES022600-01
• EPA Office of Air Quality Planning and Standards
• Portugal Foundation for Science and Technology Fellowship
• EPA STAR Fellowship
• US Department of Energy, Office of Science
• Merck Foundation
• International Council on Clean Transportation
• NOAA GFDL for computing resources
UNC Climate Health and Air Quality Lab
www.unc.edu/~jjwest
45. Our co-benefits approach: advantages
• First co-benefits study to use a global atmospheric model.
– Capture effects of long-range transport and methane.
• First to estimate co-benefits by two mechanisms.
• Use consistent future scenarios built on RCP4.5: emissions, population, economics (valuation).
• By embedding the US study within a prior global study, we capture US co-benefits from foreign reductions.
46. Results – PM2.5 Concentration
[Maps for 2050 and 2100, annual average PM2.5: Total change (RCP4.5 − REF); Meteorology (eREFm45 − REF); Emissions (RCP4.5 − eREFm45)]
West et al. NCC 2013
47. Results – Ozone Concentration
Global population-weighted, max. 6-month average of 1-hr daily max ozone.
West et al. NCC 2013
48. Results – Ozone Concentration
[Maps for 2050 and 2100, max. 6-month average of 1-hr daily max ozone: Total co-benefit (RCP4.5 − REF); #2 Meteorology (eREFm45 − REF); #1 Emissions (RCP4.5 − eREFm45)]
West et al. NCC 2013
50. Co-benefits: conclusions
• Global abatement of GHG emissions brings substantial air quality and human health co-benefits.
• Global GHG mitigation (RCP4.5 relative to REF) causes 0.5±0.2 million avoided deaths in 2030, 1.3±0.5 in 2050, and 2.2±0.8 in 2100.
• Global average monetized co-benefits are $50–380 / ton CO2
– Greater than previous estimates
– Greater than abatement costs in 2030 and 2050.
• The direct co-benefits from air pollutant emission reductions exceed those via slowing climate change.
West et al. NCC 2013