Climate change is projected to have drastic impacts in southern Africa during the 21st century under low-mitigation futures (Niang et al., 2014). African temperatures are projected to rise rapidly, in the subtropics at a rate at least 1.5 times the global rate of temperature increase (James and Washington, 2013; Engelbrecht et al., 2015). Moreover, the southern African region is projected to become generally drier under enhanced anthropogenic forcing (Christensen et al., 2007; Engelbrecht et al., 2009; James and Washington, 2013; Niang et al., 2014). These changes in temperature and rainfall patterns will plausibly have a range of impacts in South Africa, including impacts on energy demand (in terms of achieving human comfort within buildings and factories), agriculture (e.g. reductions in maize yield under higher temperatures and reduced soil moisture), livestock production (e.g. higher cattle mortality as a result of oppressive temperatures) and water security (through reduced rainfall and enhanced evapotranspiration) (Engelbrecht et al., 2015).
The climate and earth sciences have recently undergone a rapid transformation from a data-poor to a data-rich environment. In particular, massive amounts of data about the Earth and its environment are now being generated continuously by a large number of Earth-observing satellites as well as by physics-based earth system models running on large-scale computational platforms. These massive and information-rich datasets offer huge potential for understanding how the Earth's climate and ecosystems have been changing and how they are being impacted by human actions. This talk will discuss the challenges involved in analyzing these massive datasets as well as the opportunities they present both for advancing machine learning and for advancing the science of climate change, in the context of monitoring the state of tropical forests and surface water on a global scale.
Satellite passive microwave measurements of the climate crisis - Chelle Gentemann
Invited presentation at the NASEM Committee on Radio Frequencies 2021 Fall Meeting. An overview of how passive microwave measurements are used to understand climate change.
Hurricanes and Global Warming - Dr. Kerry Emanuel (presented by John Atkeison)
Dr. Kerry Emanuel explains how global warming has increased the power of hurricanes. Hurricane Katrina is discussed, with the conclusion that Katrina probably would not have had the power to break the New Orleans levees in a pre-global-warming world. April 2009 webinar presented by the Southern Alliance for Clean Energy (http://www.cleanenergy.org/) and the Gulf Restoration Network (http://healthygulf.org/). SlideCast by John Atkeison of the Alliance for Affordable Energy. There is a very small amount of phone noise.
Meridional brightness temperatures were measured on the surface of Titan during the 2004–2014 portion of the Cassini mission by the Composite Infrared Spectrometer. Temperatures mapped from pole to pole during five two-year periods show a marked seasonal dependence. The surface temperature near the south pole over this time decreased by 2 K, from 91.7±0.3 to 89.7±0.5 K, while at the north pole the temperature increased by 1 K, from 90.7±0.5 to 91.5±0.2 K. The latitude of maximum temperature moved from 19°S to 16°N, tracking the subsolar latitude. As the latitude changed, the maximum temperature remained constant at 93.65±0.15 K. In 2010 our temperatures repeated the north–south symmetry seen by Voyager one Titan year earlier in 1980. Early in the mission, temperatures at all latitudes agreed with GCM predictions, but by 2014 temperatures in the north were lower than modeled by 1 K. The temperature rise in the north may be delayed by cooling of sea surfaces and moist ground brought on by seasonal methane precipitation and evaporation.
Optimization and Statistical Learning in Geophysical Applications - AndreyVlasenko15
Sea surface temperature (SST) anomalies are believed to be the most important driver of forced atmospheric circulation, which causes temperature anomalies (i.e., anomalously warm or cold winters). We present two independent methods for predicting the temperature at 2 meters (T2M) from SST. The first method uses a statistical learning algorithm based on maximum correlation analysis; the second relies on algorithmic differentiation of the climate model, estimating the sensitivity of T2M to SST.
Convective Heat Transfer Measurements at the Martian Surface - Samet Baykul
DATE: 2018.11.28
This is a review of an article which introduces a new sensing method to characterize the convective wind activity on the surface of Mars
TOPICS:
• Introduction
• Objective of the Study
• Cited Studies
• Setup
• Mathematical Model
• Experimental Backing
• Results and Discussions
• Conclusions
• Suggestions
Climate change scenarios in context of the less than 2C global temperature ta... - NAP Events
Presented by: Wilfran Moufouma-Okia
3.1 Technical guidance on NAPs
The session will take the participants through the technical guidance for NAPs, including: the NAP guidelines, guiding principles for adaptation under the Convention, and subsequent products developed by the LEG such as the sample NAP process. It will further look at detailed aspects of undertaking assessments by going through the best available methods and tools, using the assessment of crop production as an example. Countries will further provide practical experiences in applying the guidance in the formulation of their NAPs.
As the analysis by Kalra & Paddock (2016) demonstrated, traditional crash data and analysis approaches may require hundreds of millions or billions of self-driving miles to achieve sufficient power to demonstrate that automated vehicles (AVs) have lower injury/fatality risk than human-driven vehicles. Moreover, crash risk for AVs is a moving target as algorithms and systems change, and the mistakes AVs will make are not necessarily the same mistakes humans make. Thus, we need to rethink both the data that will make up transportation safety datasets in the near future and the analytical approaches used. I will present some newer data-collection approaches along with some specific challenges that might call for different analytical approaches than are being used for crash data today.
Highway crash data, with an average of 39 thousand fatalities and 2.4 million nonfatal injuries per year, have repetitive and predictable patterns and may benefit from statistical predictive models to enhance highway safety and operational efforts to reduce crash fatalities and injuries. Highway crashes have patterns that repeat over fixed periods of time within the data set for crash types such as motorcycle, bicycle, pedestrian, nighttime, fixed-object, weekend, and winter crashes. In some states, these patterns are weekly, monthly, or seasonal. Contributing factors such as age category, light condition, weather, weekday, the underlying state of the economy, and others affect these variations.
This talk will review dynamic modeling and prediction for temporal and spatio-temporal data and describe algorithms for suitable state space models. Use of dynamic models for modeling crash types by severity will be briefly illustrated. Extension of these approaches for handling irregular temporal spacing and spatial sparseness will be discussed, and a potential application to travel time prediction will be explored.
Crashes on limited-access roadways typically occur when drivers are unable to react in time to avoid collisions with vehicles ahead of them that are either moving slower or merging unexpectedly. Prevailing traffic-stream conditions with high volume and low or variable speed downstream of low-volume, high-speed conditions can increase the possibility of such collisions. Real-time trajectories of vehicles collected through crowdsourcing methods can give information about the distribution of speeds in the traffic stream by space and time. Spatio-temporal models relating these observed speed distributions to the occurrence of crashes or near-crashes can help to identify crash-prone traffic conditions as they arise, offering the opportunity to warn drivers before crashes occur.
As we prepare for a future of driverless cars, what new risks must we work to understand? Despite the connotation of driverless, we can expect that humans will remain in the loop at each iteration of increasingly autonomous technology integration. While our technology is advancing, our population and economics are also transitioning to present challenging paradigm shifts that we should account for in assessing the risks of driverless cars. Let us take this holistic systems engineering approach to exploring transportation at the Statistical and Applied Mathematical Sciences Institute.
Highlights topics of discussion on remote sensing during Day 1 of Program on Mathematical and Statistical Methods for Climate and the Earth System Opening Workshop.
Remote-sensing data offer unprecedented opportunities to address Earth-system-science challenges, such as understanding the relationship between the atmosphere and Earth's surface using physics, chemistry, biology, mathematics, and computing. Statistical methods have often been seen as a hybrid of the latter two, so that a lot of attention has been given to computing estimates but far less to quantifying the uncertainty of the estimates. In my "bird's-eye view," I shall give a way to look at the problem using conditional probability models and three states of knowledge. Examples will be given of analyzing remotely sensed data of a leading greenhouse gas, carbon dioxide.
COUNTER-INTUITY OF COMPLEX SYSTEMS: WEATHER VS. CLIMATE - Paul H. Carr
Short-term weather fluctuations should not blind us to what long-term climate trends are telling us. Other unexpected aspects of complex system dynamics are the Butterfly Effect and the descendent benefit of epidemics.
The real reason for climate change and what is driving the changes. No model is any better than its predictive results; the validation is therefore very easy to determine.
First lecture:
Climate Change and the New industrial revolution -
What we risk and how we should cast the economics and ethics
Speaker(s): Professor Lord Stern
Chair: Professor Lord Richard Layard
Recorded on 21 February 2012 in Old Theatre, Old Building
Forbes co2 and temperature presentation for earth day at cua april 22 2015 ... - Kevin Forbes
Extended Abstract
Introduction
While the vast majority of climate scientists have concluded that the changes in the climate over the past few decades can be attributed to human activity [Doran and Zimmerman, 2009], there has been a degree of reluctance to attribute specific weather events to elevated CO2 concentrations. For example, Coumou and Rahmstorf [2012] have noted that there has been an exceptionally high incidence of extreme weather events over the past decade and that some of the events can be linked to climate change but nevertheless concede that particular events “cannot be directly attributed to global warming.” Moreover, the World Meteorological Organization has noted that the incidence of extreme weather events matches IPCC projections, but qualifies this conclusion by stating that “it is impossible to say that an individual weather or climate event was “caused” by climate change….” [World Meteorological Organization, 2011, p 15]. This claim of “attribution impossibility” is not a minor shortcoming; it leaves the causes of extreme events open to question, allowing climate skeptics to attribute the increased incidence of extreme events to so-called “natural variability.” In the United States, this has undermined the political consensus necessary to adopt robust, cost-effective policies to reduce CO2 emissions.
This paper explores the relationship between CO2 and weather by addressing whether there is a causal relationship between the atmospheric concentration level of carbon dioxide and hourly temperature. The analysis begins by noting that traditional correlation analysis is not capable of addressing whether there is a causal relationship between CO2 and temperature because statistical methods alone cannot render results that establish or reject causality between two variables that are contemporaneously correlated. Nevertheless, it is possible to address the issue of causality by using more advanced statistical techniques.
An Approach to Establishing Causality
This paper addresses the issue of causality between CO2 and temperature by following the research of the Nobel Laureate Clive Granger [1969], who defined causality in terms of whether lagged values of a variable lead to more accurate predictions of some other variable. In his words, "The definition of causality … is based entirely on the predictability of some series, say Xt. If some other series Yt contains information in past terms that helps in the prediction of Xt … then Yt is said to cause Xt." [Granger, 1969, p 430]. This study embraces this view of causality by examining whether lagged values of CO2 lead to more accurate forecasts of temperature. The specific approach adopted here is to exploit the diurnal nature of the variation in the hourly CO2 concentration levels by using the CO2 concentration level in hour t – 24 as an explanatory variable. This variable has a 0.96 correlation with the CO2 level in hour t but i
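A minimal sketch of this Granger-style comparison, assuming hourly arrays temp and co2 (the variable names, the simple ordinary-least-squares fit, and the train/test split are illustrative assumptions, not the paper's actual specification): if adding CO2 in hour t - 24 lowers the out-of-sample forecast error for temperature, CO2 is said to Granger-cause temperature in this predictive sense.

import numpy as np

def forecast_rmse(y, X):
    # Fit ordinary least squares on the first half of the sample,
    # report root-mean-square forecast error on the second half.
    n = len(y) // 2
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1[:n], y[:n], rcond=None)
    resid = y[n:] - X1[n:] @ beta
    return np.sqrt(np.mean(resid ** 2))

def granger_style_test(temp, co2):
    # Predict temperature in hour t from temperature in hour t-24 (baseline),
    # then add CO2 in hour t-24 (augmented) and compare forecast errors.
    y = temp[24:]
    baseline = temp[:-24].reshape(-1, 1)
    augmented = np.column_stack([temp[:-24], co2[:-24]])
    return forecast_rmse(y, baseline), forecast_rmse(y, augmented)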
From our climate panel in Grand Junction on August 4:
Our Forest, Our Water, Our Land: Local Impacts on Climate Change. Sponsored by Conservation Colorado, Mesa County Library, Math & Science Center
Jason Thompson helped Dr. Oliver Hemmers communicate why climate models fail.
Biography
Dr. Oliver Hemmers received his Ph.D. in physics in 1993 from the Technical University in Berlin, Germany, with specialization in x-ray atomic and molecular spectroscopy. Recent research focuses on developments of biofuels and new materials for hydrogen fuel storage. He currently manages a multiyear, multimillion-dollar biodiesel project funded by the U.S. Department of Energy. Over the past 10 years, he has been a principal investigator or co-PI on several research projects at UNLV totaling more than $6 million. Hemmers has made approximately 200 presentations at national and international meetings, published approximately 90 research articles, written one book, and holds one patent. He is a member of the American Physical Society and a reviewer for the American Institute of Physics and the Institute of Physics.
Jason Thompson is an alternative energy photojournalist who wrote more than 300 articles for Diesel Power, which around 2010 was the #1-selling automotive magazine at Walmart. He now studies the visual framing of climate control from 1824 to the present.
Burntwood 2013 - Why climate models are the greatest feat of modern science, ... - IES / IAQM
The IES 2013 Burntwood Lecture given by Julia Slingo from the Met Office on the topic: Why Climate Models are the greatest feat of modern science. #BWL13
09-28-17 Lifelong Learning Lecture: Jim Haynes - Ellsworth1835
"Natural and Human Causes of Climate Change: What Scientists Know and How They Know It"
Presented by James M. Haynes, PhD, Interim Provost and Vice President for Academic Affairs, Professor of Environmental Science and Ecology
For eons, six slowly, often intermittently acting natural forces have changed the Earth's temperature within a range of +7 degrees Fahrenheit, leading to climate swings from ice ages to planet-wide tropical conditions. Now, a seventh, rapidly acting force is changing climate: modern human civilization. What evidence of climate change is observed today, and what is likely to happen to our children and generations beyond as a result of human activity in the recent past and today? What can we do to minimize the impacts of the changes to come?
This presentation was created and delivered by Omar Bellprat (IC3 Barcelona) in the intensive three-day course from the BC3, Basque Centre for Climate Change, and UPV/EHU (University of the Basque Country) on Climate Change in the Uda Ikastaroak Framework.
The objective of the BC3 Summer School is to offer an updated and multidisciplinary view of the ongoing trends in climate change research. The BC3 Summer School is organized in collaboration with the University of the Basque Country and is a high-quality summer course gathering leading experts in the field and students from top universities and research centres worldwide.
How to explain global warming: The question of Attribution - PazSilviapm
How to explain global warming?
The question of Attribution
You learned about the evidence that proves anthropogenic climate change is taking place. Now, let's talk about how we explain the phenomenon of global warming.
Previously, you viewed this figure from the IPCC’s assessment report, showing
various factors that contribute to climate change. The next slide will include
further detail about each forcing component.
This figure is also from the IPCC's assessment report. LOSU means 'level of scientific understanding'. In this figure, two different kinds of forcing components are shown: anthropogenic and natural forcings. It is important to remember that not only anthropogenic forcings but also natural forcings drive climate change. For example, the glacial/interglacial cycles we observed earlier this semester in the ice core samples, which recorded atmospheric conditions over the last 450,000 years, are clearly caused by natural forcings, as we, Homo sapiens, did not exist at that time!
In this figure, each radiative forcing is associated with a value (watts per square
meter) quantifying how much each forcing contributes to climate change. Some
forcings have a negative number (contribute to cooling), whereas others have a
positive number (contribute to warming). The total net forcing is currently a
positive value. Thus, the climate trend is currently warming.
IPCC report
As shown in the previous figure, natural forcing can change climate. The dominant energy source driving Earth's climate, the Sun, also varies its energy emission. This figure shows natural changes in solar irradiance from 1874 to 1988. Solar irradiance is the amount of energy per unit area received from the Sun. In recent decades, solar activity has been measured by satellites, while before that it was estimated using proxies. Without satellite observation, the energy differences were too small to detect.
Solar irradiance is higher during a period called "solar maximum", which recurs about every 11 years. During a solar maximum, interesting features that appear on the Sun's surface…
(continue)
Solar luminosity
Sunspot cycle (~11 year period,
~0.1% change in radiation
output)
(continued)
…are sunspots! Sunspots are relatively dark areas on the radiating surface of the
Sun, where intense magnetic activity inhibits convection and cools the
photosphere. Luminosity is the total amount of energy emitted by the Sun.
To summarize, more sunspots appear during a period of solar maximum, when the Sun presents more intense magnetic activity (and therefore higher luminosity).
Although solar irradiance has only recently been measured by satellite, sunspots have been observed for a very long time! The first telescopic recordings were made by Galileo Galilei in the 17th century, soon after the telescope was invented. In addition, there are well-documented historical records of solar activity captured by Chinese astronomers. All records combined confirm ...
Recently, the machine learning community has expressed strong interest in applying latent variable modeling strategies to causal inference problems with unobserved confounding. Here, I discuss one of the big debates that occurred over the past year, and how we can move forward. I will focus specifically on the failure of point identification in this setting, and discuss how this can be used to design flexible sensitivity analyses that cleanly separate identified and unidentified components of the causal model.
I will discuss paradigmatic statistical models of inference and learning from high dimensional data, such as sparse PCA and the perceptron neural network, in the sub-linear sparsity regime. In this limit the underlying hidden signal, i.e., the low-rank matrix in PCA or the neural network weights, has a number of non-zero components that scales sub-linearly with the total dimension of the vector. I will provide explicit low-dimensional variational formulas for the asymptotic mutual information between the signal and the data in suitable sparse limits. In the setting of support recovery these formulas imply sharp 0-1 phase transitions for the asymptotic minimum mean-square-error (or generalization error in the neural network setting). A similar phase transition was analyzed recently in the context of sparse high-dimensional linear regression by Reeves et al.
Many different measurement techniques are used to record neural activity in the brains of different organisms, including fMRI, EEG, MEG, lightsheet microscopy and direct recordings with electrodes. Each of these measurement modes has its advantages and disadvantages concerning the resolution of the data in space and time, the directness of measurement of the neural activity, and which organisms it can be applied to. For some of these modes and for some organisms, significant amounts of data are now available in large standardized open-source datasets. I will report on our efforts to apply causal discovery algorithms to, among others, fMRI data from the Human Connectome Project, and to lightsheet microscopy data from zebrafish larvae. In particular, I will focus on the challenges we have faced both in terms of the nature of the data and the computational features of the discovery algorithms, as well as the modeling of experimental interventions.
Bayesian Additive Regression Trees (BART) has been shown to be an effective framework for modeling nonlinear regression functions, with strong predictive performance in a variety of contexts. The BART prior over a regression function is defined by independent prior distributions on tree structure and leaf or end-node parameters. In observational data settings, Bayesian Causal Forests (BCF) has successfully adapted BART for estimating heterogeneous treatment effects, particularly in cases where standard methods yield biased estimates due to strong confounding.
We introduce BART with Targeted Smoothing, an extension which induces smoothness over a single covariate by replacing independent Gaussian leaf priors with smooth functions. We then introduce a new version of the Bayesian Causal Forest prior, which incorporates targeted smoothing for modeling heterogeneous treatment effects which vary smoothly over a target covariate. We demonstrate the utility of this approach by applying our model to a timely women's health and policy problem: comparing two dosing regimens for an early medical abortion protocol, where the outcome of interest is the probability of a successful early medical abortion procedure at varying gestational ages, conditional on patient covariates. We discuss the benefits of this approach in other women’s health and obstetrics modeling problems where gestational age is a typical covariate.
Difference-in-differences is a widely used evaluation strategy that draws causal inference from observational panel data. Its causal identification relies on the assumption of parallel trends, which is scale-dependent and may be questionable in some applications. A common alternative is a regression model that adjusts for the lagged dependent variable, which rests on the assumption of ignorability conditional on past outcomes. In the context of linear models, Angrist and Pischke (2009) show that the difference-in-differences and lagged-dependent-variable regression estimates have a bracketing relationship. Namely, for a true positive effect, if ignorability is correct, then mistakenly assuming parallel trends will overestimate the effect; in contrast, if the parallel trends assumption is correct, then mistakenly assuming ignorability will underestimate the effect. We show that the same bracketing relationship holds in general nonparametric (model-free) settings. We also extend the result to semiparametric estimation based on inverse probability weighting.
We develop sensitivity analyses for weak nulls in matched observational studies while allowing unit-level treatment effects to vary. In contrast to randomized experiments and paired observational studies, we show for general matched designs that over a large class of test statistics, any valid sensitivity analysis for the weak null must be unnecessarily conservative if Fisher's sharp null of no treatment effect for any individual also holds. We present a sensitivity analysis valid for the weak null, and illustrate why it is conservative if the sharp null holds through connections to inverse probability weighted estimators. An alternative procedure is presented that is asymptotically sharp if treatment effects are constant, and is valid for the weak null under additional assumptions which may be deemed reasonable by practitioners. The methods may be applied to matched observational studies constructed using any optimal without-replacement matching algorithm, allowing practitioners to assess robustness to hidden bias while allowing for treatment effect heterogeneity.
The world of health care is full of policy interventions: a state expands eligibility rules for its Medicaid program, a medical society changes its recommendations for screening frequency, a hospital implements a new care coordination program. After a policy change, we often want to know, “Did it work?” This is a causal question; we want to know whether the policy CAUSED outcomes to change. One popular way of estimating causal effects of policy interventions is a difference-in-differences study. In this controlled pre-post design, we measure the change in outcomes of people who are exposed to the new policy, comparing average outcomes before and after the policy is implemented. We contrast that change to the change over the same time period in people who were not exposed to the new policy. The differential change in the treated group’s outcomes, compared to the change in the comparison group’s outcomes, may be interpreted as the causal effect of the policy. To do so, we must assume that the comparison group’s outcome change is a good proxy for the treated group’s (counterfactual) outcome change in the absence of the policy. This conceptual simplicity and wide applicability in policy settings makes difference-in-differences an appealing study design. However, the apparent simplicity belies a thicket of conceptual, causal, and statistical complexity. In this talk, I will introduce the fundamentals of difference-in-differences studies and discuss recent innovations including key assumptions and ways to assess their plausibility, estimation, inference, and robustness checks.
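As a toy illustration of the two-by-two difference-in-differences logic just described (the group means below are hypothetical numbers, not results from any study in this talk):

# Hypothetical group means of an outcome, before and after a policy change.
treated_before, treated_after = 10.0, 14.0   # group exposed to the new policy
control_before, control_after = 9.0, 11.0    # group never exposed

# Change in each group over the same time period.
treated_change = treated_after - treated_before   # 4.0
control_change = control_after - control_before   # 2.0

# Under the parallel-trends assumption, the control group's change proxies the
# treated group's counterfactual change, so the difference is the causal effect.
did_estimate = treated_change - control_change    # 2.0
print(did_estimate)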
We present recent advances and statistical developments for evaluating Dynamic Treatment Regimes (DTR), which allow the treatment to be dynamically tailored according to evolving subject-level data. Identification of an optimal DTR is a key component for precision medicine and personalized health care. Specific topics covered in this talk include several recent projects with robust and flexible methods developed for the above research area. We will first introduce a dynamic statistical learning method, adaptive contrast weighted learning (ACWL), which combines doubly robust semiparametric regression estimators with flexible machine learning methods. We will further develop a tree-based reinforcement learning (T-RL) method, which builds an unsupervised decision tree that maintains the nature of batch-mode reinforcement learning. Unlike ACWL, T-RL handles the optimization problem with multiple treatment comparisons directly through a purity measure constructed with augmented inverse probability weighted estimators. T-RL is robust, efficient and easy to interpret for the identification of optimal DTRs. However, ACWL seems more robust against tree-type misspecification than T-RL when the true optimal DTR is non-tree-type. At the end of this talk, we will also present a new Stochastic-Tree Search method called ST-RL for evaluating optimal DTRs.
A fundamental feature of evaluating causal health effects of air quality regulations is that air pollution moves through space, rendering health outcomes at a particular population location dependent upon regulatory actions taken at multiple, possibly distant, pollution sources. Motivated by studies of the public-health impacts of power plant regulations in the U.S., this talk introduces the novel setting of bipartite causal inference with interference, which arises when 1) treatments are defined on observational units that are distinct from those at which outcomes are measured and 2) there is interference between units in the sense that outcomes for some units depend on the treatments assigned to many other units. Interference in this setting arises due to complex exposure patterns dictated by physical-chemical atmospheric processes of pollution transport, with intervention effects framed as propagating across a bipartite network of power plants and residential zip codes. New causal estimands are introduced for the bipartite setting, along with an estimation approach based on generalized propensity scores for treatments on a network. The new methods are deployed to estimate how emission-reduction technologies implemented at coal-fired power plants causally affect health outcomes among Medicare beneficiaries in the U.S.
Laine Thomas presented information about how causal inference is being used to determine the costs and benefits of the two most common surgical treatments for women - hysterectomy and myomectomy.
We provide an overview of some recent developments in machine learning tools for dynamic treatment regime discovery in precision medicine. The first development is a new off-policy reinforcement learning tool for continual learning in mobile health to enable patients with type 1 diabetes to exercise safely. The second development is a new inverse reinforcement learning tool which enables the use of observational data to learn how clinicians balance competing priorities for treating depression and mania in patients with bipolar disorder. Both practical and technical challenges are discussed.
The method of differences-in-differences (DID) is widely used to estimate causal effects. The primary advantage of DID is that it can account for time-invariant bias from unobserved confounders. However, the standard DID estimator will be biased if there is an interaction between history in the after period and the groups. That is, bias will be present if an event besides the treatment occurs at the same time and affects the treated group in a differential fashion. We present a method of bounds based on DID that accounts for an unmeasured confounder that has a differential effect in the post-treatment time period. These DID bracketing bounds are simple to implement and only require partitioning the controls into two separate groups. We also develop two key extensions for DID bracketing bounds. First, we develop a new falsification test to probe the key assumption that is necessary for the bounds estimator to provide consistent estimates of the treatment effect. Next, we develop a method of sensitivity analysis that adjusts the bounds for possible bias based on differences between the treated and control units from the pretreatment period. We apply these DID bracketing bounds and the new methods we develop to an application on the effect of voter identification laws on turnout. Specifically, we focus on estimating whether the enactment of voter identification laws in Georgia and Indiana had an effect on voter turnout.
We study experimental design in large-scale stochastic systems with substantial uncertainty and structured cross-unit interference. We consider the problem of a platform that seeks to optimize supply-side payments p in a centralized marketplace where different suppliers interact via their effects on the overall supply-demand equilibrium, and propose a class of local experimentation schemes that can be used to optimize these payments without perturbing the overall market equilibrium. We show that, as the system size grows, our scheme can estimate the gradient of the platform’s utility with respect to p while perturbing the overall market equilibrium by only a vanishingly small amount. We can then use these gradient estimates to optimize p via any stochastic first-order optimization method. These results stem from the insight that, while the system involves a large number of interacting units, any interference can only be channeled through a small number of key statistics, and this structure allows us to accurately predict feedback effects that arise from global system changes using only information collected while remaining in equilibrium.
We discuss a general roadmap for generating causal inference based on observational studies used to generate real-world evidence. We review targeted minimum loss estimation (TMLE), which provides a general template for the construction of asymptotically efficient plug-in estimators of a target estimand for realistic (i.e., infinite-dimensional) statistical models. TMLE is a two-stage procedure that first involves using ensemble machine learning, termed super-learning, to estimate the relevant stochastic relations between the treatment, censoring, covariates and outcome of interest. The super-learner allows one to fully utilize all the advances in machine learning (in addition to more conventional parametric model-based estimators) to build a single most powerful ensemble machine learning algorithm. We present the Highly Adaptive Lasso as an important machine learning algorithm to include.
In the second step, the TMLE involves maximizing a parametric likelihood along a so-called least favorable parametric model through the super-learner fit of the relevant stochastic relations in the observed data. This second step bridges the state of the art in machine learning to estimators of target estimands for which statistical inference is available (i.e., confidence intervals, p-values, etc.). We also review recent advances in collaborative TMLE in which the fit of the treatment and censoring mechanism is tailored w.r.t. performance of TMLE. We also discuss asymptotically valid bootstrap-based inference. Simulations and data analyses are provided as demonstrations.
We describe different approaches for specifying models and prior distributions for estimating heterogeneous treatment effects using Bayesian nonparametric models. We make an affirmative case for direct, informative (or partially informative) prior distributions on heterogeneous treatment effects, especially when treatment effect size and treatment effect variation are small relative to other sources of variability. We also consider how to provide scientifically meaningful summaries of complicated, high-dimensional posterior distributions over heterogeneous treatment effects with appropriate measures of uncertainty.
Climate change mitigation has traditionally been analyzed as some version of a public goods game (PGG) in which a group is most successful if everybody contributes, but players are best off individually by not contributing anything (i.e., “free-riding”)—thereby creating a social dilemma. Analysis of climate change using the PGG and its variants has helped explain why global cooperation on GHG reductions is so difficult, as nations have an incentive to free-ride on the reductions of others. Rather than inspire collective action, it seems that the lack of progress in addressing the climate crisis is driving the search for a “quick fix” technological solution that circumvents the need for cooperation.
This seminar discussed ways in which to produce professional academic writing, from academic papers to research proposals or technical writing in general.
Machine learning (including deep and reinforcement learning) and blockchain are two of the most notable technologies in recent years. The first is the foundation of artificial intelligence and big data, and the second has significantly disrupted the financial industry. Both technologies are data-driven, and thus there is rapidly growing interest in integrating them for more secure and efficient data sharing and analysis. In this paper, we review the research on combining blockchain and machine learning technologies and demonstrate that they can collaborate efficiently and effectively. In the end, we point out some future directions and expect more research on deeper integration of the two promising technologies.
In this talk, we discuss QuTrack, a Blockchain-based approach to track experiment and model changes primarily for AI and ML models. In addition, we discuss how change analytics can be used for process improvement and to enhance the model development and deployment processes.
THE IMPORTANCE OF MARTIAN ATMOSPHERE SAMPLE RETURN - Sérgio Sacani
The return of a sample of near-surface atmosphere from Mars would facilitate answers to several first-order science questions surrounding the formation and evolution of the planet. One of the important aspects of terrestrial planet formation in general is the role that primary atmospheres played in influencing the chemistry and structure of the planets and their antecedents. Studies of the martian atmosphere can be used to investigate the role of a primary atmosphere in its history. Atmosphere samples would also inform our understanding of the near-surface chemistry of the planet, and ultimately the prospects for life. High-precision isotopic analyses of constituent gases are needed to address these questions, requiring that the analyses are made on returned samples rather than in situ.
This presentation briefly explores the structural and functional attributes of nucleotides and the structure and function of genetic materials, along with the impact of UV rays and pH upon them.
Brief information about the SCOP protein database used in bioinformatics.
The Structural Classification of Proteins (SCOP) database is a comprehensive and authoritative resource for the structural and evolutionary relationships of proteins. It provides a detailed and curated classification of protein structures, grouping them into families, superfamilies, and folds based on their structural and sequence similarities.
Comparing Evolved Extractive Text Summary Scores of Bidirectional Encoder Rep... - University of Maribor
Slides from:
11th International Conference on Electrical, Electronics and Computer Engineering (IcETRAN), Niš, 3-6 June 2024
Track: Artificial Intelligence
https://www.etran.rs/2024/en/home-english/
Deep Behavioral Phenotyping in Systems Neuroscience for Functional Atlasing a... - Ana Luísa Pinho
Functional Magnetic Resonance Imaging (fMRI) provides means to characterize brain activations in response to behavior. However, cognitive neuroscience has been limited to group-level effects referring to the performance of specific tasks. To obtain the functional profile of elementary cognitive mechanisms, the combination of brain responses to many tasks is required. Yet, to date, both structural atlases and parcellation-based activations do not fully account for cognitive function and still present several limitations. Further, they do not adapt overall to individual characteristics. In this talk, I will give an account of deep-behavioral phenotyping strategies, namely data-driven methods in large task-fMRI datasets, to optimize functional brain-data collection and improve inference of effects-of-interest related to mental processes. Key to this approach is the employment of fast multi-functional paradigms rich in features that can be well parametrized and, consequently, facilitate the creation of psycho-physiological constructs to be modelled with imaging data. Particular emphasis will be given to music stimuli when studying high-order cognitive mechanisms, due to their ecological nature and quality to enable complex behavior compounded by discrete entities. I will also discuss how deep-behavioral phenotyping and individualized models applied to neuroimaging data can better account for the subject-specific organization of domain-general cognitive systems in the human brain. Finally, the accumulation of functional brain signatures brings the possibility to clarify relationships among tasks and create a univocal link between brain systems and mental functions through: (1) the development of ontologies proposing an organization of cognitive processes; and (2) brain-network taxonomies describing functional specialization. To this end, tools to improve commensurability in cognitive science are necessary, such as public repositories, ontology-based platforms and automated meta-analysis tools. I will thus discuss some brain-atlasing resources currently under development, and their applicability in cognitive as well as clinical neuroscience.
What are greenhouse gasses and how many gasses affect the Earth - moosaasad1975
What are greenhouse gasses, how do they affect the earth and its environment, what is the future of the environment and the earth, and how are the weather and the climate affected?
Richard's entangled adventures in wonderland - Richard Gill
Since the loophole-free Bell experiments of 2020 and the Nobel prizes in physics of 2022, critics of Bell's work have retreated to the fortress of super-determinism. Now, super-determinism is a derogatory word - it just means "determinism". Palmer, Hance and Hossenfelder argue that quantum mechanics and determinism are not incompatible, using a sophisticated mathematical construction based on a subtle thinning of allowed states and measurements in quantum mechanics, such that what is left appears to make Bell's argument fail, without altering the empirical predictions of quantum mechanics. I think however that it is a smoke screen, and the slogan "lost in math" comes to my mind. I will discuss some other recent disproofs of Bell's theorem using the language of causality based on causal graphs. Causal thinking is also central to law and justice. I will mention surprising connections to my work on serial killer nurse cases, in particular the Dutch case of Lucia de Berk and the current UK case of Lucy Letby.
Seminar on U.V. Spectroscopy - SAMIR PANDA
Spectroscopy is a branch of science dealing with the study of the interaction of electromagnetic radiation with matter.
Ultraviolet-visible spectroscopy refers to absorption spectroscopy or reflectance spectroscopy in the UV-VIS spectral region.
Ultraviolet-visible spectroscopy is an analytical method that measures the amount of light absorbed by the analyte.
Nutraceutical market, scope and growth: Herbal drug technology - Lokesh Patil
As consumer awareness of health and wellness rises, the nutraceutical market - which includes goods like functional foods, drinks, and dietary supplements that provide health advantages beyond basic nutrition - is growing significantly. As healthcare expenses rise, the population ages, and people increasingly want natural and preventative health solutions, this industry is growing quickly. Product formulation innovations and the use of cutting-edge technology for customized nutrition are further driving market expansion. With its worldwide reach, the nutraceutical industry is expected to keep growing and provide significant opportunities for research and investment in a number of categories, including vitamins, minerals, probiotics, and herbal supplements.
CLIM Undergraduate Workshop: Applications in Climate Context - Michael Wehner, Oct 23, 2017
1. What do climate statisticians do?
SAMSI
October 23, 2017
Michael F. Wehner
Lawrence Berkeley National Laboratory
mfwehner@lbl.gov
2. US DOE Policy 411.2A
SUBJECT: SCIENTIFIC INTEGRITY
When expressing opinions on policy matters to the public and media,
research personnel must make it clear when they are expressing their
personal views, rather than those of the Department, the U.S.
Government, or their respective institutions. Public representation of
Government or DOE positions or policies must be cleared through
their program management to include DOE headquarters.
3. Climate is what you expect
weather is what you get!
Ed Lorenz
or perhaps Robert Heinlein…
7. We have known about the greenhouse effect for more than 150 years.
It is steam engine science.
Tyndall measured the radiative absorptive properties of many gases.
"The atmosphere admits of the entrance of the solar heat, but checks its exit; and the result is a tendency to accumulate heat at the surface of the planet."
-- John Tyndall, 1859
This is not rocket science.
O2, N2 | H2O, CO2, N2O | CH4
Low | Medium | High
8. “Doubling of CO2 would raise surface
temperature by 5-6 °C, or 9-11 °F, above
pre-industrial temperatures.”
-- Svante Arrhenius, 1896
Quantum mechanics
We now call the climate system’s response to doubling CO2
“The equilibrium climate sensitivity”.
1896: 5-6 °C (Arrhenius)
2013: 2-6 °C (Intergovernmental Panel on Climate Change)
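As a rough, illustrative aside (not from the slides): the widely used logarithmic approximation for CO2 radiative forcing, together with an assumed sensitivity value, shows how "warming per doubling" translates into warming at a given concentration. The 5.35 coefficient and the 3.0 °C sensitivity below are textbook-style assumptions, not numbers quoted by the speaker.

import math

def co2_forcing(c_ppm, c0_ppm=280.0):
    # Simplified logarithmic forcing approximation for CO2, in W/m^2.
    return 5.35 * math.log(c_ppm / c0_ppm)

ecs = 3.0                      # assumed equilibrium climate sensitivity, °C per CO2 doubling
f_doubling = co2_forcing(560)  # forcing from doubling pre-industrial CO2 (~3.7 W/m^2)
f_today = co2_forcing(400)     # forcing at the 400 ppm mentioned on the next slide
warming_implied = ecs * f_today / f_doubling  # equilibrium warming implied at 400 ppm
print(f_doubling, f_today, warming_implied)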
9. Humans are changing the atmospheric composition
400ppm: CO2 in the atmosphere higher than in the last 600,000 years.
Charles Keeling: Mauna Loa
Ocean is becoming more acidic
10. No fooling:
Warming is "unequivocal"
Global mean surface air temperature is rapidly increasing.
Is the human change to the composition of the atmosphere responsible?
11. Not all pollutants have the same “forcing” effect.
[Chart: pollutants compared by their ability to absorb radiation and by the amount emitted: black carbon (soot), carbon dioxide, nitrous oxide, methane, and sulfate aerosols (acid rain). Greenhouse gases absorb sunlight energy; some particles absorb sunlight energy, while others reflect it. Credit: I. Ocko, EDF]
12. Detection and Attribution (D&A for short):
• Detection: Identify statistically significant trends in (usually) sparse
observational records.
– In situ data. Sometimes long temporal records, never enough spatial
coverage
– Satellite data: Complete spatial coverage, but limited in time. Records
start in 1979 or later.
• Attribution: Quantify the human contribution, if any, to observed climate
changes.
– Separate forced changes (signals) from natural variations (noise)
– Quantify the effects of different forcing agents.
– Natural: Solar variations, volcanoes
– Anthropogenic:
• Well mixed greenhouse gases (CO2, CH4, N2O, etc.)
• Sulfate and other short lived aerosols
• Ozone
• Land use changes.
What do climate statisticians do?
13. • Observe a trend.
• Do an experiment.
• Run some climate models with and without human forcing agents
• Use simple linear regression statistical models to describe the observations
X = b*Y + x
• X=observations
• Y=climate model
• x=noise
• b=scaling factor
• Does the uncertainty range of b include zero?
• If yes, then we do not attribute the observed change to human activities.
• If no, then we do attribute the observed change to human activities!
Detection and attribution
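A minimal sketch of the scaling-factor regression on this slide, assuming we already have an observed series and a model-simulated response pattern (the function below and the simple residual bootstrap for the uncertainty range of b are our illustrative choices, not the operational detection-and-attribution machinery):

import numpy as np

def scaling_factor(obs, model, n_boot=2000, seed=0):
    # Estimate b in obs = b * model + noise, with a bootstrap 95% interval for b.
    rng = np.random.default_rng(seed)

    def fit(y, x):
        return np.sum(x * y) / np.sum(x * x)   # least squares through the origin

    b_hat = fit(obs, model)
    resid = obs - b_hat * model
    boots = [fit(b_hat * model + rng.choice(resid, size=len(resid), replace=True), model)
             for _ in range(n_boot)]
    lo, hi = np.percentile(boots, [2.5, 97.5])
    # If the interval (lo, hi) excludes zero, the observed change is attributed to the forcing.
    return b_hat, (lo, hi)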
14. “It is extremely likely that more than half of the global mean temperature
increase since 1951 was caused by human influence on climate (high
confidence). The likely contributions of natural forcing and internal variability
to global temperature change over that period are minor (high confidence).”
--Chapter 3, Key Finding #1
The most definitive attribution statement in a US NCA
15. “The likely range of the human contribution to the global mean temperature
increase over the period 1951–2010 is 1.1° to 1.4°F (0.6° to 0.8°C), and the
central estimate of the observed warming of 1.2°F (0.65°C) lies within this
range (high confidence). This translates to a likely human contribution of
93%–123% of the observed 1951–2010 change.”
The most definitive attribution statement in a US NCA
16. Extreme weather
• The tail of the distribution of all weather.
• But…
What else do climate statisticians do?
17. Categorizing an event as “extreme” is a somewhat arbitrary procedure.
• What is extreme at one space and time may be typical at another.
• Extremes are at the tails of the distribution. How is “tail” defined?
• Does extreme mean “rare” or simply high impact?
What else do climate statisticians do?
• Statisticians have given us an “asymptotic”
theory to describe the tails of distributions.
• The distribution of extremes.
• The distribution of the tail.
• Huh?
• Peaks over a high threshold.
• Block maxima.
18. • Extract the annual maximum temperature from the entire daily time series.
• This subset is a “block maxima” sample.
• Under the right conditions, the underlying distribution may be described by a
three parameter function known as the “Generalized Extreme Value” (GEV)
distribution.
• The conditions:
– Stationary.
– In the “asymptotic regime”
– i.i.d.
– Other things I don’t worry about.
Annual maximum daily temperatures
F(x) = exp(-[1 - k(x - ξ)/α]^(1/k)),  k ≠ 0
F(x) = exp(-exp(-(x - ξ)/α)),         k = 0
ξ = location
α = scale
k = shape
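As a quick illustration of fitting a GEV to a block-maxima sample (not code from the talk; the sample values are made up, and scipy's genextreme has its own sign convention for the shape parameter, exactly the caveat flagged on the following slide):

import numpy as np
from scipy.stats import genextreme

# Hypothetical block-maxima sample: annual maximum daily temperatures (kelvins).
annual_max = np.array([305.2, 307.1, 304.8, 309.3, 306.0, 308.4,
                       305.9, 310.1, 307.6, 306.8, 309.0, 304.5])

# Maximum-likelihood fit of the three GEV parameters (shape, location, scale).
shape, loc, scale = genextreme.fit(annual_max)
print(shape, loc, scale)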
19. If shape parameter is negative*, the distribution is bounded.
If shape parameter is positive*, the distribution is unbounded.
Properties of the GEV
By R D Gill - Created by R D Gill, 4 January 2013, using R script, CC BY-SA 3.0, https://commons.wikimedia.org/w/index.php?curid=45005894
* Beware conventions on the sign of the shape parameter
GEV Probability density function (pdf)
20. GEV Return Value
If you can fit the GEV to the extreme sample, you can then calculate physically useful properties.
The return value of a random variable, X_T, is that value which is exceeded, on average, once in a period of time, T.
X_T = ξ + (α/k)[1 - {ln(1/(1 - 1/T))}^k],  k ≠ 0
X_T = ξ - α ln(ln(1/(1 - 1/T))),           k = 0
[Figure: return value (kelvins, roughly 296 to 316 K) versus return time (1 to 100 years)]
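Continuing the hypothetical scipy sketch from the block-maxima slide, the T-year return value can be read off a fitted GEV as its (1 - 1/T) quantile (the function name is ours, and the sign convention of the fitting library must match the formula above):

from scipy.stats import genextreme

def gev_return_value(T, shape, loc, scale):
    # Value exceeded on average once every T years: the (1 - 1/T) quantile of the fitted GEV.
    return genextreme.ppf(1 - 1/T, shape, loc=loc, scale=scale)

# Example with the parameters fitted in the earlier sketch:
# twenty_year = gev_return_value(20, shape, loc, scale)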
23. Uncertainty quantification.
• There are many sources of uncertainty.
1. Different models have different sensitivities to external forcings.
– We have a rather broad range for an estimate of the true “climate sensitivity”.
2. We don’t know how much more CO2 society will permit to be emitted.
3. We don’t know the precise state of the climate system due to observational
constraints.
– The climate system is chaotic.
Another important thing that climate statisticians do:
Hawkins and Sutton BAMS
24. It is firmly established that global warming is real and that
humans are the cause.
• But there are many details that we can do better.
• Advances in high performance computing are improving simulations of extreme
storms, including hurricanes (movie).
• Statistical description of extreme precipitation usually leads to unbounded GEV
distributions. Why?
– Sample size limitations? Probably no.
– Mixing distributions across storm types? Probably yes. Violate i.i.d.
– Not in the asymptotic regime? Yes for block maxima in the deserts.
• Non-stationarity. Duh, it's climate change!
– Covariate methods are a powerful way to incorporate known physical
behavior into the extreme value statistics. An active research area.
• Machine learning. Just scratching the surface.
– Berkeley Lab is using convolutional neural nets as a storm tracking tool.
Current topics