This study analyzes the temperature history of 24 American cities going back to 1895. Using a LOESS model, it forecasts prospective temperature increases over the next 40 years and out to 2100, and it compares the 2100 forecast with the NOAA models. This comparison uncovers serious deficiencies in the NOAA models: they do not fit the historical data well, and their forecasts barely differentiate between cities.
It is widely promulgated and believed that human-caused global warming comes with increases in both the
intensity and frequency of extreme weather events. A survey of official weather sites and the scientific literature
provides strong evidence that the first half of the 20th century had more extreme weather than the second half,
when anthropogenic global warming is claimed to have been mainly responsible for observed climate change. The
disconnect between real-world historical data on the 100-year time scale and the current predictions poses a
real conundrum when any engineer tries to make a professional assessment of the future value of an
infrastructure project that aims to mitigate or adapt to climate change. What is the appropriate basis on which to
make judgements when theory and data are in such disagreement?
"Climate Change 2007: The Physical Science Basis" assesses the current
scientific knowledge of the natural and human drivers of climate change,
observed changes in climate, the ability of science to attribute changes
to different causes, and projections for future climate change.
The report was produced by some 600 authors from 40 countries. Over 620
expert reviewers and a large number of government reviewers also
participated. Representatives from 113 governments reviewed and revised
the Summary line-by-line before adopting it and accepting the underlying report.
How to explain global warming? The question of Attribution (PazSilviapm)
You have learned about the evidence that anthropogenic climate change is
taking place. Now, let's talk about how we explain the phenomenon of global
warming.
Previously, you viewed this figure from the IPCC's assessment report, showing
various factors that contribute to climate change. The next slide will include
further detail about each forcing component.
This figure is also from the IPCC's assessment report. LOSU means 'level of
scientific understanding'. In this figure, two types of forcing components are
shown: anthropogenic and natural forcings. It is important to remember that
not only anthropogenic forcings but also natural forcings drive climate change. For
example, the glacial/interglacial cycles we observed earlier this semester in the
ice core samples, which recorded atmospheric conditions over the last 450,000
years, were clearly caused by natural forcings, as we, Homo sapiens, did not exist
at that time!
In this figure, each radiative forcing is associated with a value (watts per square
meter) quantifying how much each forcing contributes to climate change. Some
forcings have a negative number (contribute to cooling), whereas others have a
positive number (contribute to warming). The total net forcing is currently a
positive value. Thus, the climate trend is currently warming.
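As a quick illustration of how the net forcing in this figure is obtained, the individual components can simply be summed. The numbers below are approximate central values read off the AR4 figure and are meant only as an illustration, not as the exact assessed estimates:

```python
# Illustrative sum of radiative forcing components (W/m^2).
# Values are approximate central estimates from the IPCC AR4 figure;
# they stand in for reading the exact bars off the chart.
forcings = {
    "CO2": 1.66,
    "CH4": 0.48,
    "N2O": 0.16,
    "halocarbons": 0.34,
    "tropospheric ozone": 0.35,
    "stratospheric ozone": -0.05,
    "aerosols (direct effect)": -0.50,
    "cloud albedo effect": -0.70,
    "land-use surface albedo": -0.20,
    "black carbon on snow": 0.10,
    "solar irradiance": 0.12,
}

net = sum(forcings.values())
print(f"net forcing: {net:+.2f} W/m^2 ({'warming' if net > 0 else 'cooling'})")
```

The positive total is what the text summarizes as a currently warming climate trend.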
IPCC report
As shown in the previous figure, natural forcings can change the climate. The
dominant energy source driving Earth's climate, the Sun, also varies its
energy emission. This figure shows natural changes in solar irradiance from
1874 to 1988. Solar irradiance is the amount of energy per unit area received
from the Sun. In recent decades, solar activity has been measured by satellites;
before that, it was estimated using proxy variables, because without satellite
observation the energy differences were too small to detect.
Solar irradiance is higher during a period called "solar maximum", which
occurs roughly every 11 years. During a solar maximum, interesting features
appear on the Sun's surface…
Solar luminosity
Sunspot cycle (~11-year period, ~0.1% change in radiation output)
…are sunspots! Sunspots are relatively dark areas on the radiating surface of the
Sun, where intense magnetic activity inhibits convection and cools the
photosphere. Luminosity is the total amount of energy emitted by the Sun.
To summarize, more sunspots appear during a period of solar maximum, when the
Sun exhibits more intense magnetic activity (and therefore higher overall luminosity).
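The numbers on the slide (~11-year period, ~0.1% change in radiation output) can be put together in a toy model. The baseline irradiance value and the sinusoidal cycle shape are simplifying assumptions, not measured behavior; only the period and the ~0.1% swing come from the slide:

```python
# Toy model of the ~11-year sunspot cycle's effect on total solar
# irradiance (TSI). The baseline and sinusoidal shape are simplifying
# assumptions; only the ~0.1% swing and 11-year period come from the slide.
import numpy as np

TSI_MEAN = 1361.0               # W/m^2, approximate satellite-era average
AMPLITUDE = TSI_MEAN * 0.0005   # half of a ~0.1% peak-to-trough swing
PERIOD = 11.0                   # years

def tsi(year, max_year=2014.0):
    """Toy irradiance for a given year; max_year marks an assumed solar maximum."""
    return TSI_MEAN + AMPLITUDE * np.cos(2 * np.pi * (year - max_year) / PERIOD)

years = np.arange(1874, 1989)   # the period shown in the figure
swing = tsi(years).max() - tsi(years).min()
print(f"peak-to-trough swing: {swing:.2f} W/m^2 ({swing / TSI_MEAN:.2%})")
```

A swing on the order of 1 W/m² out of roughly 1361 W/m² is why these differences were too small to detect before satellite observation.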
Although solar irradiance has only recently been measured by satellite, sunspots
have been observed for a very long time! Some of the first telescopic recordings
were made by Galileo Galilei in the 17th century, soon after the telescope was
invented. In addition, there are well-documented historical records of solar
activity kept by Chinese astronomers. All records combined confirm ...
Climate Change Effects on Dengue Fever and Chagas' Disease (Abigail Lukowicz)
Undergraduate capstone project for the class Ecology of Infectious Diseases. This research highlights potential effects of climate change on the Dengue Fever vector (Aedes aegypti) and the Chagas' disease vector (Triatomine spp.). Collaboration with Michael Andreone and Daniel Pastika.
IPCC 2013 report on Climate Change - The Physical Basis (GreenFacts)
"Climate Change 2013: The Physical Science Basis" is a comprehensive assessment of the physical aspects of climate change, which focuses on the elements that are relevant to understanding past, documenting current, and projecting future climate change.
The report covers observations of changes in all components of the climate system and assesses the current knowledge of various processes of the climate system.
Direct global-scale instrumental observation of the climate began in the middle of the 19th century, and reconstruction of the climate using proxies such as tree rings or the content of sediment layers extends the record much further in the past.
The present assessment uses a new set of scenarios to explore the future impacts of climate change under a range of possible emission pathways.
This is the slideshow that I use for climate change extension work. I am currently involved in the National Drought Pilot Program, giving an overview of climate, climate change, and related agronomic decisions. There is a lot I discuss that isn't in the slides, but these highlight my main points, which end at the "what have we learnt" slide.
CSCR Agriculture Track w/ Larry Klotz: Weather or Not - Effects of Changing W... (Sustainable Tompkins)
Climate Smart & Climate Ready Conference Agriculture Track on April 19, 2013 at NYS Grange in Cortland, NY. Prof. Larry Klotz, SUNY Cortland. Weather or Not: Effects of Changing Weather on Local Agriculture. What is climate change? What are regional implications?
Global warming & climate changes
Global temperature measurements remote from human habitation and activity show no evidence of a warming during the last century. Such sites include "proxy" measurements such as tree rings, marine sediments and ice cores, weather balloons and satellite measurements in the lower atmosphere, and many surface sites where human influence is minimal.
Climate Change 2014 Synthesis Report
Summary for Policymakers
Introduction
This Synthesis Report is based on the reports of the three Working Groups of the Intergovernmental Panel on Climate Change
(IPCC), including relevant Special Reports. It provides an integrated view of climate change as the final part of the IPCC’s
Fifth Assessment Report (AR5).
This summary follows the structure of the longer report which addresses the following topics: Observed changes and their
causes; Future climate change, risks and impacts; Future pathways for adaptation, mitigation and sustainable development;
Adaptation and mitigation.
In the Synthesis Report, the certainty in key assessment findings is communicated as in the Working Group Reports and
Special Reports. It is based on the author teams’ evaluations of underlying scientific understanding and is expressed as a
qualitative level of confidence (from very low to very high) and, when possible, probabilistically with a quantified likelihood
(from exceptionally unlikely to virtually certain)1. Where appropriate, findings are also formulated as statements of fact without using uncertainty qualifiers.
This report includes information relevant to Article 2 of the United Nations Framework Convention on Climate Change
(UNFCCC).
SPM 1. Observed Changes and their Causes
Human influence on the climate system is clear, and recent anthropogenic emissions of green-
house gases are the highest in history. Recent climate changes have had widespread impacts
on human and natural systems. {1}
SPM 1.1 Observed changes in the climate system
Warming of the climate system is unequivocal, and since the 1950s, many of the observed
changes are unprecedented over decades to millennia. The atmosphere and ocean have
warmed, the amounts of snow and ice have diminished, and sea level has risen. {1.1}
Each of the last three decades has been successively warmer at the Earth’s surface than any preceding decade since 1850. The
period from 1983 to 2012 was likely the warmest 30-year period of the last 1400 years in the Northern Hemisphere, where
such assessment is possible (medium confidence). The globally averaged combined land and ocean surface temperature
data as calculated by a linear trend show a warming of 0.85 [0.65 to 1.06] °C over the period 1880 to 2012, when multiple
independently produced datasets exist (Figure SPM.1a). {1.1.1, Figure 1.1}
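The phrase "as calculated by a linear trend" corresponds to an ordinary least-squares slope fitted to the yearly series. A minimal sketch on synthetic data, standing in for the real observational datasets (which are not reproduced here):

```python
# OLS linear trend over a synthetic 1880-2012 anomaly series that is
# built to contain roughly 0.85 C of warming plus interannual noise.
import numpy as np

rng = np.random.default_rng(1)
years = np.arange(1880, 2013)
anomaly = -0.4 + 0.85 * (years - 1880) / 132.0 + rng.normal(0, 0.1, years.size)

slope, intercept = np.polyfit(years, anomaly, 1)   # C per year, offset
total_warming = slope * (2012 - 1880)
print(f"fitted warming, 1880-2012: {total_warming:.2f} C")
```

Fitting the same way to a short sub-window gives slopes that swing widely with the chosen start and end dates, which is exactly the sensitivity to short records that the report goes on to note.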
In addition to robust multi-decadal warming, the globally averaged surface temperature exhibits substantial decadal and
interannual variability (Figure SPM.1a). Due to this natural variability, trends based on short records are very sensitive to the
beginning and end dates and do not in general reflect long-term climate trends. As one example, the rate of warming over
1 Each finding is grounded in an evaluation of underlying evidence and agreement. In many cases, a synthesis of evidence and agreement suppo.
2. Climate Change at the US City level
Mining the climate data from the National Oceanic and Atmospheric Administration (NOAA), I found 24 cities with yearly temperature records going back to 1895.
The temperature data is extremely volatile. To uncover the underlying trend and smooth out the volatility, I took the following steps:
1) I calculated a 10-year moving average on a yearly basis, so the first data point falls in 1904 and reflects the average temperature over the 1895 – 1904 period;
2) I took the 10-year difference of the moving averages from the first step, so the first data point now falls in 1914 and reflects the decadal change in temperature between 1904 and 1914;
3) I fit a LOESS regression which, instead of a straight trend line, fits a curve that follows this nonlinear data set much better. LOESS makes it possible to observe the changing trend over time (see the M vs. V patterns later in this presentation) as well as the most relevant trend over the most recent 10 years.
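The three smoothing steps above can be sketched in Python. Since the NOAA download itself is not reproduced here, a synthetic yearly series stands in for a city's record, and the `loess` helper below is a minimal tricube local-linear smoother for illustration, not necessarily the exact LOESS implementation used in the study.

```python
import numpy as np
import pandas as pd

def loess(x, y, frac=0.3):
    """Minimal tricube-weighted local linear smoother (illustrative)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = len(x)
    k = max(3, int(np.ceil(frac * n)))               # neighborhood size
    out = np.empty(n)
    for i in range(n):
        d = np.abs(x - x[i])
        idx = np.argsort(d)[:k]                      # k nearest neighbors
        w = (1 - (d[idx] / d[idx].max()) ** 3) ** 3  # tricube weights
        coef = np.polyfit(x[idx], y[idx], 1, w=np.sqrt(w))
        out[i] = np.polyval(coef, x[i])
    return out

# Synthetic stand-in for a city's yearly average temperature, 1895-2021.
rng = np.random.default_rng(0)
years = np.arange(1895, 2022)
temps = pd.Series(56 + 0.01 * (years - 1895) + rng.normal(0, 1, len(years)),
                  index=years)

ma10 = temps.rolling(10).mean()    # step 1: first data point in 1904
diff10 = ma10.diff(10)             # step 2: first data point in 1914
d = diff10.dropna()
trend = loess(d.index, d.values)   # step 3: smoothed decadal change
```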
3. The Cities with data going back to 1895
24 cities from 20 different States, including three in New York and four in Texas.
4. How does the Intergovernmental Panel on Climate Change (IPCC) look at temperature?
The IPCC looks at global yearly temperature data as the anomaly, or increase, over the average temperature during the 1850 – 1900 period.
It indicates that current temperatures are about 1 degree Celsius above that level, and that to maintain a relatively unimpaired environment we should keep this temperature increase at or below 1.5 degrees Celsius by the end of this century. This seems extremely challenging at best.
The IPCC indicates that a rise in temperature greater than 2 degrees would already be somewhat adverse for the environment, and that an increase near or above 3 degrees could be even more adverse. The IPCC has generated scenarios where temperatures could potentially increase by 4, 5, or even 6 degrees Celsius by 2100. Such temperature increases would be associated with a progressively more devastating impact on the environment.
5. A look at US cities temperature history
Focusing on the 24 US cities, I used as a baseline the average over the 1895 – 1920 period, because the NOAA data at the city level does not go back further than 1895. The 1895 – 1920 period also did not show much temperature increase.
This framework is roughly comparable to the IPCC's at the global level.
In view of the above, it is concerning that 18 of the 24 cities have already experienced temperature increases greater than 1.5 degrees Celsius; 11 over 2 degrees Celsius; and even 1 over 3 degrees Celsius. And that is right now!
All of these temperature increases are very likely to be much greater by 2100.
Data source: NOAA
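As a minimal sketch of the baseline framework just described (with a synthetic series standing in for a real city's NOAA record), the anomaly is simply a recent average minus the 1895 – 1920 average:

```python
import numpy as np

# Synthetic yearly series standing in for one city's NOAA record.
rng = np.random.default_rng(3)
years = np.arange(1895, 2022)
temps = 55 + 0.012 * (years - 1895) + rng.normal(0, 0.8, len(years))

baseline = temps[(years >= 1895) & (years <= 1920)].mean()  # 1895-1920 average
recent = temps[years >= 2012].mean()                        # most recent 10 years
anomaly = recent - baseline                                 # increase over baseline
```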
6. Ranking cities by most recent 10-year temperature increase using LOESS estimates
I grouped the cities into 6 categories by most recent temperature increase over the past 10 years (using LOESS-smoothed estimates). I also looked at whether this temperature increase appeared to be rising, decreasing, or flat.
Where I indicated "Decreasing?", the question mark signals much uncertainty about the trend.
10-year most recent temperature increase using LOESS est.
Data source: NOAA
7. Temperature increase estimates by 2100
Here I simply take the LOESS-estimated temperature increase over the most recent 10 years (also shown on the previous slide), assume that this pace remains constant until 2100, and calculate the temperature increase since the onset, i.e. the average over the 1895 – 1920 period.
For instance, for the top line, Fresno, the temperature has already increased by 2.2 degrees Celsius since the 1895 – 1920 average. The LOESS estimates for the most recent 10 years indicate that temperatures keep rising by 0.89 degrees Celsius per decade. Projecting this current pace forward results in an estimated temperature increase of 9.2 degrees Celsius by 2100 over the 1895 – 1920 baseline. As shown, this would describe a devastating scenario for Fresno's environment.
Only 3 cities would remain within reasonably benign scenarios (blue and green zones). All other cities would face progressively more adverse to ultimately devastating scenarios as you move up from the yellow to the deep red zone.
Data source: NOAA
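The constant-pace arithmetic behind the Fresno example works out as follows (figures as quoted above):

```python
# Fresno: 2.2 °C already above the 1895-1920 baseline, rising at a LOESS
# pace of 0.89 °C per decade, projected at a constant pace from 2021 to 2100.
increase_so_far = 2.2             # °C above baseline as of 2021
pace_per_decade = 0.89            # °C per decade, LOESS estimate
decades_remaining = (2100 - 2021) / 10

increase_by_2100 = increase_so_far + pace_per_decade * decades_remaining
print(round(increase_by_2100, 1))  # → 9.2
```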
8. Temperature increase between now (2021) and 2100
We are looking at four different forecasts, or scenarios, of temperature increase between now and 2100, measured in degrees Celsius.
The first forecast is based on our LOESS estimates; the second is based on simply projecting the average long-term historical trend.
The next two forecasts are from the NOAA Climate Explorer, which takes a weighted average of the output of 32 climate-change models. They model the yearly average daily maximum temperature. The NOAA models forecast at the county level and aggregate US cities within those counties.
The NOAA Higher forecast is associated with a continuation of greenhouse gas emissions. The NOAA Lower forecast assumes emissions peak in 2040 and drop very quickly thereafter.
I feel the divergence in measurements does not affect the relevance of the comparison between the LOESS and NOAA models.
Data source: NOAA
9. Focusing on the two most interesting scenarios
I eliminated two scenarios because they appear highly unlikely.
I eliminated the "Historical trend" scenario because its long-term (linear) trend was biased downward by the effect of SO2 emissions, which caused a rapid drop in temperature between 1935 and 1970 (more on this phenomenon later in this presentation). This scenario therefore fails to capture the contemporary trend.
I eliminated the NOAA Lower scenario because its assumption that greenhouse gas emissions peak in 2040 and rapidly decline thereafter is too optimistic.
We will study the divergence between the LOESS and NOAA models later in this presentation.
Data source: NOAA
10. Observing the historical temperature data in degrees Fahrenheit with LOESS estimates and Confidence Intervals, to get an idea of the underlying trend, including directional shifts over time (up or down)
11. Anatomy of a graph
The volatile black line depicts the 10-year change in temperature (Fahrenheit), based on 10-year moving averages.
The blue line is this 10-year change smoothed out using LOESS regression estimates.
The light blue zone around the blue line depicts a 0.9, or 90%, Confidence Interval around the LOESS point estimates.
Data source: NOAA
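The slides do not say how the 90% band is computed; one common approach is a residual bootstrap around the smoothed curve. The sketch below uses a simple moving-average smoother as a stand-in for LOESS and synthetic data, purely to illustrate the idea:

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.arange(1914, 2022, dtype=float)             # years with decadal-change data
y = 0.3 * np.sin((x - 1914) / 15) + rng.normal(0, 0.2, len(x))

def smooth(v, win=15):
    """Centered moving average (illustrative stand-in for LOESS)."""
    pad = np.pad(v, win // 2, mode="edge")
    return np.convolve(pad, np.ones(win) / win, mode="valid")[: len(v)]

fit = smooth(y)
resid = y - fit

# Re-fit on bootstrapped residuals; the 5th/95th percentiles give a 90% band.
boots = np.array([smooth(fit + rng.choice(resid, len(resid)))
                  for _ in range(200)])
lo, hi = np.percentile(boots, [5, 95], axis=0)
```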
12. Three different trend patterns: M, V, J
The 24 cities' temperature increase trends typically fall into one of three patterns: M, V, or J.
Data source: NOAA
13. All patterns show a common decline in temperature from about 1935 to 1970, attributable to a rise in sulfur dioxide (SO2), and an increase in temperature after 1970 driven, in part, by a decline in SO2.
SO2 emissions result from volcanic eruptions, forest fires, coal-fired electricity generation, petroleum extraction, and manufacturing.
SO2 emissions cause acid rain and ozone layer depletion.
In the 1960s and 70s, the US, Canada, and Europe passed environmental regulations that rapidly curbed SO2 emissions within these regions.
14. Source: armstrongeconomics.com
Source: "The Problem with Sulfur Dioxide". September 9, 2016. cullycleanair.org
SO2 emissions have rapidly declined in the U.S., but the distribution of SO2 emissions is very uneven.
Although higher SO2 emissions appear concentrated in the Midwest, South, and East, they also appear concentrated around cities elsewhere. This disparity in local SO2 emissions may explain some of the divergence in cities' temperature-increase patterns since 1970.
15. Greater than 1 Degree Fahrenheit over 10 years (as specified) and Rising
These cities are in very serious trouble, with rapidly rising temperatures. They are already relatively hot, with average temperatures of 67, 71, and 74 degrees F respectively over the past 5 years (2017 – 2021), vs. 56 degrees F for the 24-city average. If temperatures continue rising at the current pace over the next 40 years, Fresno's temperature could rise by 6.4 degrees Fahrenheit, and the two Texas cities' temperatures by about 4.4 degrees. These estimates assume the temperature increase continues at the current pace; the LOESS trend estimates suggest temperatures could rise even faster. This is an alarming prospect.
Data source: NOAA
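As a cross-check of the Fresno figure above (and consistent with the 0.89 degrees Celsius per decade LOESS pace quoted earlier in the deck, which is an assumption linking the two slides), converting the pace to Fahrenheit and projecting over four decades:

```python
pace_c_per_decade = 0.89                         # LOESS pace for Fresno, °C/decade
pace_f_per_decade = pace_c_per_decade * 9 / 5    # temperature *difference* °C → °F
rise_over_40_years = pace_f_per_decade * 4       # four decades

print(round(rise_over_40_years, 1))  # → 6.4
```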
16. > 0.75 < 1.00 Degree Fahrenheit over 10 years (as specified) and Rising
These cities are also in trouble with rising temperatures. Their respective average temperatures over the past 5 years (2017 – 2021) are: Albany 51, Baton Rouge 70, and New York 56 degrees F. If temperatures keep rising at the current pace, these cities would experience a rise of 3.0 to 3.8 degrees Fahrenheit. But the rising LOESS trend estimates suggest the temperature increases could be much higher. That is a concerning outlook.
Data source: NOAA
17. > 0.75 < 1.00 Degree Fahrenheit over 10 years and Flat trend
Burlington has a couple of positive factors, including a relatively cool average temperature over the past 5 years (48 degrees) and a LOESS trend suggesting that the current pace of temperature increase is flat. Still, the current pace would result in a 3.3 degree Fahrenheit rise over the next 40 years. Burlington would still remain a very livable city, though the economies of nearby ski resorts may be impaired during the winter season.
Data source: NOAA
18. > 0.50 < 0.75 Degree Fahrenheit over 10 years (as specified) and Rising
The three cities have average temperatures of 48, 51, and 62 degrees F respectively. Their respective paces of temperature increase, as specified, are 0.73, 0.63, and 0.53 degrees F over 10 years. Fortunately, the warmest of the three (Greenville) has the lowest pace of increase. At the current pace, Greenville's temperature could increase by 2.1 degrees Fahrenheit over the next 40 years; the other two cities could see increases in the 2.5 to near 3.0 range. Of concern, however, is that all of these increases could accelerate judging by the LOESS trend estimates.
Data source: NOAA
19. > 0.50 < 0.75 Degree Fahrenheit over 10 years (as specified) and Flat
Roswell has an average temperature of 64 degrees F over the past five years; Buffalo, 51 degrees F. At the current pace, both cities may experience a temperature increase of about 2.4 degrees Fahrenheit. Both trends are relatively flat, and the prospective temperature increases appear manageable.
Data source: NOAA
20. > 0.30 < 0.50 Degree Fahrenheit over 10 years (as specified) and Decreasing?
These cities are interesting because the decreasing trend in the LOESS estimates appears to diverge from the underlying data. This is especially true for Reno, where the most recent 10 years show a temperature rise greater than 1.5 degrees F, rising rapidly relative to earlier data points, while the LOESS estimates show a rise of less than 0.50 over the same period. The two cities have temperate averages of 56 and 47 degrees F, respectively. At the current (LOESS) pace, their temperatures could rise by a moderate 2 degrees F over the next 40 years. However, the LOESS curves could turn back up in the next few years and point to a far more rapid increase (especially for Reno).
Data source: NOAA
21. > 0.30 < 0.50 Degree Fahrenheit over 10 years (as specified) and Decreasing
These cities are less exposed to climate change. Over the next 40 years, their respective temperatures may increase by about 1.3 degrees Fahrenheit or less, given that the LOESS estimates are trending downward.
Data source: NOAA
22. > 0.30 < 0.50 Degree Fahrenheit over 10 years (as specified) and Rising
Cheyenne has a mild average temperature of 48 degrees F over the past five years. Its current pace of temperature increase is a moderate 0.37 degrees F based on LOESS estimates; the underlying data, however, puts it close to a full degree F. The rising trend, and the divergence between the LOESS estimates and the underlying data, make the prospective rise over the next 40 years rather uncertain. Simply projecting the two different paces (LOESS vs. underlying data), temperature could rise between 1.4 and 3.6 degrees F over the next 40 years.
Data source: NOAA
23. > 0.15 < 0.30 Degree Fahrenheit over 10 years (as specified) and Decreasing
These three cities do not appear overly exposed to climate change. At the current pace, they could incur a 1 degree Fahrenheit increase over the next 40 years, or less, given the decreasing trends in the LOESS estimates.
Data source: NOAA
24. > 0.15 < 0.30 Degree and Flat or Rising
Spokane is another city associated with much uncertainty. Its LOESS estimates suggest a modest 0.23 degree F rise per decade, or less than 1 degree Fahrenheit over 40 years. But the underlying data show a most recent 10-year change of about 1.4 degrees F. This suggests a very broad range of prospective temperature increase over the next 40 years, from 1 to 6 degrees F, ranging from the benign to the alarming.
Data source: NOAA
25. < 0.15 Decreasing or Rising?
Another pair of interesting cities. The LOESS trend estimates suggest that these two cities will not experience much, if any, temperature increase over the next few decades if the current pace and LOESS trends persist. However, the underlying data show rapidly rising temperatures, with a current pace over the past 10 years close to 1 degree Fahrenheit, or 4 degrees over the next 40 years.
Data source: NOAA
26. < 0.10 Degree Fahrenheit over 10 years (as specified) and Decreasing
Aberdeen also comes with a bit of uncertainty. The LOESS trend suggests an ongoing small decline in temperature going forward, while the most recent underlying data suggest an increase of about 0.35 degrees Fahrenheit over 10 years. That would translate into a 1.4 degree F increase over the next 40 years.
Data source: NOAA
27. Revisiting a comparison between the LOESS and the NOAA Higher forecasts for temperature increase out to 2100 vs. history
Data source: NOAA
28. Explaining the NOAA Higher forecast within The Climate Explorer graph
Source: NOAA
The NOAA Higher forecast is represented by the red line. It is the weighted-average aggregation of 32 separate climate models that forecast temperatures (F) at the county level. This is the forecast I use throughout this presentation.
For an explanation of the other elements of this graph, please refer to the next slide, which reproduces NOAA's own explanation.
30. LOESS vs. NOAA forecasts vs. history: Aberdeen vs. Fresno
Based on historical and current trends, there is a huge divergence between the two cities: Aberdeen's temperatures appear to be clearly declining, while Fresno's are exploding upward.
The LOESS model reflects this divergence out to 2100, with Fresno's temperature rising by 7 degrees Celsius and Aberdeen's decreasing by 1.3 degrees Celsius. Meanwhile, the NOAA models completely miss this divergence and actually forecast that Aberdeen's temperature will rise faster than Fresno's (+ 4.6 C and + 3.8 C, respectively).
Data source: NOAA
31. LOESS vs. NOAA forecasts vs. history: Corpus Christi vs. San Antonio
The two cities display the archetypal M and J patterns (M reflecting a temperature increase that slows over the most recent 10-year periods; J, one that accelerates upward).
The NOAA models miss much of this large divergence, as they have Corpus Christi and San Antonio temperatures increasing by similar amounts by 2100, at + 3.4 and + 4.5 degrees Celsius respectively. Meanwhile, the LOESS model fully captures the divergence, with + 0.9 C for Corpus Christi vs. + 5.0 C for San Antonio.
Notice that the LOESS and NOAA models are fairly close regarding San Antonio: + 5.0 C vs. + 4.5 C, respectively.
32. LOESS vs. NOAA forecasts vs. history: Great Falls vs. El Paso
Another classic comparison, between an M and a hybrid J/V shape. Great Falls shows a declining rate of temperature increase in the most recent period, while El Paso shows an ongoing rapid increase.
The LOESS model captures that divergence, with + 1.1 C for Great Falls and + 4.6 C for El Paso. Meanwhile, the NOAA models make little distinction between the two, with both around + 4.3 C to + 4.4 C over the same period.
Notice that the LOESS and NOAA models are very close regarding El Paso: + 4.6 C vs. + 4.4 C, respectively.
Data source: NOAA
33. LOESS vs. NOAA forecasts vs. history: Lexington vs. Albany
Nearly the same comment as for Great Falls vs. El Paso.
LOESS reflects the divergence in trends (M vs. hybrid M/V), estimating the temperature increase at + 1.2 C for Lexington and + 4.1 C for Albany. Meanwhile, the NOAA models make little distinction between the two cities (about + 4.3 C to + 4.4 C).
The LOESS and NOAA models are very close for Albany: + 4.1 C vs. + 4.4 C, respectively.
Data source: NOAA
34. LOESS vs. NOAA forecasts vs. history: Minneapolis vs. Baton Rouge
Nearly the same comment as for Great Falls vs. El Paso.
LOESS reflects the divergence in trends (M vs. V), estimating the temperature increase at + 1.4 C for Minneapolis and + 3.5 C for Baton Rouge. Meanwhile, the NOAA models get the divergence between the two cities in the wrong direction, with Minneapolis coming in much higher at + 4.7 C and Baton Rouge much lower at + 3.4 C.
The LOESS and NOAA models have almost the same estimate for Baton Rouge: + 3.5 C and + 3.4 C, respectively.
Data source: NOAA
35. LOESS vs. NOAA forecasts vs. history: Cape Hatteras vs. New York
In the most recent periods, the temperature increase is decelerating for Cape Hatteras and accelerating for New York.
The LOESS model captures that divergence, with + 1.5 C for Cape Hatteras vs. + 3.4 C for New York.
The NOAA models do not capture much of this divergence, with + 3.3 C for Cape Hatteras vs. + 4.2 C for New York.
Data source: NOAA
36. LOESS vs. NOAA forecasts vs. history: Cheyenne vs. Portland
From the outset, temperatures in Portland have been rising much more consistently than in Cheyenne.
The LOESS model captures that nuance, with Cheyenne coming in at + 1.6 C and Portland at + 3.2 C.
The NOAA models do not differentiate much between the two, with Cheyenne at + 4.5 C and Portland at + 4.4 C.
Data source: NOAA
37. LOESS vs. NOAA forecasts vs. history: Green Bay vs. Greenville
This is a classic M vs. V pattern. The LOESS model correctly estimates that temperatures are rising more slowly for the M than for the V: Green Bay comes in at + 1.8 C and Greenville at + 2.3 C.
Meanwhile, the NOAA models get the divergence between the M and the V patterns backwards, with Green Bay coming in faster at + 4.7 C and Greenville slower at + 3.9 C.
Data source: NOAA
38. The NOAA models make very little differentiation between the 24 different cities (or counties), regardless of the marked divergence in the historical data.
A simple LOESS model differentiates far more between the cities.
39. LOESS-estimated temperature increases are on average a lot lower than the NOAA's. But the LOESS estimates are far more dispersed, with a far wider range of forecasts, while the NOAA forecasts all converge closely to their average of 4.2 degrees Celsius.
Data source: NOAA
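The dispersion claim can be checked against the city-level 2100 estimates quoted on the comparison slides. Note the NOAA values for Great Falls, Lexington, and Albany are read approximately from the text ("+ 4.3 C to + 4.4 C"), so those exact decimals are assumptions:

```python
import statistics as st

# City order: Fresno, Aberdeen, Corpus Christi, San Antonio, Great Falls,
# El Paso, Lexington, Albany, Minneapolis, Baton Rouge, Cape Hatteras,
# New York, Cheyenne, Portland, Green Bay, Greenville.
loess = [7.0, -1.3, 0.9, 5.0, 1.1, 4.6, 1.2, 4.1,
         1.4, 3.5, 1.5, 3.4, 1.6, 3.2, 1.8, 2.3]
noaa = [3.8, 4.6, 3.4, 4.5, 4.3, 4.4, 4.3, 4.4,
        4.7, 3.4, 3.3, 4.2, 4.5, 4.4, 4.7, 3.9]

print(round(st.mean(noaa), 1))     # matches the 4.2 °C average quoted above
print(round(st.stdev(loess), 1), round(st.stdev(noaa), 1))
```

The standard deviation of the LOESS forecasts comes out several times larger than that of the NOAA forecasts, which is the dispersion contrast the slide describes.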
40. Why are the two models' forecasts so different? They have very different structures.

Number of sub-models used:
- NOAA: the forecast is a weighted average of 32 models.
- LOESS: a single model for each city.

Geographic level:
- NOAA: cities are grouped at the county level.
- LOESS: city level.

Most recent date in the historical sample:
- NOAA: 2006, despite the model(s) being finalized in 2013 and refined in 2015.
- LOESS: 2021.

Variables used in the model:
- NOAA: a complex climate-change model that uses numerous greenhouse gas emissions as exogenous independent variables.
- LOESS: uses the historical temperature data as an endogenous variable.

How does the model forecast?
- NOAA: the 32 models make projections for all the greenhouse gases, then calculate the resulting temperature.
- LOESS: uses the calculated underlying trend in temperature rise over the most recent 10 years, and assumes that trend remains constant out to 2100.
41. LOESS model strength
More often than not, the LOESS model captures the historical trend, as well as the most relevant contemporary trend, in the data.
Its forecasts also clearly differentiate between two cities' diverging trends (M vs. V shapes). For the two cities compared earlier, the LOESS model estimated that temperatures out to 2100 would rise more slowly in Minneapolis (+ 1.4 C) than in Baton Rouge (+ 3.5 C). This seems like a reasonable forecast.
Data source: NOAA
42. LOESS Model Weaknesses
LOESS regression can overfit the historical data.
LOESS regression is not well suited to long-term forecasting (very few models are).
This LOESS model in some cases shows a lagging effect, whereby its underlying regression curve does not capture the trend in the most recent actual data points.
For Astoria and Aberdeen, the LOESS model suggests that the recent trend is associated with either a very low temperature increase or even a negative temperature change over the most recent 10 years, while the actual data show a very rapid change in temperature. Projecting this LOESS trend out to 2100 could understate the temperature increase by a huge amount.
As structured, this LOESS model does not use any exogenous greenhouse gas variables, so it cannot be sensitized to different greenhouse gas emissions scenarios.
Data source: NOAA
43. NOAA model(s) strengths
The NOAA model can sensitize its forecasts to different greenhouse gas emissions scenarios.
The 32-model approach makes it possible to generate very interesting uncertainty bands around the weighted-average estimates, a valuable substitute for Confidence Intervals around a single model's estimates.
This framework comes with spectacular data visualization.
Source: NOAA
44. NOAA Model(s) Weaknesses
Even though it is currently shown at the NOAA website as its most current version, this model relies on historical data that is already over 15 years out of date. Since then, temperature trends have changed a lot. Even more puzzling is that NOAA itself has historical data going out to 2021.
In view of the above, it is not surprising that the NOAA forecasts do not reflect contemporary historical trends.
Going back to Minneapolis and Baton Rouge: even though Baton Rouge's temperature has been rising much more rapidly than Minneapolis's since right around 2006 (when the NOAA model's historical data stops), this model forecasts that Minneapolis's temperature will rise faster than Baton Rouge's (+ 4.7 C vs. + 3.4 C, respectively).
Overall, this model makes little differentiation between the various counties/cities (see the box plot to the right). For all its complexity (an aggregation of 32 models), NOAA could just about have taken the average of + 4.2 C out to 2100, and it would pretty much have matched the forecast for any of the mentioned cities.
Data source: NOAA
45. NOAA Model(s) Weaknesses continued
If you look closely at the historical visual data (for New York in this specific case), the NOAA model does not appear to fit it well. NOAA does not disclose the weighted-average estimates over the historical period, which should have been represented by a black line; it shows only the minimum and maximum of the 32 models.
If the NOAA model fit were good, the minimum and maximum estimates would be roughly symmetric around the historical temperature data points. They often are not, which reflects a poor historical fit.
Source: NOAA
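The symmetry diagnostic can be illustrated with a toy example (all numbers synthetic, not NOAA output): for a well-fitting ensemble, observations should sit near the midpoint of the min-max envelope, so the signed offsets should cluster around zero.

```python
import numpy as np

rng = np.random.default_rng(2)
obs = rng.normal(60, 1, 50)                   # observed temperatures (toy)
band_min = obs - rng.uniform(0.5, 1.5, 50)    # ensemble minimum (toy)
band_max = obs + rng.uniform(0.5, 1.5, 50)    # ensemble maximum (toy)

offset = obs - (band_min + band_max) / 2      # signed asymmetry per year
print(round(abs(offset.mean()), 2))           # near 0 for a symmetric band
```

A persistent nonzero mean offset, as the slide argues for the actual NOAA envelope, would indicate the band is systematically shifted relative to the data.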
46. Considerations
Complex does not mean better. The NOAA model(s) fails on a couple of critical counts: first, it does not appear to fit the historical data well enough to make for relevant forecasts; second, it does not differentiate much between the various counties/cities.
The LOESS model, as mentioned, carries all the limitations of LOESS. It may overfit the historical data, and it is not really appropriate for generating long-term forecasts (as stated, few models are).
Nevertheless, this far simpler LOESS model appears to fit the historical data better, and differentiates far more between cities, than the NOAA model(s).
This is not a ratification of the LOESS methodology so much as a criticism of the NOAA model(s), which needs a complete revision to resolve the deficiencies mentioned.