Climate Change: Science Versus Consensus and Alarmism. A Court Case Reveals the Falsehoods in Al Gore's "An Inconvenient Truth", Climate Change Reality, the Role of the Sun, and Energy Solutions. Matt Bailey, Desert Research Institute
This PowerPoint is not intended as a lecture presentation (it's too long, with a lot of text). It is meant to be a condensed resource for those interested in the counter-arguments against global warming alarmism (not against global warming itself; it is a big subject). It is divided into four sections: an introduction with some background and philosophy of science; the court case in the UK that refuted nearly all of Al Gore's An Inconvenient Truth; the presentation of the non-alarmist, or rational, scientific view; and finally, a conclusion that considers solutions to global pollution.
The data presented here are taken from peer-reviewed journals and presentations by qualified climate scientists, along with material from the United Nations Intergovernmental Panel on Climate Change (IPCC). Journal citations, sources, and links are listed whenever possible so that the reader can verify the data as presented.
I am not a formal climatologist, but few in the field of climate change are. I am a physicist by training, and currently do research in cloud physics and lightning. There are many sub-disciplines of research relevant to climate change, including meteorology, climatology, atmospheric chemistry and physics, geology, paleoceanography, astrophysics, quaternary science, statistics, and modeling.
Disclaimer: Opinions are the author’s and are not necessarily shared by the Desert Research Institute, “but they should be” (paraphrasing Bob Parks, U. Maryland).
Section 1 (turquoise slides) – Background: climate science distortion, skepticism, principles of science (a short intro, then lots of facts).
Section 2 (green slides) - The inaccuracies covered in the court case that challenged the mandated showing of An Inconvenient Truth to elementary school children in the UK: Kilimanjaro, CO2 and temperature, hurricanes, Lake Chad, drowning polar bears, shutdown of the Gulf Stream, loss of coral, sea level rise and Greenland-Antarctica, sea level rise and islands.
Section 3 (yellow slides) - Historical temperatures (distant past to the last decade), temperature variability, the urban heat island effect and errors in US and global temperature measurements, non-greenhouse-gas heating due to smog particles in the lower atmosphere (China, India, Asia, and underdeveloped countries), the role of black carbon (soot particles) in Arctic ice loss, recent warming and solar activity versus CO2, clouds and climate, the Sun's role in temperature change, the Sun's effect on climate via cosmic rays and cloud formation, and the near future.
Section 4 (turquoise slides) - Conclusions: temperature, CO2 versus the Sun, future projection, solutions (practical versus idealistic), arguments for sustainable nuclear energy, meltdown-proof nuclear reactors, energy statistics, radiation health statistics, websites.
Science versus Consensus and Alarmism A little patience here, please. The process of science needs to be discussed.
Global Warming is a Fact! Contrary to the claims of anthropogenic global warming (AGW) alarmists, few scientists deny that the Earth has been warming for about the last 150 years, following the 450-year "Mini-Ice Age". Global warming skeptics, or rationalists, hold that the warming is largely due to natural variations in the Earth's climate, for which there is ample evidence. Predictions and claims of alarming climate change come almost exclusively from crude or overly simplistic computer models, not from climate data, which show essentially no change in global temperature since the mid-1990s. Recent and past analyses of climate data have revealed that the statistical methods and models used by those who promote AGW are highly flawed, and some model results were intentionally manipulated so as to give the impression of AGW. Mark Twain once said, "There are three kinds of lies: lies, damned lies, and statistics." While statistics are important to any scientific argument, they have their limits. This presentation will not bombard the reader with statistical arguments, but it will show data and uncertainties that the layman and scientist alike can appreciate, from which rational conclusions can be drawn.
Science is constrained by data, but models and opinions are not. Since the early 1990s, the IPCC has issued a number of reports containing model predictions of global climate change. These models are incomplete because of a general lack of knowledge concerning key factors that determine the temperature of the atmosphere, among them the Sun's energy input and the response of the coupled ocean-atmosphere system. The models also rely on several untested assumptions concerning the strengths of factors that lead to warming or cooling, emphasizing only those that might lead to warming. When models do not agree with data, it is the model that gets changed, not the data. In the game of science, data always trump theory.
Skepticism In science, skepticism does not refer to the habitual denial of facts; it is instead one of the guiding principles of science. Scientists are trained to be skeptics. If a scientist makes a claim or presents a hypothesis, the burden is on that scientist to prove the claim with open data and analysis that can pass the scrutiny of fellow scientists, or "peers", in the field. Hypotheses are one thing, data are another. If the two don't support each other, it is the hypotheses that are suspect, not the data, and the proper scientific method requires that scientists be skeptical of the hypotheses. This is how science works!
The Skeptics I and many of my colleagues are "skeptics" of the global warming alarmist view because we are scientists. Contrary to what Al Gore claims, there are more than a "few dozen" of us. A petition signed by 400 top scientists was recently sent to the US Senate Environment and Public Works Committee, http://epw.senate.gov/public/index.cfm?FuseAction=Minority.SenateReport#report and over 31,000 have signed a petition protesting the alarmism being perpetrated upon the public. The current list of 31,072 petition signers includes 9,021 with PhD, 6,961 with MS, 2,240 with MD and DVM, and 12,850 with BS or equivalent academic degrees. Most of the MD and DVM signers also have underlying degrees in basic science. This stands in stark contradiction to the claim by Al Gore that a scientific consensus exists supporting man-caused or anthropogenic global warming. http://www.oism.org/pproject/s33p1845.htm Add to this the scientists who attended the 2008 International Conference on Climate Change (ICCC) and signed the Manhattan Declaration along with over 600 qualified supporters, and those who attended the 2009 ICCC in New York City. Many more have remained silent rather than risk their careers or the ire of colleagues and administrators. http://www.climatescienceinternational.org/
Consensus of the top scientists? Richard Lindzen, atmospheric physicist and Alfred P. Sloan Professor of Meteorology at MIT, is a former member of the IPCC and an author of past IPCC reports. In the program "The Great Global Warming Swindle", Lindzen points out that the IPCC's claim that its reports are endorsed by "2500 top scientists" simply isn't true. Many of the members of the IPCC are not scientists at all, but are instead political appointees from the member countries. Some of the scientists on the IPCC's list are no longer members of the IPCC, having left the panel or quit, often in protest, yet their names continue to appear on the reports. The IPCC reports are written by a relatively small number of qualified scientists and endorsed by a much larger number of mostly unqualified appointees, "none of whom are asked if they agree or disagree with the report". http://en.wikipedia.org/wiki/Richard_Lindzen
The Debate Is In Full Swing The debate is in full swing, in spite of Gore's proclamations to the contrary. However, many scientists who have embraced the consensus claimed by Gore have obviously not taken the time to look at the data, and have instead accepted the opinions of the alarmists based simply on what they heard in the media or from a few vocal advocates of the alarmist view, whom the media were quick to feature in sensationalist reports. To quote Richard Lindzen: "... there are the numerous well meaning individuals who have allowed propagandists to convince them that in accepting the alarmist view of anthropogenic climate change, they are displaying intelligence and virtue." Many of these well-meaning individuals are politicians or educators in our elementary schools and universities, and many of them have made this a political as well as a moral issue.
Consensus The late Michael Crichton (best known for novels like Jurassic Park and State of Fear, but also a graduate of Harvard Medical School and a former postdoctoral fellow at the Salk Institute for Biological Studies) said it best when he warned his audience of the dangers of "consensus science" in a 2003 speech. In part, he said: "Historically, the claim of consensus is a way to avoid debate by claiming that the matter is already settled... Let's be clear: the work of science has nothing whatever to do with consensus. Consensus is the business of politics. Science, on the contrary, requires only one investigator who happens to be right, which means that he or she has results that are verifiable by reference to the real world. In science consensus is irrelevant. What is relevant is reproducible results. The greatest scientists in history are great precisely because they broke with the consensus."
The Story Has Changed 2007, 2008, and 2009 saw a surge of new studies published in peer-reviewed journals that refute or disprove many of the claims constituting the "consensus" view of global warming promoted by the IPCC (I have cited these sources or provided links throughout this presentation). This is also the case with many publications from professional organizations such as the American Geophysical Union (AGU), the American Meteorological Society (AMS), and the University Corporation for Atmospheric Research (UCAR), to name a few. The public is mostly in the dark concerning these recent revelations, and the alarmist view has received so much press, and makes such a good story, that it has become dogma.
Scientific Distortion The media and politicians are notorious for distorting the facts in order to serve their own agendas. "An Inconvenient Truth" by Al Gore is an extreme example of just how badly the unqualified can twist what scientists actually know (the movie even distorts what the IPCC has published). This problem is clearly revealed in the book "Meltdown: The Predictable Distortion of Global Warming by Scientists, Politicians, and the Media" by Patrick Michaels (reviewed and highly recommended in the Bulletin of the American Meteorological Society).
Scientific Distortion Some scientists are also to blame. As John Christy* notes in "The Great Global Warming Swindle", many climate scientists have a vested interest in the hysteria because this is how they get funding. There have been at least 15 years of funding proposal announcements whose stated goal was to find possible anthropogenic causes of global warming. As Richard Lindzen noted in the GGWS, the one thing you don't ever report is that what you found might not be a problem, because then you would not receive any more funding. *John Christy is Distinguished Professor of Atmospheric Science and Director of the Earth System Science Center at the University of Alabama in Huntsville (UAH). For his development of a global temperature data set from satellites, he was awarded NASA's Medal for Exceptional Scientific Achievement and the American Meteorological Society's (AMS) "Special Award." In 2002, Christy was elected a Fellow of the AMS. In 2007, Christy declined association with the Nobel Peace Prize awarded jointly to the members of the IPCC and Al Gore.
It doesn't take a scientist to recognize the trend in the data at left. A scandal erupted in November 2009 over intercepted emails concerning the manipulation of climate data by the Climatic Research Unit (CRU) at the University of East Anglia (UEA) in the UK, a climate institute that feeds models and predictions to the United Nations Intergovernmental Panel on Climate Change, the IPCC. Emails from Kevin Trenberth, head of the Climate Analysis Section at the National Center for Atmospheric Research (NCAR) in the US, state "the fact is that we can't account for the lack of warming at the moment and it's a travesty that we can't." The "moment" is over 10 years long. Additionally, from Phil Jones at UEA: "I've just completed Mike's trick... to hide the decline" of the last 6 years. (More about these graphs later!) Magenta and blue lines: temperature anomaly data. Green line: CO2 concentrations in parts per million.
Revelations of Fraud Many scientists disagree with the alarmist interpretation of climate change, but their opinions are rarely heard in the mainstream media, which supported the alarmist position before listening to the dissenting point of view of a vast number of qualified scientists and climatologists. It comes as no surprise that the media are mostly silent on the recent revelations concerning climate data manipulation referred to as "Climategate". This is in addition to the silencing of Alan Carlin of the EPA for his report challenging the EPA's own data concerning global warming and CO2, and the recently won court case that gave the EPA the authority to regulate CO2 as a pollutant. For the last several years, the public has been pummeled with special reports and multi-part series on the looming catastrophe of AGW (e.g. CNN's "Planet in Peril"). The media stuck their necks out supporting the AGW alarmists, and now their credibility is at risk. Don't expect them to admit any errors soon.
In case you missed this, since it got so little press: CBS News reported in June 2009 that a member of the EPA, Alan Carlin, was silenced when he submitted a report challenging the EPA's data and subsequent conclusions concerning CO2 and global warming. You can read Carlin's report at http://cei.org/cei_files/fm/active/0/DOC062509-004.pdf The Environmental Protection Agency may have suppressed an internal report that was skeptical of claims about global warming, including whether carbon dioxide must be strictly regulated by the federal government, according to a series of newly disclosed e-mail messages. Less than two weeks before the agency formally submitted its pro-regulation recommendation to the White House, an EPA center director quashed a 98-page report that warned against making hasty "decisions based on a scientific hypothesis that does not appear to explain most of the available data." The EPA official, Al McGartland, said in an e-mail message to a staff researcher on March 17: "The administrator and the administration has decided to move forward... and your comments do not help the legal or policy case for this decision." The e-mail correspondence raises questions about political interference in what was supposed to be an independent review process inside a federal agency, and echoes criticisms of the EPA under the Bush administration, which was accused of suppressing a pro-climate-change document. Alan Carlin, the primary author of the 98-page EPA report, told CBSNews.com in a telephone interview on Friday that his boss, McGartland, was being pressured himself. "It was his view that he either lost his job or he got me working on something else," Carlin said. "That was obviously coming from higher levels."
E-mail messages released this week show that Carlin was ordered not to "have any direct communication" with anyone outside his small group at EPA on the topic of climate change, and was informed that his report would not be shared with the agency group working on the topic. http://www.cbsnews.com/blogs/2009/06/26/politics/politicalhotsheet/entry5117890.shtml
The Earth's average temperature has increased over the last 150 years by about 1 degree Celsius (1.8 degrees Fahrenheit), following a 450-year Mini-Ice Age that saw some historically low temperatures (the Maunder Minimum) and regularly brutalized our ancestors with crop failures and famines. Since then, due to the warming of the climate and advances in technology and agriculture, human life expectancy has nearly doubled and agricultural production has quintupled, mirrored closely by world population growth. http://en.wikipedia.org/wiki/Maunder_minimum
The Frozen Thames, 1677 . Between 1400 and the nineteenth century there were a total of 23 documented winters in which the Thames froze over at London, 25 if you include "more or less frozen over" years which are shown in parentheses: 1408, 1435, 1506, 1514, 1537, 1565, 1595, 1608, 1621, 1635, 1649, 1655, 1663, 1666, 1677, 1684, 1695, 1709, 1716, 1740, (1768), 1776, (1785), 1795 and 1814.
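As an aside for readers who want to check the arithmetic, the freeze years quoted above can be tallied per century with a short script. This is just counting over the years listed in the slide (the parenthesized "more or less frozen" years are included, giving the total of 25).

```python
# Tally the documented Thames freeze years (listed above) by century.
# The two parenthesized "more or less frozen" years are included.
from collections import Counter

freeze_years = [1408, 1435, 1506, 1514, 1537, 1565, 1595, 1608, 1621, 1635,
                1649, 1655, 1663, 1666, 1677, 1684, 1695, 1709, 1716, 1740,
                1768, 1776, 1785, 1795, 1814]

# 1408 -> 15th century, 1677 -> 17th century, etc.
by_century = Counter(year // 100 + 1 for year in freeze_years)
for century in sorted(by_century):
    print(f"{century}th century: {by_century[century]} freeze(s)")
```

The tally makes the slide's point visible at a glance: the freezes cluster in the 17th and 18th centuries, the coldest stretch of the period described, and nearly vanish after 1814.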
The Earth cooled to its current state during a cycle of glacial-interglacial periods, beginning about 4 million years ago.
The average temperature has dropped by about 6 °C (about 11 °F), with the variation in temperature between warm and cold periods covering nearly 10 °C (about 18 °F).
The last interglacial period, the Eemian, was warmer than today, and sea levels were 4-6 meters higher than they are now, indicating greater deglaciation than today.
The Earth recently returned to its historical average temperature around 1980 after being below average for nearly 600 years. The Earth is presently about 0.2-0.6 °C above its average, if thermometer measurements are accurate. They are not, as will be shown!
Many of the graphs you will see, like the one above, plot temperature “anomalies” which are departures from the average temperature, itself an uncertain quantity.
Anthropogenic emissions of greenhouse gases (predominantly carbon dioxide) have increased since the industrial revolution, but only by a tiny amount compared to the natural levels of all greenhouse gases in the atmosphere, which are dominated by water vapor. Some current climate models claim that these emissions reached levels significant enough to cause a measurable effect on global temperature in the last few decades of the 20th century. However, 2007, 2008, and 2009 saw an increasing number of studies published in peer-reviewed journals indicating that the models have significantly overestimated the sensitivity of the climate to changes in carbon dioxide, as will be shown.
Like the temperature graph on the left, most temperature records plot "anomalies", or departures from an average temperature. The graph on the right is the same as that on the left, but shows the temperature variations with respect to the freezing point of water (0 °C or 32 °F), a temperature relevant to the massive polar regions. The choice of scale can have a dramatic effect on the impression imparted by a set of data. The mean or average temperature shown here is higher than some values found in the literature. A mean temperature includes geographically averaged air temperatures over the land (both desert and tundra) and the oceans (tropical and arctic), including seasonal variations, over a specified period of time, so average values can vary considerably. [GISS = NASA's Goddard Institute for Space Studies] GISS Global Mean Temperature 1880-2003
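For readers unfamiliar with anomaly plots, the arithmetic is simple: pick a baseline period, average it, and subtract that mean from every value. A minimal sketch (the temperatures below are made up for illustration, not real GISS data):

```python
# Minimal sketch of how a temperature "anomaly" series is formed:
# subtract a baseline-period mean from each absolute value.
# The numbers are hypothetical annual means in degrees C, not real data.
temps = [14.1, 13.9, 14.0, 14.2, 14.4, 14.6]

baseline = temps[:3]                       # choose the first 3 years as baseline
base_mean = sum(baseline) / len(baseline)  # 14.0 degrees C here

anomalies = [round(t - base_mean, 2) for t in temps]
print(anomalies)   # departures from the baseline mean
```

The same data plotted as anomalies (tenths of a degree on the y-axis) or as absolute temperatures (tens of degrees on the y-axis) will leave very different impressions, which is the point the slide makes about the choice of scale.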
Climate variability and change are the norm, not the exception. The Earth's climate history, both long and short term, is fairly well documented. Before the 1970s, the accepted theory was that the Sun was the main driving force behind climate change and variability, along with periodic changes in the Earth's orbit. Starting around 1800, the Earth slowly and erratically warmed for about 150 years, but then experienced a significant cooling between 1945 and 1975, even as carbon dioxide levels rose due to post-WWII reconstruction and economic expansion. These changes were well within the historical limits of climate variability; however, the cooling led some climate scientists to conclude that a new ice age was looming. Time Magazine published an article reflecting this prediction in June 1974. (http://www.time.com/time/magazine/article/0,9171,944914,00.html)
B.P. = before present
Estimated past temperatures from the 1990 IPCC Report, Figure 7.1.c (red), the MBH 1999 40-year average used in the IPCC TAR 2001 (blue), and the Moberg et al. 2005 low-frequency signal (black).

Past temperatures are reconstructed from a number of "proxies", measurable parameters that are assumed to reflect past temperatures. Tree ring analysis (dendrochronology) is one type of proxy: the width of tree rings is one indication of growth conditions, including rainfall and solar intensity. Another proxy, considered more accurate than most, including tree rings, is borehole analysis. Geologists and glaciologists have been drilling deep holes into the Earth's surface for many decades. The temperature gradient in these holes, the rate at which temperature changes with depth, preserves and reveals a long-term record of surface temperatures. The figure at upper left shows a curve in red from the IPCC's 1990 report that is a borehole record from Greenland revealing the pronounced Medieval Warm Period. The existence of this warm period has now been fully corroborated by several independent studies and other proxy analyses.

The middle figure shows the now-infamous "Hockey Stick" model reconstruction by Michael Mann et al. (1999), which has been shown to be highly flawed due to the biased selection of particular tree ring data and the method used for smoothing and fitting the data. Nevertheless, the IPCC has continued to include the Hockey Stick results in its models (right figure), the effect being that the intensity of both the Medieval Warm Period and the Little Ice Age is reduced, and the current instrumental temperatures, covering barely the past 100 years, are artificially elevated.

The "Hockey Stick" reconstruction of past temperatures by Mann et al. (1999) used selected tree ring data. The particular selection of data and the calculations leading to this result have been thoroughly discredited.
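One mechanism mentioned above, smoothing, is easy to demonstrate: a wide centered moving average mutes the peaks and troughs of any series. The sketch below uses purely synthetic data (a sine wave, not a climate reconstruction) just to show the damping effect itself.

```python
# Toy demonstration that a wide centered moving average damps the peaks
# and troughs of a series. Synthetic sine data only -- this illustrates
# the general smoothing effect, not any particular reconstruction.
import math

n = 200
series = [math.sin(2 * math.pi * i / 50) for i in range(n)]  # peaks near +/-1

def centered_moving_average(x, window):
    """Average each point with its neighbors inside a centered window."""
    half = window // 2
    return [sum(x[i - half:i + half + 1]) / window
            for i in range(half, len(x) - half)]

smoothed = centered_moving_average(series, 25)
print(max(series), max(smoothed))  # the smoothed peak is visibly smaller
```

With a 25-point window on a 50-point cycle, the peaks shrink to roughly two thirds of their original amplitude; the wider the window relative to the feature, the more the feature is flattened.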
A collection of temperature reconstructions that appear in the most recent IPCC report (the Fourth Assessment Report). Temperature Reconstructions
http://www.whoi.edu/page.do?pid=7545&tid=282&cid=59106&ct=162 Sea surface temperature reconstructions from the Indo-Pacific Warm Pool. Different colored symbols indicate data from different cores used in the reconstruction. A northern hemisphere temperature reconstruction from Mann et al. (2008) is shown in the black curve. The previously published data are from Newton et al. (2006). Colored lines are the average of the data points. Triangles at the bottom of the figure show where age control exists. The horizontal black line labeled 1997-2007 Mean Annual SST shows the value of the annual average sea surface temperature for that period. The Little Ice Age, which occurred around A.D. 1700, was a cool period, but its magnitude was only about 0.5 to 1 °C cooler than modern winter temperatures. Water temperature during the late Medieval Warm Period, between about A.D. 1000 and 1250, was within error of modern annual sea surface temperatures (Oppo, Rosenthal, and Linsley, 2009). So current sea surface temperatures are about equal to those of the Medieval Warm Period, long before man emitted any significant CO2. "The Woods Hole Oceanographic Institution embraces the Medieval Warm Period: contradicts Mann's proxy data."
In the mid-1970s, the Earth underwent one of its periodic climate shifts, reflected in the multi-decadal temperature oscillations of the Pacific Ocean basin, which covers half of the planet. The temperature began to increase again, and by the late 1980s some climate scientists began to warn of runaway global warming. Pollution was thought to be the culprit, i.e. the warming effects of greenhouse gases. Over the last 130 years, the Earth warmed, then cooled, and then warmed again. The Sun showed strikingly similar activity during this period, increasing, decreasing, and then increasing in radiant output, but its influence was dismissed in favor of computer models that stressed the estimated effects of carbon dioxide.
Al Gore told a very different story in An Inconvenient Truth!
Pacific Decadal Oscillations
A Debate Mostly Out of the Public View Al Gore and the alarmists captured the public's attention first, while scientists with opposing views mostly conducted the debate out of the public's view through the proper scientific channels, i.e. peer-reviewed journals, publications, and conferences. Coupling a polished video production with testimonials from an occasional "expert" or "leading scientist" from NASA or the IPCC, Gore went on a media blitz to promote his film, stating that his views represented the majority opinion, or consensus, of scientists in the field. A position of certainty was presented to the mostly uninformed public and media (always in search of a sensational story), who were unaware of the vast number of scientists with dissenting opinions. It is well known throughout human history, and especially in politics, that if you state distortions or lies often enough, they begin to take on an "aura of truth". The late Democratic senator Daniel Patrick Moynihan famously said that "people are entitled to their own opinions, but not their own facts".
A Nobel Prize but Not for Achievements in Science
In October 2007, the Nobel Committee jointly awarded the Peace Prize to the members* of the United Nations Intergovernmental Panel on Climate Change (IPCC) and Al Gore "for their efforts to build up and disseminate greater knowledge about man-made climate change, and to lay the foundations for the measures that are needed to counteract such change".
Just three days before this, a judge in the UK ruled that An Inconvenient Truth (AIT) contained 9** significant inaccuracies that were proven false by the scientific data or could not be substantiated, and "that the film is a political work and promotes only one side of the argument".
Some of the evidence the judge heard was drawn from data in the IPCC reports and was used to refute Al Gore’s claims.
*The award was given to past and present members of the IPCC, many of whom have quit the IPCC in protest, reporting that the IPCC lacks proper scientific credibility. John Christy refused the prize. **Not all of the inaccuracies in the film were fully considered by the court, as the judge requested a sample on which to decide the case. Professor Carter's witness statement lists 20 inaccuracies in the film.
The film claims that melting snows on Mount Kilimanjaro are evidence of global warming. The Government's expert was forced to concede that this is not correct.
The film suggests that evidence from ice cores proves that rising CO2 causes temperature increases over 650,000 years. The Court found that the film was misleading: over that period the rises in CO2 lagged behind the temperature rises by 800-2000 years.
The film uses emotive images of Hurricane Katrina and suggests that this has been caused by global warming. The Government’s expert had to accept that it was “not possible” to attribute one-off events to global warming.
The film shows the drying up of Lake Chad and claims that this was caused by global warming. The Government’s expert had to accept that this was not the case.
The film claims that a study showed that polar bears had drowned due to disappearing arctic ice. It turned out that Mr. Gore had misread the study: in fact four polar bears drowned and this was because of a particularly violent storm.
The film threatens that global warming could stop the Gulf Stream throwing Europe into an ice age: the Claimant’s evidence was that this was a scientific impossibility.
The film blames global warming for species losses including coral reef bleaching. The Government could not find any evidence to support this claim.
The film suggests that sea levels could rise by 7m causing the displacement of millions of people. In fact the evidence is that sea levels are expected to rise by about 40cm over the next hundred years and that there is no such threat of massive migration.
The film claims that rising sea levels have caused the evacuation of certain Pacific islands to New Zealand. The Government was unable to substantiate this, and the Court observed that this appears to be a false claim.
Inaccuracies in Al Gore's An Inconvenient Truth "The decision by the government to distribute Al Gore's film An Inconvenient Truth has been the subject of a legal action by New Party member Stewart Dimmock. The Court found that the film was misleading in nine respects and that the Guidance Notes drafted by the Education Secretary's advisors served only to exacerbate the political propaganda in the film." "In order for the film to be shown, the Government must first amend their Guidance Notes to Teachers to make clear that 1.) The Film is a political work and promotes only one side of the argument. 2.) If teachers present the Film without making this plain, they may be in breach of section 406 of the Education Act 1996 and guilty of political indoctrination. 3.) Nine inaccuracies have to be specifically drawn to the attention of school children." The inaccuracies are: From a judge in the UK, in response to the government's decision to show "An Inconvenient Truth" to elementary school children and the challenge to this effort by the parent of a student.
Inaccuracies in Al Gore's An Inconvenient Truth 1. "The film claims that melting snows on Mount Kilimanjaro are evidence of global warming. The Government's expert was forced to concede that this is not correct." Mt. Kilimanjaro is near the equator in Tanzania, with a peak elevation of 19,341 ft. East Central Africa, like Sub-Saharan Africa, has been experiencing decades of decreased temperatures and precipitation (click or roll forward). Glaciers, even if perpetually cold as in Antarctica, need to be fed with snowfall or they retreat. They can melt if they get warm, or they can evaporate (sublimate) due to sunlight, even when cold. A recent study by Philip Mote of the University of Washington in the United States and Georg Kaser of the University of Innsbruck in Austria indicates that the shrinking of Kilimanjaro's ice cap is not due to global warming but is instead driven by solar radiation. The glaciers on Kilimanjaro fluctuate considerably with the Earth's constantly changing climate, and the ice observed over the last century was partially deposited during a 450-year mini-ice age. Late in 2007, heavy snowfall led to a slight increase in Kilimanjaro's overall ice volume. (Follow the link below for more on this.) http://news.bbc.co.uk/2/hi/science/nature/6561527.stm
In An Inconvenient Truth, Al Gore told the story of a beloved old professor of his who showed him a graph of temperature and carbon dioxide over the last 600,000 years. He later stepped onto a cherry picker for dramatic effect, pointed to the temperature and CO2 curves, and asked the crowd, "do you think those two ever fit?" He told the crowd to notice that when CO2 was up, the temperature was up, and when CO2 was down, the temperature was down, implying that CO2 determines temperature.
"The film suggests that evidence from ice cores proves that rising CO2 causes temperature increases over 650,000 years. The Court found that the film was misleading: over that period the rises in CO2 lagged behind the temperature rises by 800-2000 years."
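The kind of lag described in the finding above is estimated by sliding one series against the other and looking for the offset that maximizes their correlation. A toy sketch with synthetic data (the series, the 8-step lag, and the noise-free setup are all invented for illustration; the 800-2000 year ice-core figure comes from the literature, not from this code):

```python
# Toy illustration of lag estimation between two time series by
# maximizing the correlation over candidate offsets. All data here
# are synthetic; only the method is being demonstrated.
import math

n = 300
temp = [math.sin(2 * math.pi * i / 100) for i in range(n)]
true_lag = 8   # pretend the second series responds 8 steps later
co2 = [temp[i - true_lag] if i >= true_lag else 0.0 for i in range(n)]

def corr(a, b):
    """Pearson correlation of two equal-length lists."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    da = math.sqrt(sum((x - ma) ** 2 for x in a))
    db = math.sqrt(sum((y - mb) ** 2 for y in b))
    return num / (da * db)

# Shift the leading series forward by L and correlate with the follower.
best_lag = max(range(30), key=lambda L: corr(temp[:n - L], co2[L:]))
print(best_lag)   # recovers the built-in lag of 8
```

On real proxy records the same idea is applied with dating uncertainties and noise, which is why the published lag is quoted as a range (800-2000 years) rather than a single number.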
650,000 years of climate history gleaned from the Vostok ice core obtained in Antarctica show that CO 2 has always lagged temperature change (the temperature increases first, then CO 2 follows). The solubility of gases in liquids is a function of temperature: colder liquids hold more gas (consider carbonated beverages). In the past, as oceans warmed, some dissolved CO 2 was released. Levels of CO 2 , methane, and other greenhouse gases also increased due to increased biological activity. In AIT, Al Gore placed the CO 2 graph over the temperature graph, climbed on a cherry picker for dramatic effect, and then asked the audience if they thought these two ever “fit” together. He did not superimpose them on each other, or draw the lines shown at left, as doing so would have obviously disproved his assertion.
In a recent television interview, when faced by a reporter with the court’s revelation about the Vostok ice core, and the peer reviewed literature showing that historical temperature increases lead carbon dioxide increases, and not the other way around as he indicated in AIT, Al Gore responded by saying “both assessments are correct”. Spoken like a true politician! To paraphrase a saying from Texas: never let facts get in the way of telling a good story. In all fairness, most climate models do predict that once warming begins (i.e. due to natural driving forces), the addition of greenhouse gases should add to the warming, but only by a small amount. The question in dispute is: by how much? It is becoming clear from several recent studies that the current climate models significantly overestimate the response of the atmosphere to the recent increases in carbon dioxide (discussed later in the presentation).
To quote professor Nir Shaviv of the Racah Institute of Physics (Israel) in the video “The Great Global Warming Swindle”, produced in the UK in late 2007: “there were periods in Earth’s history when the levels of carbon dioxide were 3 times higher, or 10 times higher than they are today (380 parts per million, or ppm), and if CO 2 has a large effect on climate, then you should see it in the temperature reconstruction”, which is not the case. Carbon dioxide has never driven the climate, not now, nor in the past. Late Carboniferous to Early Permian time (315 mya - 270 mya) is the only period in the last 600 million years when both atmospheric CO 2 and temperatures were as low as they are today (the Quaternary Period). [Figure annotations: “10 times higher than today, trees first appear”; “18 times higher than today”; “15°C = current temp.”; “dinosaurs”.]
In case you didn’t catch this press release in the US media:

China now no. 1 in CO 2 emissions; USA in second position. Press release, 19 June 2007, Netherlands Environmental Assessment Agency.

China’s 2006 CO 2 emissions surpassed those of the USA by 8%. This includes CO 2 emissions from industrial processes (cement production). With this, China tops the list of CO 2 -emitting countries for the first time. In 2005, CO 2 emissions from China were still 2% below those of the USA. These figures are based on a preliminary estimate by the Netherlands Environmental Assessment Agency (MNP), using recently published BP (British Petroleum) energy data and cement production data.

China, USA & EU: In 2006, China’s total CO 2 emissions from fossil fuels increased by 9%. In the USA in 2006, emissions decreased by 1.4% compared to 2005. In the European Union countries (the ‘EU 15’) in that same year, CO 2 emissions from fossil fuels remained more or less constant; in 2005 there was a decrease of 0.8%, according to a recent report by the EEA compiling data from the member states. The European Union is the third-largest carbon dioxide emitter, but its levels are about half of China's. The next biggest emitters are Russia, India, and Japan.

The lack of change in EU emissions shows that they are not meeting their Kyoto Protocol targets. Except for the underdeveloped countries, most of the signatory countries will not meet their goals by 2012 as set by the treaty, except by purchasing carbon emission credits from other countries whose emissions basically haven’t changed, so there will be no real change in CO 2 emissions.
Inaccuracies in Al Gore's An Inconvenient Truth 3. “The film uses emotive images of Hurricane Katrina and suggests that this has been caused by global warming. The Government’s expert had to accept that it was not possible to attribute one-off events to global warming”. Hurricane experts at the National Hurricane Center (NHC) in Florida routinely refute claims about the link between hurricanes and global warming, noting that hurricanes follow a multi-decadal cycle that does not appear to be tied to global warming. Hurricane frequency and strength depend on sea surface temperatures, which follow a 25-40 year cycle of minima and maxima as shown above (like the “AMO” and “NAO”). The Atlantic Basin is currently in a phase of higher hurricane frequency, as expected. [Figure annotation: average = 1.64 per year.]
The hurricanes of 2005, including Katrina, were often reported in the media as being due to global warming. In fact, that year’s hurricanes were not remarkable considering the long-term record kept by the National Hurricane Center (NHC). All of these hurricanes were category 3 (on a scale of 1-5) or less when they made landfall (Katrina had dropped to category 2 by the time it hit New Orleans). 2005 was remarkable in that it saw a record number of named tropical storms (27 storms weaker than category 1 hurricanes), 15 of which turned into hurricanes. Hurricane expert Kevin Trenberth at the National Center for Atmospheric Research in Boulder, Colorado, predicted that 2006 would see 17 named storms and 9 hurricanes. Instead, there were 9 named storms and 5 hurricanes (three cat. 1, two cat. 3). None made US landfall in 2006 (or in 2000)!
Hurricane predictions for 2007 were basically correct in terms of the number of named storms, but over-predicted the number of intense storms by about a factor of 2. Only two were intense hurricanes (they hit Mexico), and only one (Humberto, a category 1) briefly hit the US. 2008 saw 2 US landfall hurricanes, Gustav and Ike (category 2-3), half the number predicted by the experts. 2009 saw zero! Global warming may reduce the risk of landfall hurricanes (see NOAA link). http://www.noaanews.noaa.gov/stories2008/20080122_warmeroceans.html
An article from EOS, Transactions of the American Geophysical Union (AGU), a science publication that openly leans towards AGW alarmism. The final paragraph says it all, and this trend was a continuation from earlier decades up to 2005. More recently, the Army Corps of Engineers has been blamed for its channeling of the flow of the Mississippi River downstream from New Orleans.
But we are also told by Gore that droughts are increasing. Above is data on US droughts from the US National Climate Data Center (NCDC).
Gore also said that violent weather, such as floods, was increasing. Above is data on US rainfall from the US National Climate Data Center (NCDC).
“ The film shows the drying up of Lake Chad and claims that this was caused by global warming. The Government’s expert had to accept that this was not the case”.
(see story below).
Africa's Lake Chad Shrinks By 20 Times Due To Irrigation Demands, Climate

ScienceDaily (Mar. 1, 2001) — In the 1960s, North central Africa's Lake Chad was larger than the state of Vermont but is now smaller than Rhode Island. NASA-funded researchers using computer models and climate data now understand why Africa's freshwater Lake Chad has been disappearing over the last 30 years.

Using model and climate data, Coe and Foley calculate that a 30 percent decrease took place in the lake between 1966 and 1975 (the global temperature was in decline during this period). Irrigation only accounted for 5 percent of that decrease, with drier conditions accounting for the remainder. They noticed that irrigation demands increased four-fold between 1983 and 1994, accounting for 50 percent of the additional decrease in the size of the lake (click or scroll forward).

The Northern Africa Sahel region has experienced numerous devastating droughts over the last four decades. "Climate data has shown a great decrease in rainfall since the early 1960's largely due to a decrease in the number of large rainfall events," Coe said.
In “An Inconvenient Truth”, the retreat of the Aral Sea in Kazakhstan and Uzbekistan is portrayed as being caused by global warming. The Aral Sea has retreated, not because of global warming, but because the Soviets diverted the two rivers that feed it, starting in the 1940’s, in order to grow rice and cotton. The lake level progressively dropped (click), and the loss of clouds and precipitation normally generated by evaporation from the surface of the sea has led to desertification in the region.
From the World Wildlife Fund (WWF) Canada website: “Polar Bears in Canada: The current facts (January 2007). As some recent media reports have mistakenly cited incorrect facts about Canadian and circumarctic polar bears, WWF-Canada provides a brief summary of the most important facts about Canadian polar bears. In this way we hope that readers will be able to base their thinking, writing and decisions on accurate facts, not distorted information. Much of this information is contained in the recently published 190-page report from the World Conservation Union's Polar Bear Specialist Group’s most recent Working Meeting (see POLAR BEAR FAQ at http://pbsg.npolar.no/ ). Range and Numbers: There are currently 19 populations of polar bears in the Arctic, in Canada, Alaska (USA), Russia, Svalbard (Norway) and Greenland (Denmark). Thirteen of these populations occur either wholly or partially in Canada, ranging from the Ontario shores of Hudson Bay as far north as Ellesmere Island, Nunavut, and from northern Yukon in the west to Labrador in the east. Twelve of these populations occur at least partially in Nunavut, the Inuit autonomous region in northern Canada. Polar bears often travel huge distances in their annual cycle. Because population estimates are very expensive to obtain in the Arctic, census data are patchy for some polar bear populations. The current overall estimate is of 20-25,000 wild polar bears, with approximately 15,000 (about 2/3) occurring in Canada.” The current population is about 33% higher than it was in the 1970’s due to a moratorium on polar bear hunting in Canada, which some want relaxed. http://www.cbc.ca/canada/north/story/2007/04/25/arviat-bears.html [Photo: a successful hunt.]
These figures show polar bear populations (left) and Arctic temperature trends (right). They show that, in general, where temperature has decreased, so have polar bear populations, and where temperature has increased or remained constant, polar bear populations have increased or remained constant (for a guide to abbreviations, see next slide). Habitat encroachment and hunting are also factors in some regions, e.g. Hudson Bay and Nunavut (extreme northern Canada).
Trends in Canadian Polar Bear Populations. From IUCN 2006, Polar Bear Specialist Group Proceedings from the 2005 meeting. Much of the data in the IUCN Proceedings was provided by the Government of Nunavut, which participated fully in the production of the status report. Polar bears branched off from grizzly bears about 250,000 years ago, probably when a population of Arctic grizzlies was isolated by glacial barriers during an ice age (glacial period). Since then they have survived two interglacial warm periods, including the Holocene Climatic Optimum, around 8000 years ago, when temperatures in the Arctic were warmer than today for almost 3000 years, with the Arctic experiencing periods where it was largely ice free during summer months. The decline of populations in western Greenland (BB) and in Nunavut (northern Canada) is partially due to over-hunting. http://www.polarbearsinternational.org/bear-facts/
Polar bears, like their close cousins, grizzly bears, are opportunistic feeders, feeding on most anything edible: dead, alive, fresh, or rotten, like the whale carcass in this image. With the recent warming, grizzly bears are extending their range northwards into the traditional range of the polar bears. Polar bears will likely alter their dietary habits in response to changing climate conditions, as the grizzly bears are.
The plight of wildlife, whether real or imagined, often elicits an emotional response from the public. TV stars appear in commercials claiming that the polar bears are about to go extinct. The US has put them on the “watch list” under the Endangered Species Act, while Canada has designated them a species “of special concern” (both are two steps down from “endangered”, “threatened” being next). Again, polar bear numbers are up by about 1/3 over their numbers in the 1970’s due to a ban on hunting, which has since been relaxed. The Inuit of Nunavut claim that world governments are interfering with their stewardship. The Inuit have the right to hunt and conduct sport hunting expeditions in their territory, and some Inuit claim that polar bear numbers are on the rise in some areas. The US and the EU have banned the import of polar bear skins, effectively putting a ban on hunting. The Inuit have stated that they will take their quota of bears anyway, but in the meantime claim they are being robbed of an important source of revenue over a non-existent wildlife problem (they get about $50K per sport hunting party). Surveys of polar bears, especially in Russia, are incomplete or absent, so neither side has an accurate count, only an opinion. This is also the case for those who claim with certainty that the polar bears are disappearing. Should we trust the Inuit in their own land?
The seas will part, the Sun will stand still, and other myths.
Inaccuracies in Al Gore's An Inconvenient Truth 6. “The film threatens that global warming could stop the Gulf Stream, throwing Europe into an ice age: the Claimant’s evidence was that this was a scientific impossibility”. This figure shows the “global conveyor” or the thermohaline circulation of the Earth’s oceans driven by temperature and salinity differences in masses of ocean water. Red represents warm surface water while blue represents colder saltier water that sinks because it is more dense. The erroneous idea that the current warming could stop the Gulf Stream stems from the distortion of a single climate event that occurred about 13,000 years ago at the end of the last ice age. This event is referred to as the Younger Dryas. The Younger Dryas resulted from a rapid cooling with a return to glacial conditions in the higher latitudes of the Northern Hemisphere between 12,900 – 11,500 years before present (BP). One theory holds that the Younger Dryas was caused by a significant reduction or shutdown of the North Atlantic thermohaline circulation.
The shutdown may have been in response to a sudden influx of fresh water from Lake Agassiz, a huge ice-dammed lake in Canada near the present-day Great Lakes. According to the theory, the dam broke and a huge volume of fresh water flowed eastward through the Great Lakes and down the St. Lawrence River into the North Atlantic Ocean, diluting the Gulf Stream with fresh water, shutting down the thermohaline conveyor, and plunging the Northern Hemisphere briefly back into ice age-like conditions. There are many problems with this theory, including the fact that South America cooled first, and some evidence indicates the water flowed south down the Mississippi River (many oceanographers dismiss this as the reason for the disruption of the global conveyor). The sudden cooling event may instead have been caused by a small asteroid impact (the proposed Younger Dryas impact event) or a sudden overturning of mid or deep ocean layers. No matter what the cause, there is currently no process that could suddenly produce anywhere close to such a volume of fresh water and release it into the oceans. Any melting of ice in the Arctic or Antarctic would cause very slow changes in the thermohaline circulation.
Inaccuracies in Al Gore's An Inconvenient Truth 7. “The film blames global warming for species losses including coral reef bleaching. The Government could not find any evidence to support this claim”. Corals consist of a limestone structure filled with thousands of small animals called polyps. Algae called zooxanthellae live within each coral. In return for a safe sunny home, the zooxanthellae eat the nitrogen waste that the coral produces (nitrogen is very good for algal growth) and, like all plants, algae turn sunlight into sugars by the process of photosynthesis. The sugars produced by the zooxanthellae make up 98 per cent of the coral's food. Rising water temperatures block the photosynthetic reaction that converts carbon dioxide into sugar. This results in a build-up of products that poison the zooxanthellae. To save itself, the coral spits out the zooxanthellae and some of its own tissue, leaving the coral a bleached white. The bleached coral can recover, but only if cooler water temperatures return and the algae are able to grow again. Without the zooxanthellae, the coral slowly starves to death (click or scroll forward). Other causes of coral bleaching: apart from heat stress, causes of coral bleaching may include increased exposure to ultraviolet (UV) radiation [solar UV output has been increasing for decades]; large amounts of storm water from heavy rains flooding the reef; the exposure of coral to certain chemicals or diseases [pollution of the oceans]; sediments such as sand or dirt covering the coral [tropical storms can ravage reefs]; and excess nutrients such as ammonia and nitrate from fertilizers and household products entering the reef ecosystem. (The nutrients might increase the number of zooxanthellae in the coral, but it is possible that the nutrient overload increases the susceptibility of coral to diseases.) Often coral reefs are exposed to a combination of these factors.
http://www.science.org.au/nova/076/076key.htm Coral polyp showing its tiny zooxanthellae, seen as small brown dots. Source Kirsten Michalek-Wagner bleached staghorn coral
ScienceDaily (Jan. 1, 2009) — A team of scientists from the New York-based Wildlife Conservation Society (WCS) has reported a rapid recovery of coral reefs in areas of Indonesia, following the tsunami that devastated coastal regions throughout the Indian Ocean on December 26, 2004.
Man is helping too. A successful coral transplant site in Aceh, Indonesia, some four years after the tsunami. http://www.sciencedaily.com/releases/2008/12/081227225250.htm The WCS team, working in conjunction with the Australian Research Council Centre of Excellence for Coral Reef Studies (ARCCoERS) along with government, community and non-government partners, has documented high densities of “baby corals” in areas that were severely impacted by the tsunami. The team, which has surveyed the region’s coral reefs since the December 26, 2004 tsunami, looked at 60 sites along 800 kilometers (497 miles) of coastline in Aceh, Indonesia. The researchers attribute the recovery to natural colonization by resilient coral species, along with the reduction of destructive fishing practices by local communities.
Inaccuracies in Al Gore's An Inconvenient Truth 8. “The film suggests that sea levels could rise by 7 meters, causing the displacement of millions of people. In fact the evidence is that sea levels are expected to rise by about 40cm (16 inches) over the next hundred years and that there is no such threat of massive migration”. A 7 meter sea level rise (about 23 ft) can easily be calculated by assuming that all of the Greenland ice mass melts. This is an example problem in the textbook I use to teach an atmospheric physics course: if all the ice in Greenland were to melt, a 7 meter sea level rise would result. But the questions are: how long would that take, and how likely is it to happen? [Image: satellite view of southern and central Greenland.]
Greenland has gone through considerable fluctuations in temperature and ice coverage (mass) during the last 2000 years. There was apparently much less coastal ice than average during the Viking migrations, and there were also fewer of the icebergs that usually discouraged navigation of the North Atlantic. The Vikings called it “Greenland” for a reason: much like today, there were green margins along the southwest tip, where Viking settlers farmed and fished. The colonies lasted from 982 to about 1400 and then collapsed due to the onset of the Little Ice Age. Current temperatures are about what they were during the Viking era.
"The Norwegian farmer Folke Vilgerdson made the first attempt to settle in Iceland in about 865 AD... He lost his cattle in a severe winter and, disappointed, went back to Norway after having seen a fjord filled up by sea ice. Therefore he called the country Iceland. Only a few years later, in 874, Ingolf Arnason succeeded. He was followed by many others, and settlement was completed in 930 AD. In 982, Erik the Red discovered new land west of Iceland. He called it Greenland; according to the Greenlander Saga, this was only to persuade people to follow him... But the 18O curve (an isotope of oxygen that indicates past temperatures) suggests that the name described a reality. So the drastic climatic change [warming] late in the ninth century may be part of the reason why Iceland and Greenland did not get the opposite names." "The beneficent times came to an end. Sea ice and stormier seas made the passages between Norway, Iceland and Greenland more difficult after AD 1200. In mainland Europe, disastrous harvests were experienced in the latter part of the thirteenth and in the early fourteenth century." The cold decades of 1680-1700 are very well documented, at least in Europe. The glaciers in the Alps advanced, there was no good wine, harvests were a catastrophe, and famine killed like the black death centuries before. The decade of 1810-1820 was also quite cold, including "the summer that did not come", or a "year without summer". The Tambora volcanic eruption has been blamed for the summerless year 1816. Maybe it helped a little, but the cold spell had already begun with the spotless (no sunspots) year of 1810, with which Tambora had nothing to do. For more interesting details on how history has been affected by climate, go to http://www.tilmari.pp.fi/tilmari6.htm and follow the links within, especially http://personal.inet.fi/tiede/tilmari/sunspot5.html#some200
Temperatures in Greenland have varied significantly over the last two millennia. Borehole temperatures obtained during the Greenland Ice Core Project (GRIP) show that temperatures during the Medieval Warm Period (800-1200 AD) were higher than they are today. Borehole temperature profiles are considered to be direct measurements of past temperature, and more accurate than proxy indicators. In the early 20th century, Greenland was much warmer than today. It saw significant melting prior to 1940, then underwent 60 years of slow cooling, and recently (2003) began to warm again. Different regions of Greenland show similar temperature trends, but some show more warming or cooling than others.
Greenland has experienced a loss of lower-elevation ice over the last few decades, but as satellite observations show, ice mass increased in the interior of Greenland up until 2003, such that the net loss has been only a tiny fraction of 1% of the total ice mass. Fluctuations of this magnitude are common, not unprecedented, in the history of Greenland. [Figure: ice loss is shown in yellow-to-red colors, ice gains in green-to-blue.] Ice mass loss is often highest in spring to fall, as expected, but not always, as the data from 1998-2003 show.
Summary of changes in ice mass coverage for the interior of Greenland (> 1500 meters elevation) versus the coastal slopes and margins, from the European Space Agency press release “ERS altimeter survey shows growth of Greenland Ice Sheet interior”: the 1992-2003 changes show an increase in the interior of Greenland. [Figure legend: decreased thickness; increased thickness.]
In An Inconvenient Truth, Al Gore warns that if the Greenland Ice Sheet melts, the oceans will rise by 7 meters. This can be checked by calculation. Greenland contains 1.2 x 10^6 Gigatons of ice, so at a melting rate of 100 Gton/year (the 2003-2005 observation), it will take 12,000 years to melt it all, which would cause a sea level rise of 7 meters, or about 23 feet, assuming the melting rate stays at its current level and that all the ice melts (it never did during past extended warm periods). Note that ice mass increased over Greenland from 1992-2002, as shown in the previous summary. This was due to increased precipitation (snow) in the interior regions. [The fourth column is simply the difference between columns 2 and 3, indicating net change.]
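The timescale arithmetic above can be checked in a few lines. This is only a sketch: the 1.2 x 10^6 Gt ice mass and 100 Gt/yr melt rate are the slide's own figures, and the conversion of 360 Gt of meltwater per millimeter of global sea level is a standard oceanographic rule of thumb, not a number from the slide.

```python
# Sketch of the Greenland melt timescale quoted above (slide figures assumed).
ICE_MASS_GT = 1.2e6         # total Greenland ice mass, gigatons (slide figure)
MELT_RATE_GT_PER_YR = 100   # 2003-2005 observed melt rate, Gt/yr (slide figure)
GT_PER_MM_SEA_LEVEL = 360   # ~360 Gt of meltwater raises global sea level 1 mm

years_to_melt = ICE_MASS_GT / MELT_RATE_GT_PER_YR
rise_rate_mm_per_yr = MELT_RATE_GT_PER_YR / GT_PER_MM_SEA_LEVEL

print(f"time to melt all ice: {years_to_melt:.0f} years")        # 12000 years
print(f"sea level rise rate:  {rise_rate_mm_per_yr:.2f} mm/yr")  # 0.28 mm/yr
```

At the slide's assumed melt rate, the contribution to sea level is roughly a quarter of a millimeter per year, which is why the 7-meter scenario plays out over millennia rather than decades.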
While Arctic sea ice extent has been declining in recent decades, Antarctic sea ice extent has been increasing, reaching a record extent in 2007. The increased ice mass in the Antarctic is due to slightly warmer sea temperatures, which have led to increased surface evaporation and subsequent increases in precipitation. This effect is expected under mild global warming. The IPCC, in their 2007 report, states: “Antarctic sea ice extent continues to show inter-annual variability and localized changes but no statistically significant average trends, consistent with the lack of warming reflected in atmospheric temperatures averaged across the region”. Between the Antarctic winter and summer, sea ice varies by about 13 million sq. km, out of a total of about 15 million sq. km, or by about 87%, every year!
The lowest temperature ever recorded in nature on Earth was −89.2°C (−128.6°F), on July 21, 1983 at Vostok Station. The mean annual temperature of the interior is −57°C (−70°F). Monthly means at McMurdo Station range from −28°C (−18.4°F) in August to −3°C (26.6°F) in January (Antarctic summer). At the South Pole, a high of −14°C (7°F) has been recorded. Along the Antarctic Peninsula, temperatures as high as 15°C (59°F) have been recorded, though the summer temperature usually is around 2°C (35.6°F). A small region of the Antarctic Peninsula has warmed by about 2°C over the last 25 years. The trends elsewhere show both warming and cooling, but are smaller and dependent on season and the time span over which the trend is computed. Climate models predict that future trends in Antarctica will be much smaller than in the Arctic. The entire region has warmed on average by about 1°C over the last century. As one climatologist put it, “the Antarctic has gone from butt-freezing cold to one degree above butt-freezing cold”. This small increase has led to increased evaporation over the ocean, with a subsequent increase in precipitation and sea ice extent. [Map labels: Vostok Station; McMurdo Station.]
Researchers have calculated the ice mass changes for the two major ice sheets across Antarctica -- the Western Antarctic Ice Sheet (WAIS) and the larger Eastern Antarctic Ice Sheet (EAIS) -- which together cover the vast majority of the continent. Measurements show that there is no trend in the EAIS (which is about 3 times as large as the WAIS) and that virtually all of the mass loss is coming from the WAIS. The figure on the right shows the ice mass variations over the West Antarctic Ice Sheet (red) and the East Antarctic Ice Sheet (green). To quote the IPCC, from page 17 of their Summary for Policy Makers (IPCC-SPM2feb07): “Current global model studies project that the Antarctic ice sheet will remain too cold for widespread surface melting and is expected to gain in mass due to increased snowfall.” It should be emphasized that the Earth has been in a warm interglacial period for over 16,000 years, and it’s not over yet. Antarctica contains 90% of the world’s ice!
In Fall 2007 it was widely reported that Arctic sea ice had retreated to a record low coverage and thickness (satellite observations of the Arctic only date back to 1979). During the winter of 2008, the Arctic experienced an early and extended period of cold that led to an enormous recovery of sea ice extent and thickness, contributing to the overall global increase in sea ice coverage, most of which has occurred in the Antarctic. Regardless of the drop in 2007 and the recovery in 2008, this graph shows a regular yearly change in ice cover of about 10 million square kilometers that reaches a minimum every August at the end of the Arctic summer, as it has for millennia. [Figure annotations: the big drop (2007); the big recovery (2008).]
Has the Arctic Ocean always had ice in summer? We know for sure that at least in the distant past, the Arctic was ice-free. Fossils from the age of the dinosaurs, 65 million years ago, indicate a temperate climate with ferns and other lush vegetation. Based on the paleoclimate record from ice and ocean cores, the last warm period in the Arctic peaked about 8,000 years ago, during the so-called Holocene Thermal Maximum. A recent study suggests that 5,500 years ago, the Arctic had substantially less summertime sea ice than today. However, it is not clear that the Arctic was completely free of summertime sea ice during this time. The last time that scientists can say confidently that the Arctic was free of summertime ice was 125,000 years ago, during the height of the last major interglacial period, known as the Eemian. Temperatures in the Arctic were warmer than now, and sea level was also 4 to 6 meters (13 to 20 feet) higher than it is today because the Greenland and Antarctic Ice Sheets had partly melted. http://nsidc.org/arcticseaicenews/faq.html Polar bears evolved about 100,000 years prior to the Eemian interglacial and survived an Arctic that was ice-free during the summer. The Arctic always has ice during winter, and has ever since the cycle of glacial-interglacial periods began.
Update on recent ice coverage. Top: total global sea ice coverage, dominated by Antarctica. Blue curve shows daily total sea ice coverage. Lower red curve is the global sea ice “anomaly”, the variation with respect to an average value calculated for the period 1979-present. Left: Current extent of Arctic Sea Ice.
Glaciers Have Been Retreating Far Longer than We Have Emitted CO 2 . Glaciers have been in retreat since long before the mid-20th century, when CO 2 levels began to rise significantly. Glaciers reached a peak during the Mini-Ice Age and began retreating when solar activity began to increase again in the 1800’s. In the Alps during the Mini-Ice Age, priests were called upon to pray in front of advancing glaciers that threatened to crush alpine villages. In some cases, glaciers are retreating to the positions they held before the Mini-Ice Age, during the Medieval Warm Period. Today there are some places where glaciers are advancing (e.g. Scandinavia). Glaciers advance and retreat according to regional precipitation patterns, in addition to temperature.
Above is the record of the advance and retreat of the Grosser Aletsch glacier in the Swiss Alps. Grove and Switsur write: “Dating of organic material closely associated with moraines in many montane regions has reached the point where it is possible to survey available information concerning the timing of the medieval warm period. The results suggest that it was a global event occurring between about 900 and 1250 AD, possibly interrupted by a minor re-advance of ice between about 1050 and 1150 AD.” [Figure label: lower part of the glacier.]
With most of the significant ice melt occurring in Greenland and the land masses of the Northern Hemisphere, not Antarctica, what does the IPCC predict? [Figure: TOPEX/Poseidon satellite altimetry shows sea level rising at 3.1 mm per year = 31 cm per century = 12.2 inches per century. Annotated uncertainties on the projected contributions range from 22% to 300% (300%, 167%, 30%, 22%).]
Since about 16,000 years ago, sea level has risen by about 400 feet (120 meters) due to the end of the last Ice Age. Sea level rise up to about 8,000 years ago was mainly due to massive amounts of ice melt. It has risen by about 7.8 inches (20 cm) over the last 130 years, mainly due to the expansion of warming seas and continuing ice melt caused by recent warming. The melting of floating sea ice, like the Arctic ice cap, has almost no effect on sea level rise (see Archimedes principle, next slide). Ice piled up on land (Greenland, Antarctica, Canada, Siberia) can raise sea levels if it melts. The melting of land-based ice causes nearly half of the present sea level rise. The Earth is currently in a warm interglacial period; there is more ice to melt, and sea levels will continue to rise.
U.N. issues landmark report on global warming: panel offers dire warnings, establishes scientific baseline for political talks. Updated 5:59 p.m. PT, Sat., Nov. 17, 2007. VALENCIA, Spain - Global warming is “unequivocal” and carbon dioxide already in the atmosphere commits the world to sea levels rising an average of up to 4.6 feet, the world’s top climate experts warned Saturday in their most authoritative report to date. “Only urgent, global action will do,” said U.N. Secretary-General Ban Ki-moon, calling on the United States and China — the world’s two biggest polluters — to do more to slow global climate change. “I look forward to seeing the U.S. and China playing a more constructive role,” Ban told reporters. “Both countries can lead in their own way.” But how long will it take for a rise of 4.6 feet? Answer: about 400 years! And this is according to extreme estimates of sea level rise. Future sea level rise has recently been downgraded to between 2 and 2.5 mm per year, approximately equal to the rate over the last two thousand years or more (click to see previous slide).
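The "about 400 years" answer can be reproduced with simple division. The 3.5 mm/yr rate below is an assumption chosen to match the slide's answer (not a number quoted from the IPCC report); the 2 mm/yr case uses the slide's downgraded estimate.

```python
# How long until sea level rises 4.6 feet, at two assumed rates?
MM_PER_FOOT = 12 * 25.4             # 304.8 mm per foot

rise_mm = 4.6 * MM_PER_FOOT         # 4.6 ft is about 1402 mm
years_extreme = rise_mm / 3.5       # assumed "extreme" rate: ~400 years
years_downgraded = rise_mm / 2.0    # at the downgraded 2 mm/yr: ~700 years

print(round(years_extreme), round(years_downgraded))
```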
In a paper titled "The Melting of Floating Ice will Raise the Ocean Level," submitted to Geophysical Journal International, Noerdlinger demonstrates that if all the world’s floating sea ice melted, sea levels would rise by about 4 cm (1.57 inches), an insignificant rise. A common alarmist claim is that melting of the Arctic ice cap and other floating ice will raise sea levels. The ancient Greek Archimedes proved this incorrect for fresh water: when fresh-water ice melts while floating in fresh water, there is no change in the water level. Try it! Put an ice cube in a glass of water, mark the level, then wait for the ice cube to melt. This well-known physical phenomenon is often used by skeptics to counter arguments about melting ice and rising sea levels. However, Arctic ice is fresh-water ice floating in salt water (when sea water freezes, the salt is rejected, leaving the ice fresh). When the fresh-water ice melts, the meltwater occupies a volume slightly larger than the one the ice displaced while frozen, raising the level slightly. (Figure: fresh-water ice floating in salt water; the same ice melted in salt water.)
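The Archimedes argument on this slide can be sketched numerically. The densities below are typical textbook values I am assuming, not numbers from Noerdlinger's paper; the point is only that the net volume gain is small but nonzero.

```python
# Floating fresh-water ice in seawater: Archimedes' principle says the ice
# displaces seawater equal to its own weight, so displaced volume is
# mass / rho_sea. When it melts, the meltwater is fresh: mass / rho_fresh.
RHO_FRESH = 1000.0   # kg/m^3, meltwater (assumed typical value)
RHO_SEA = 1026.0     # kg/m^3, seawater (assumed typical value)

def net_volume_gain_m3(ice_mass_kg):
    """Extra water volume added when floating fresh ice melts in seawater."""
    return ice_mass_kg / RHO_FRESH - ice_mass_kg / RHO_SEA

# Per tonne of floating ice, only about 2.5% of its meltwater volume is
# "new" volume, which is why the global total is a few centimeters at most.
gain_fraction = net_volume_gain_m3(1000.0) / (1000.0 / RHO_FRESH)
```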
Islanders are Fleeing Because of Rising Sea Levels?
(Not really. See Venice, Holland, and New Orleans.)
Inaccuracies in Al Gore's An Inconvenient Truth 9. “The film claims that rising sea levels has caused the evacuation of certain Pacific islands to New Zealand. The Government is unable to substantiate this and the Court observed that this appears to be a false claim”. While the global average sea level rise is about 2-3 millimeters per year (a little over one inch per decade), sea level varies both regionally and seasonally. Warmer water, in comparison with cold water, expands, which results in higher water levels. Relatively cooler water contracts and is lower in height. (Figure captions: spatial trend map over 1950–2003 of reconstructed sea level, units mm/yr; spatial trend map over 1993–2003 of reconstructed sea level.)
Recent sea surface heights for late 2008 made by the satellite Jason, the successor to TOPEX/Poseidon, showing sea level depression in the central Pacific due to a persistent La Nina pattern. Tuvalu is one of the islands that has been highlighted by global warming alarmists. Says Auckland University climate scientist Chris de Freitas: "I can assure Mr. Gore that no one from the South Pacific islands has fled to New Zealand because of rising seas." (Map label: Tuvalu)
https://answers.google.com/answers/threadview?id=23863 For interesting comments on this subject, follow the links below. Some of the highlights are: “The loss of island surface along the coastlines of Tuvalu is, according to scientific observation results, due to natural erosion by the surf and the effects of storms, not to a rising Pacific sea level.” “…population pressures are aiding the political drive to move people to Australia and New Zealand.” “Tuvalu is, in summary, overpopulated and has to face a number of problems: …over fishing of lagoon & marine resources; sand mining; urbanization - limited fresh water supply; waste disposal; bio-diversity depletion due to improper use of modern technology.” Overall, sea levels are continuing to rise, and people who live near coasts or on coral atolls only a few feet above sea level will have to adapt as they have in the past (consider Holland and Venice, Italy). Many ancient Roman and Greek ruins of coastal towns are now under more than 10 feet of water. Also, some islands and land masses are sinking (subsiding) into the Earth’s crust while others are growing due to active volcanism (e.g. Hawaii, Krakatau, etc.) and tectonic uplifting. http://rwdb.blogspot.com/2006/09/andrew-bolt-gets-lambert-treatment.html
Conveniently Stretched and Distorted the Facts and Known Climate Science
Facts always get in the way of a scary science fiction story!
(e.g. see The Day After Tomorrow )
Excerpts from the Second Witness Statement of Professor Robert M. Carter, witness for the claimant (expert in the fields of geology, palaeoceanography, and quaternary science). “The IPCC claims to represent the consensus of the leading scientists in climate and related fields throughout the world… It is important to point out that the IPCC is itself an arm of the UN and therefore part of a political organization. Its panel members are selected by politicians and it has an agenda of its own. Several polls of professional opinion indicate that widespread scientific skepticism exists with the IPCC orthodoxy. As I have said in my first witness statement, the published science that the IPCC draws upon to make its recommendations is mostly authoritative and widely accepted but, at the same time, many distinguished and well qualified scientists, including myself, disagree with some of its findings. As I seek to illustrate, major parts of the film AIT are at odds with many of the findings of the IPCC, as they are also with the considered views of many independent expert scientists”. http://www.newparty.co.uk/articles/inaccuracies-gore.html http://www.newparty.co.uk/UserFiles/File/carterstatement.pdf
Biography of Professor Robert (Bob) M. Carter Bob Carter is a Research Professor at James Cook University (Queensland) and the University of Adelaide (South Australia). He is a palaeontologist, stratigrapher, marine geologist and environmental scientist with more than thirty years professional experience, and holds degrees from the University of Otago (New Zealand) and the University of Cambridge (England). He has held tenured academic staff positions at the University of Otago (Dunedin) and James Cook University (Townsville), where he was Professor and Head of School of Earth Sciences between 1981 and 1999. Bob has wide experience in management and research administration, including service as Chair of the Earth Sciences Discipline Panel of the Australian Research Council, Chair of the national Marine Science and Technologies Committee, Director of the Australian Office of the Ocean Drilling Program, and Co-Chief Scientist on ODP Leg 181 (Southwest Pacific Gateways). Bob Carter contributes regular media (and other) comment and opinion on scientific issues which relate to his areas of knowledge. He also offers lecture or workshop presentations by arrangement. His public commentaries draw on his knowledge of the scientific literature and a personal publication record of more than 100 papers in international science journals on topics which include taxonomic palaeontology, palaeoecology, the growth and form of the molluscan shell, New Zealand and Pacific geology, stratigraphic classification, sequence stratigraphy, sedimentology, the Great Barrier Reef, Quaternary geology, and sea-level and climate change. Bob Carter's current research on climate change, sea-level change and stratigraphy is based on field studies of Cenozoic sediments (last 65 million years) from the Southwest Pacific region, especially the Great Barrier Reef and New Zealand, and includes the analysis of marine sediment cores collected during ODP Leg 181. 
Bob's research has been supported by grants from competitive public research agencies, especially the Australian Research Council (ARC). He receives no research funding from special interest organizations such as environmental groups, energy companies or government departments. Bob strives to provide critical and dispassionate analysis based upon scientific principles, demonstrated facts and a knowledge of the scientific literature. Email: [email_address]
As mentioned in the introduction to this section, the inaccuracies discussed are just 9 out of 20 inaccuracies covered by witness statements in the court case addressing the “political propaganda” found in “ An Inconvenient Truth ”.
The Nobel Committee’s choice of the Peace Prize winner or winners is the decision of the Nobel Committee alone and does not involve international input other than the submission of nominations. The awarding of the Peace Prize is often intended to make a political statement as opposed to legitimately rewarding accomplishments. (Photo captions: “Turned down for Peace Prize”; “Head of the PLO terrorist organization”; “Global Warming Propagandist”.)
The history of climate change: turning down the hype and the heat! But wait! Al has turned it up again.
The self-appointed scientist Al Gore, in a recent interview with Newsweek magazine, stated that CO 2 was not responsible for the majority of global temperature increase. This comes after the revelation that the global temperature has been constant or in slow decline for the last 10 years while CO 2 has increased by about 4-5% (read about this later in the presentation). Gore claims CO 2 is now only responsible for about 43% of the warming, but this is according to some possible model scenarios by Shindell et al. (2009), which are only suggestions and have yet to be corroborated by any studies or data. Shindell, D.T., et al., Science 326, 716-718 (2009). http://www.tonightshowwithconanobrien.com/video/clips/al-gore-pt4-111209/1175411/ Gore appeared on The Tonight Show with Conan O'Brien on Nov. 12, 2009 to discuss his new book about climate change. While promoting geothermal power, Gore educated Conan and the audience about the Earth’s mantle, claiming the temperature of the Earth’s core is “several million degrees” and “the crust of the earth is hot”. Scientist Gore doesn’t have a clue about science. The Earth’s temperature “2 kilometers or so down” is between 500°C and 900°C, and the core has a temperature of about 5700°C. Ask any high school or freshman geology student!
So, the exaggerations in An Inconvenient Truth have been revealed for what they are, and are easily refuted by legitimate science. Again, quoting the trial witness: “Nowhere does Mr. Gore tell his audience or readers that all of the phenomena described in AIT fall within the natural range of previous environmental change on our planet” (which will be shown shortly). The idea that the “debate is over”, and that the consensus is in, is wishful thinking by alarmists who wish to avoid any real scientific debate. But the Earth is warming! So, what is causing the current warming, how much can we expect, and will it continue? Al Gore recently claimed that scientists have gone over the data “chapter and verse” and agree with his conclusions, but it is Mr. Gore who preaches climate Armageddon and imminent destruction like someone quoting chapter and verse from the Book of Revelation.
The Earth’s current temperature is higher than it was a century ago, but it’s been warmer and colder in the past. The warming that began in the 1800s followed a 450-year Little Ice Age. It can be shown that the modern instrumental measurement of regional temperature is highly flawed and has been biased by the urban heat island effect due to the poor placement of temperature measuring stations. This calls into question the actual size of the global temperature increase.
Carbon dioxide (and other greenhouse gases) have increased due to natural forces and man’s activities. Are the increases in temperature and anthropogenic emissions over the last century connected? Do we have an accurate understanding of the sensitivity of the Earth’s atmosphere to increased levels of carbon dioxide? Are there other factors?
The Sun is the only significant source of energy in the Earth’s atmosphere and has always been a major factor behind climate change. Recent data shows that we have underestimated its effect and that there are solar factors we haven’t considered before whose influence is evident in the geologic record and the modern instrumental and satellite record.
The Global Temperature the Last 5 Million Years The sediment-core record shows that the Earth transitioned into an enhanced cycle of glacial/interglacial periods beginning about 4 million years ago. Paleoclimatologists believe this was in response to the closing of the Isthmus of Panama by volcanic activity. Prior to the closing, the Pacific and Atlantic Oceans were in communication (via the Caribbean), with water mixing between the two bodies. After the closing, the Pacific and Atlantic became isolated from each other, which affected global circulation patterns. On average, Antarctic temperatures declined and variability increased. The above record was reconstructed from oxygen-18 isotope abundances in ocean sediment cores, a “proxy” for global temperature and ice volume. “Proxies” are physical, measurable characteristics that are indicators of past temperature and climate. Lisiecki, L. E., and M. E. Raymo (2005), Paleoceanography.
The Global Temperature the Last 160,000 Years This graph shows the two most recent interglacial periods, the Eemian interglacial and the present one starting about 16,000 years ago. Polar bears evolved as a separate species about 100,000 years before the Eemian interglacial and have survived both of these recent warmings (the current one will continue for several millennia). This temperature history has also been reconstructed from proxies, which include isotopes (e.g. oxygen-18, carbon-14, beryllium-10), sea and lake sediments, stalactites and stalagmites, and ice cores. These can be used to infer warm versus cold climates, wet versus dry climates, and solar activity. [B.P. = “before present”] Nature, vol. 329, pp. 403-408, 1987. (Chart label: present.)
The Eemian, once called the Eemian Interglacial period, began about 130,000 years ago. Changes in the Earth’s orbital parameters (greater obliquity and eccentricity, and the timing of perihelion), known as the Milankovitch cycles, probably led to greater seasonal temperature variations in the Northern Hemisphere, although global annual mean temperatures were probably similar to those of the Holocene (the current period). The Eemian climate is believed to have been about as stable as, but probably warmer than, that of the Holocene. The warmest peak of the Eemian was around 125,000 years ago, when forests reached as far north as North Cape (which is now tundra) in northern Norway, well above the Arctic Circle at 71°10′21″N 25°47′40″E / 71.1725°N 25.79444°E. Hardwood trees like hazel and oak grew as far north as Oulu, Finland. Sea levels at that time were 4-6 meters higher than they are now, indicating greater deglaciation than today (mostly from partial melting of the ice sheets of Greenland and Antarctica). One study published in July 2007 found evidence that Greenland could have contributed at most 2 m (6.6 ft) to sea level rise. Scandinavia was an island due to the inundation of vast areas of northern Europe and the West Siberian Plain.
The Global Temperature the Last 12,000 Years Climate changes are generally global in nature, but the Earth does not respond evenly. There is more land mass in the Northern Hemisphere than in the south, and the Pacific Ocean covers half of the Earth’s surface. The thick black line is a global average. ← present
It’s warmer now than 150 years ago, which is great. 150 years ago, the Little Ice Age was finally coming to an end! But how much warmer?
The Local and Global Temperature the Last 130 Years When proxies like tree rings and stalactites, borehole temperature measurements, and the recent instrumental record are compared, a fairly consistent temperature record is obtained for the last century. The records above show warming from about the mid-1800s to around 1940, followed by 35 years of cooling until about 1975 (carbon dioxide levels were increasing during this period), followed by more recent warming. However, there are still regional differences (click and note the relative stability of temperatures in Alaska for over 30 years). The recent warming has been greatest between 40°N and 70°N latitude, though some areas such as the North Atlantic Ocean have cooled in recent decades. (Chart legend: warming; cooling; warming; average warming rate; 58.3°F.)
The climate is constantly changing due to solar activity, the movements of continents and changes in the oceans, changes in the Earth’s orbit, low and high periods of volcanic activity, glacial-interglacial cycles, etc. Is the temperature of a million years ago, a thousand years ago, a century ago, or a century from now representative of the average? Is the past or current temperature the optimum temperature for the Earth? The recent warming, typically described in alarming terms (“it’s warmer than it’s been in 150 years”), is judged against the deep cooling of the Little Ice Age, which reached a minimum between about 1645 and 1715 (the Maunder Minimum). Many climatologists believe that the current warming is a natural climate recovery from this historic cold period. Nature, vol. 329, pp. 403-408, 1987. (Chart labels: present; reversed time scale.)
The Pacific Decadal Oscillation The Pacific Decadal Oscillation (PDO) is the strongest of the world’s ocean oscillation patterns, to which the others respond (the Pacific Ocean covers half of the planet). The PDO was in its positive phase during the latter half of the 20th century, resulting in mild warming of the oceans and the atmosphere. It appears about to shift into a negative phase if it follows past behavior. The pattern oscillates with a period of about 40 years, as do patterns of Atlantic hurricanes.
Other Oscillations The Arctic Oscillation (AO) and the related North Atlantic Oscillation (NAO) refer to opposing atmospheric pressure patterns in northern middle and high latitudes. The patterns oscillate with a period of about 40 years and affect precipitation and temperature patterns from the Arctic to the Mediterranean. Since the 1970s, these oscillations have been in the positive phase, causing lower than normal arctic air pressure and higher than normal temperatures in much of the United States and Europe. It also results in reduced temperatures and precipitation in western Greenland, which is partially responsible for the retreat of glaciers in this area. This phase appears to have run its course, and a negative phase is anticipated. (Figure captions: Effects of the Positive Phase of the Arctic Oscillation | Effects of the Negative Phase of the Arctic Oscillation; figures courtesy of J. Wallace, University of Washington.)
The El Nino Southern Oscillation (ENSO) shows periodic, multidecadal variability similar to the Pacific Decadal Oscillation (PDO). Hurricane forecasters now know that if they wish to make predictions of Atlantic hurricanes with any reliability, they must pay attention to what happens in the Pacific. Analysis suggests strongly that U.S. Atlantic hurricane damages are modulated by the phase of the ENSO, with increased losses during La Niña events and reduced losses during El Niño events.
In La Nina years, cooler than normal water (blue in color) pools in the Eastern Pacific near the equator. In El Nino years, warmer than normal water (red) is present. http://www.aoml.noaa.gov/hrd/Landsea/lanina/index.html
The Pacific Decadal Oscillation If we compare the PDO and solar activity, it is clear that solar activity is driving the PDO.
One obvious feature in the previous temperature reconstructions is the frequent variability of the Earth’s climate on the scale of decades to millions of years.
Richard Alley, a glaciologist at Penn State University, has appeared on several television science programs, is an author of IPCC reports, and appeared before Congress to present the IPCC’s Fourth Assessment Report Summary for Policy Makers. In television interviews he emphasizes that the climate is, by nature, highly variable. He notes that the temperature can undergo large swings on the scale of a decade or less at times. Sometimes these swings can be attributed to volcanic eruptions, asteroid impacts, or sudden changes in the Sun’s activity; on other occasions, the reasons are not clear. These changes can persist for varying amounts of time, but generally stay within the known range of temperature variations for glacial and interglacial periods.
In one National Geographic television program, Alley stated that what is unusual about the current climate is how relatively stable it has been over the last 10,000 years or so, compared to past climate fluctuations , in spite of the Holocene Optimum, the Younger Dryas, the Medieval Warming, the Little Ice Age, and the current warming. “Warm” interglacial periods are punctuated by sudden cooling events, and cold glacial periods exhibit sudden warmings.
There’s a problem with how and where we measure temperatures.
1871 War Department weather map showing the sparseness of US meteorological stations. There are only a few stations west of the Mississippi River, and none west of the Rockies. The short, 150-year instrumental record was extremely limited in spatial extent until recently. The US and Europe have the most extensive instrumental records. Much of the rest of the world is poorly represented or has reliable records dating back only several decades.
Except for the US, Europe, and Japan, most of the countries which supply temperature data for the Global Historical Climate Network (GHCN) have been measuring temperatures for only 70 years or less. The southern hemisphere is poorly represented compared with the northern hemisphere. Many of the current temperature measurement stations are contaminated by the urban heat island effect, as will be shown. US temperature measurement stations account for about 50% of measurements globally.
Temperatures the Last 10 Years Contrary to what Al Gore and the media report, there has been hardly any change in Northern Hemisphere or global temperatures in 10 years, while CO 2 has continued to rise. This is obvious in satellite (Microwave Sounding Unit, U. of Alabama, Huntsville) and combined ground and sea surface measurements (Hadley Centre, UK). The Sun’s activity shows similar behavior over this period. The “R-squared” value of only 0.07 (right) statistically indicates that temperature and CO 2 levels are essentially uncorrelated over the 10-year period considered. Past temperatures have been reconstructed with some uncertainties. It might be assumed that recent measurements with thermometers and modern electronic sensors are more accurate. But how accurate? (Chart labels: CO 2; temperature; R-squared = 0.07, uncorrelated.)
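An R-squared value like the one quoted is simply the squared Pearson correlation between the two series. The numbers below are made-up stand-ins for a decade of rising CO 2 against flat, noisy temperature anomalies, not the actual HadCRUT or MSU data:

```python
import numpy as np

# Hypothetical data: steadily rising CO2 (ppm) versus a flat, noisy
# temperature anomaly series (degrees C) over ten years.
co2 = np.array([369, 371, 373, 376, 378, 380, 382, 384, 386, 388], dtype=float)
temp = np.array([0.40, 0.35, 0.45, 0.43, 0.44, 0.41, 0.38, 0.42, 0.39, 0.43])

r = np.corrcoef(co2, temp)[0, 1]  # Pearson correlation coefficient
r_squared = r ** 2                # near zero: the trend explains little variance
```

With a flat temperature series, r_squared comes out close to zero, which is the sense in which a low R-squared indicates "uncorrelated" on the slide.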
Temperatures are often measured near concentrations of human activity, e.g. cities and suburban areas, airports, commercial zones, fire stations, etc. Concrete and asphalt, prevalent in these areas, retain heat deposited during the day, while homes, businesses, and traffic heat the air in these regions. Rural and urban temperature trends differ sharply, so rural temperature data probably best represents actual regional temperature changes. But even the rural temperature measurements are a problem, as will be shown.
No serious scientist doubts that the Earth has warmed over the last 150 years. Growing seasons are longer, some plants and animals have changed their habitation range in response to warming, instrumental records show an increase in global temperature, etc. But how accurate is our measurement of the temperature change? Anthony Watts (of the surfacestations.org project) discovered that there are large instrumental errors involved with the US measurement stations.
This article from the Pittsburgh Tribune-Review introduces the subject nicely. Helping along global warming By Bill Steigerwald Sunday, June 17, 2007 In January the National Oceanic & Atmospheric Administration (NOAA) and its good friends in the media trumpeted that 2006 was the warmest year on record for the contiguous United States. NOAA based that finding - which allegedly capped a nine-year warming streak "unprecedented in the historical record" - on the daily temperature data that its National Climatic Data Center gathers from about 1,221 mostly rural weather observation stations around the country. Few people have ever seen or even heard of these small, simple-but-reliable weather stations, which quietly make up what NOAA calls its United States Historical Climatology Network (USHCN). But the stations play an important role in detecting and analyzing regional and global climate change. More ominously, they provide the official baseline historical temperature data that politically motivated global-warming alarmists like James Hansen of NASA plug into their climate models to predict various apocalypses. So, let's look at the network!
The National Climatic Data Center (NCDC) runs a network of 1221 Maximum Minimum Temperature Stations (MMTS) overseen by local National Weather Service offices. These comprise the United States Historical Climatology Network (USHCN). Critical to the accuracy and reliability of the network data is the placement of the MMTS units. Sites are to be well away from asphalt, concrete, buildings, and sources of moisture (e.g. lakes, water treatment plants). They should also be placed over grass rather than gravel or rock. A good quality rural site and placement of an MMTS.
This MMTS placement represents a worst case scenario. This unit, near a firehouse, is right next to an asphalt parking lot, a concrete walkway, and a building with air conditioning exhaust fans. Being an emergency facility, it is also next to a cell phone transmitter tower. Data from the MMTS units are often transmitted by wireless or cable, so radio interference can be a serious problem, especially with older equipment in some of the units, but the heat radiated by the concrete, asphalt, and air conditioners is the main problem. An extremely poor quality “rural” site and placement.
http://wattsupwiththat.wordpress.com/2008/01/10/how-not-to-measure-temperature-part-46-renos-ushcn-station/ Reno’s USHCN station is particularly important due to it being part of the test cases of urban stations in the new USHCN2 scheme being implemented by NCDC. Reno’s steep temperature trend appears to be more of an urban heat island issue than a climate change issue. It shows up clearly as a hot spot in USHCN contours done by Steve McIntyre. As the above graph shows, the monitored temperature in Reno dropped when the ASOS sensor was temporarily moved to the south end of the runway in 1996 and 1997, away from the heat bubble at the north end (upper left figure), going back up when moved back to its original position.
An example of a long term, good quality, rural measuring site is that of Cedarville in extreme northeastern California. As towns go, it has changed very little in 100 years. There is no Interstate highway nearby, and it sits essentially in the middle of nowhere by itself, a self-contained agrarian community. The long term temperature trend in Cedarville stands in stark contrast to those from areas that have experienced urban development, like Reno. (Photos: 1900, 1940, 1970, 2000.)
At the time of the publishing of the initial study (you can find this and updated results at http://www.surfacestations.org/ ), Watts and collaborators had reviewed 27% of the network MMTS units. The results show that 16% had an error of more than 5°C, 51% an error of more than 2°C, and 16% an error of 1°C or more. This shows that at least 67%, or two thirds, of the temperature measurements from these rural stations are significantly flawed. (There are additional problems with MMTS units: visit the slideshow at the link above.) More stations have been studied, and the percentage of extremely biased sites has increased.
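Reading the slide's percentages as non-overlapping error bands (my assumption; the original wording is ambiguous), the "at least 67%" figure is just the sum of the two worst categories:

```python
# Assumed non-overlapping error bands from the Watts survey as quoted:
pct_error_over_5C = 16   # stations with error greater than 5 degrees C
pct_error_2_to_5C = 51   # stations with error between 2 and 5 degrees C
pct_error_1_to_2C = 16   # stations with error between 1 and 2 degrees C

# Stations with error over 2 degrees C, the slide's "significantly flawed":
pct_significantly_flawed = pct_error_over_5C + pct_error_2_to_5C
print(pct_significantly_flawed)  # 67, i.e. roughly two thirds
```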
Global Temperature Measurements The media recently reported that 2007 was the second warmest year on record, after the 1998 El Nino peak. NASA recently had to admit that there was an error in the calculated US average temperature, due to a mixing of old and recent data sets and a statistical attempt to remove the heat island effect, that artificially inflated the US temperature by 0.15°C after 2000. This may not sound like much, but it is about 25% of the warming that occurred during the 20th century. This error comes from significant inconsistencies in the National Oceanic & Atmospheric Administration (NOAA) network of “rural” temperature monitoring stations that have been affected by poor placement, poor maintenance of sensor units, and urban growth around some previously rural sites. The US has the most extensive and best maintained temperature network in the world, especially in comparison to developing nations. So how reliable is the global data base? (Chart labels: Temperature Anomaly (°C); CO 2 (ppm); 1998; CO 2; temperature.)
James Hansen, a climate change alarmist, acknowledged the 0.15°C error but claimed that it was minor and only applied to the US, which covers only 2% of the Earth’s surface. However, US temperature measurements account for 50% of global measurements (see previous slide of global coverage), with most measurements outside the US occurring in urban areas. This error has since been removed (click or scroll to see), but evidently there were even bigger errors, and the new record shows a very different result (next slide).
In August 2007, the following was published by Steve McIntyre, using recently reevaluated NASA temperature data for the US. McIntyre posted this data from NASA's newly published data set from the Goddard Institute for Space Studies (GISS). These numbers represent deviation from the mean temperature calculated from temperature measurement stations throughout the US. http://data.giss.nasa.gov/gistemp/graphs/Fig.D.txt According to the new data published by NASA, 1998 is no longer the hottest year on record; 1934 is (during the “dust bowl” drought of the Great Depression). NOAA said 2007 would be the 5th hottest year on record. Maybe, maybe not! Al Gore has been flying around the globe in his private jet telling anyone who will listen that ten of the hottest years in history came in the last eleven years (and that “the earth has a fever”). The temperature network “has a fever” and a bunch of unreliable thermometers. NOAA has been critical of NASA’s reassessment, though its own assessment shows a similar trend. (Table captions: New top 10 GISS temperature deviations, 8/7/2007; the old order of top 10 GISS temperature deviations.)
NOAA temperature records showing that 1934 was the warmest year on record in the US. This was during the extended drought of the “dust bowl” era of the Great Depression (click). US temperature measurements account for 50% of global temperature measurements, representing the most extensive and best maintained network of stations, and even this network is highly flawed.
Animated (brief delay) composite of US decadal high temperature records. Especially note the 1930’s in comparison with all other decades.
Quantifying the influence of anthropogenic surface processes and inhomogeneities on gridded global climate data. Ross R. McKitrick and Patrick J. Michaels, Journal of Geophysical Research, Vol. 112, D24S09, 2007. "The standard interpretation of global climate data is that extraneous effects, such as urbanization and other land surface effects, and data quality problems due to inhomogeneities in the temperature series, are removed by adjustment algorithms, and therefore do not bias the large-scale trends. Our empirical model of the post-1980 interval embeds this assumption as a null hypothesis, and it is rejected at very high confidence levels." "Taken together, our findings show that trends in gridded climate data are, in part, driven by the varying socioeconomic characteristics of the regions of origin, implying a residual contamination remains even after adjustment algorithms have been applied." Developed nations show larger temperature changes than underdeveloped nations, even after attempts to remove temperature bias due to the heat island effect and poor placement of rural MMTS (Maximum-Minimum Temperature System) units, as the above paper shows. The residual temperature increase is highest outside the US and shows that the heat island effect has not been fully removed.
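The test described above regresses gridded temperature trends on socioeconomic variables and checks the null hypothesis that their coefficients are zero. This is a sketch of that kind of test on made-up data, not the authors' code or data; the variable names, sample size, and noise level are all invented for illustration:

```python
import numpy as np

# Synthetic example: 200 hypothetical grid cells, each with a local
# economic-growth figure and a temperature trend that (by construction)
# partly depends on it. Real analyses use actual gridded data.
rng = np.random.default_rng(0)
n = 200
gdp_growth = rng.normal(3.0, 1.0, n)                        # % per year
trend = 0.15 + 0.04 * gdp_growth + rng.normal(0, 0.05, n)   # °C/decade

# Ordinary least squares: trend = b0 + b1 * gdp_growth.
X = np.column_stack([np.ones(n), gdp_growth])
beta, *_ = np.linalg.lstsq(X, trend, rcond=None)

# t-statistic for the slope; a large |t| rejects the null b1 = 0,
# i.e. rejects "adjustments removed all socioeconomic contamination".
resid = trend - X @ beta
sigma2 = resid @ resid / (n - 2)
cov = sigma2 * np.linalg.inv(X.T @ X)
t_slope = beta[1] / np.sqrt(cov[1, 1])
print(f"slope = {beta[1]:.3f} °C/decade per % growth, t = {t_slope:.1f}")
```

Because the synthetic data are built with a nonzero slope, the t-statistic comes out large, which is the same qualitative result the paper reports for the real gridded data.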
US temperature record (left) versus global temperature anomaly (right), the fluctuation around the average. According to the graph on the right, the global temperature was mostly below the average before 1980. There is no doubt that the Earth has warmed since 1850, following a 450-year mini-Ice Age. The Thames River hasn't frozen over in a long time, and winters are not as cold as they were in the past, though there have been some record cold years since then. Growing seasons have lengthened about 1 day per decade since 1850, and many plants and animals have extended their range to previously cooler regions. While the true magnitude of the temperature rise is in question, these temperature changes are not unprecedented by any measure, and fall well within the normal range of natural variability. The global temperature record of the last 150 years is statistically underrepresented both in area and time, and is contaminated by a number of factors.
Smog is also affecting local and regional heating and temperature measurements.
Brown Clouds
Not only is there a problem with measuring temperatures near the ground, but in some areas ground temperatures are elevated due to the presence of low-level brown "clouds", actually smog layers and plumes of tiny soot and dark pollution particles ("aerosols"). Some aerosols produce an off-white "haze", or they alter cloud droplets to make ordinary clouds more reflective. It has generally been assumed that these low-level aerosols have a cooling effect. However, Ramanathan et al. directly measured heating in the lower troposphere with three unmanned aerial vehicles (UAVs) over the Indian Ocean. They found that brown "clouds" absorb radiation, heating the ground and the air of the lower troposphere (below 10,000 feet), contributing to the loss of glaciers in the Himalayas.
Brown layer over central India and the sky over a typical city. Industrial emissions in India are not controlled like they are in the West. People burn wood and cow dung for cooking and heating, and burn their trash, including plastics, water bottles (popular because of the bad water), etc. Burning plastic releases toxic chemicals that are the main reason people die in fires (asphyxiation after inhaling toxic fumes). Add engine exhaust, and you get a toxic mixture of air that Indians refer to as “fog”; add high humidity and you get a smog explosion, which Indians call “heavy fog”. Photos courtesy of Dan Petersen of the Desert Research Institute; from a recent trip to India.
Nature 448, 575-578 (2 August 2007) | doi:10.1038/nature06019; Received 2 February 2007; Accepted 13 June 2007. Warming trends in Asia amplified by brown cloud solar absorption. Veerabhadran Ramanathan, Muvva V. Ramana, Gregory Roberts, Dohyeong Kim, Craig Corrigan, Chul Chung & David Winker. Center for Clouds, Chemistry and Climate, Scripps Institution of Oceanography, UCSD, La Jolla, California 92037, USA; NASA Langley Research Center, Hampton, Virginia 23681-0001, USA. Atmospheric brown clouds are mostly the result of biomass burning and fossil fuel consumption. They consist of a mixture of light-absorbing and light-scattering aerosols and therefore contribute to atmospheric solar heating and surface cooling. The sum of the two climate forcing terms—the net aerosol forcing effect—is thought to be negative and may have masked as much as half of the global warming attributed to the recent rapid rise in greenhouse gases. There is, however, at least a fourfold uncertainty in the aerosol forcing effect. Atmospheric solar heating is a significant source of the uncertainty, because current estimates are largely derived from model studies. Here we use three lightweight unmanned aerial vehicles that were vertically stacked between 0.5 and 3 km over the polluted Indian Ocean. These unmanned aerial vehicles deployed miniaturized instruments measuring aerosol concentrations, soot amount and solar fluxes. During 18 flight missions the three unmanned aerial vehicles were flown with a horizontal separation of tens of meters or less and a temporal separation of less than ten seconds, which made it possible to measure the atmospheric solar heating rates directly. We found that atmospheric brown clouds enhanced lower atmospheric solar heating by about 50 percent.
Our general circulation model simulations, which take into account the recently observed widespread occurrence of vertically extended atmospheric brown clouds over the Indian Ocean and Asia, suggest that atmospheric brown clouds contribute as much as the recent increase in anthropogenic greenhouse gases to regional lower atmospheric warming trends. We propose that the combined warming trend of 0.25 °C per decade may be sufficient to account for the observed retreat of the Himalayan glaciers.
Satellite image of China engulfed in brown clouds (above) and pictures of Beijing (right) with smoggy skies, representative of China's largest cities. In fact, these are the conditions in most Chinese cities, large and small, due to China's enormous consumption of coal (which is also used for cooking and home heating). The athletes who went to Beijing for the Olympics experienced this first hand, though the Chinese shut down several industrial plants and banned most cars from the streets prior to the event.
So in addition to the urban heat island effect, there is warming near the ground and in the lower troposphere because of suspended soot particles, and if Ramanathan et al. are correct, the effect is much larger than assumed by the IPCC. The US and Europe have done much to reduce black carbon emissions from automobiles and power plants. China and many other countries have done virtually nothing to reduce these emissions. Currently, over 75% of black carbon aerosols in North America come from Asia (see link below). China has a goal of building 500 coal-fired electrical generating plants in the next 20 years or so (they break ground on a new one about every 15 days!). If heating at the ground due to black carbon is so prevalent in developing countries, how reliable are the temperature data from these countries, which are used to calculate the global average? http://scrippsnews.ucsd.edu/Releases/?releaseID=777 And: Trans-Pacific transport of black carbon and fine aerosols (D < 2.5 µm) into North America. Hadley et al., Journal of Geophysical Research, Vol. 112, D05309, doi:10.1029/2006JD007632, 2007.
20th-Century Industrial Black Carbon Emissions Altered Arctic Climate Forcing. Joseph R. McConnell, Ross Edwards, Gregory L. Kok, Mark G. Flanner, Charles S. Zender, Eric S., et al. ScienceExpress, 9 August 2007, 10.1126/science.1144856. Black carbon (BC) from biomass and fossil fuel combustion alters chemical and physical properties of the atmosphere and snow albedo, yet little is known about its emission or deposition histories. Measurements of BC, vanillic acid, and non-sea-salt sulfur in ice cores indicate that sources and concentrations of BC in Greenland precipitation varied greatly since 1788 as a result of boreal forest fires and industrial activities. Beginning about 1850, industrial emissions resulted in a seven-fold increase in ice core BC concentrations, with most change occurring in winter. BC concentrations after about 1951 were lower but increasing. At its maximum from 1906 to 1910, estimated surface climate forcing in early summer from BC in Arctic snow was about 3 W/m2, eight times typical pre-industrial forcing. This study shows that direct absorption of heat by black carbon soot on snow surfaces contributed significantly to Arctic and Greenland ice melt, well before increases in greenhouse gases and air temperature (next time it snows, throw a charcoal briquette on the snow and watch what happens). A previous slide mentioned that Greenland experienced more melting than today prior to 1940. Most of the black carbon from this period came from the US and Canada's coal consumption, which declined after 1940. Today it comes mostly from Asia.
The “consensus”, drawn from climate models, is that CO 2 may cause catastrophic warming by the end of the century. How much warming is expected?
The IPCC has used models to make a number of predictions of warming scenarios which vary in their emphasis on warming due to carbon dioxide and other factors (left). According to the average of the models (black solid line), a warming of about 1.8 °C is predicted over the next 80 years or so. However, when compared with actual observations (right), nearly all of these models overestimate the degree of warming, some severely. The "Observed Trend" shown can be derived by extending the "Observed Warming" of 1975-2000 into the future, assuming that this warming rate remains constant. Improved models will have to wait for the temperature measurement issue to be corrected, along with new estimates of the degree of ground and lower tropospheric warming due to black carbon.
An expanded view of the previous graph, and the graph of the last ten years of global temperature data, further support the lower estimate of future global warming, if the warming continues. The temperatures have been matched to the year-2000 values from each data set. [Chart: temperature anomaly (°C) and CO2 (ppm), with the 1998 peak marked]
Do increases in CO 2 explain the warming of the 20 th Century?
Prior to the Industrial Revolution, the mean atmospheric concentration of CO 2 was about 278 parts per million by volume of air (ppmv or ppm). The level of CO 2 did not increase significantly until after World War II, but then began to rise at a much faster rate. The IPCC predictions of temperature increase shown on the right assume a doubling of CO 2 will lead to a 1.5 to 4.5°C increase by the end of the century, assuming emission rates stay constant or increase to some extent. The current level of CO 2 is about 385 ppm, about 70% of the doubled amount of 556 ppm. The temperature has only increased by about 1.0°C over the last 150 years or so. This implies that only a 1.5°C increase is expected for a doubling of CO 2 , 1.9°C if you use just the last 30 years of data (an upper limit value). This assumes that the true temperature increase has been accurately measured, that the predicted effects of CO 2 are correct, and that the current trend will continue.
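The arithmetic on this slide can be made explicit. A small sketch, using the slide's own numbers (278 ppm pre-industrial, 385 ppm now, ~1.0 °C observed rise) and assuming, as the slide does as an upper bound, that all of the observed warming is due to CO2. The slide scales linearly with concentration; CO2 forcing is more commonly taken to scale with the logarithm of concentration, which gives a somewhat higher implied sensitivity, so both are shown:

```python
import math

c0, c_now = 278.0, 385.0      # ppm, pre-industrial and current
c_double = 2 * c0             # 556 ppm, the "doubled CO2" level
observed_warming = 1.0        # °C over roughly the last 150 years

# Linear scaling, as on the slide: 385 ppm is ~70% of the doubled level.
frac_linear = c_now / c_double
print(f"linear:      {observed_warming / frac_linear:.1f} °C per doubling")

# Logarithmic scaling, the usual assumption for CO2 radiative forcing.
frac_log = math.log(c_now / c0) / math.log(2)
print(f"logarithmic: {observed_warming / frac_log:.1f} °C per doubling")
```

The linear version reproduces the slide's ~1.5 °C figure; the logarithmic version comes out nearer 2 °C. Either way, both carry the slide's stated caveats: that the warming was accurately measured, that CO2 caused all of it, and that the trend continues.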
So does carbon dioxide have any effect on the climate? It should have some effect, but the current predictions of climate response to increased CO2 do not take into account the response of the coupled ocean-atmosphere system, which is difficult to characterize, or cloud formation in response to warming and solar activity. Considering the recent leveling off of global temperature for almost a decade (with CO2 continuing to rise), along with a similar period of relatively constant solar activity, it has been suggested that the actual sensitivity of the atmosphere is much smaller than models predict. Shaviv and Veizer predict that the sensitivity of the climate is between 1 and 1.5°C per CO2 doubling, compared with the 1.5 to 4.5°C increase currently predicted by IPCC models. There are compelling reasons to favor the lower estimate. http://www.sciencebits.com/ClimateDebate [Chart: temperature anomaly (°C) and CO2 (ppm), with the 1998 peak marked]
Comparing previous slides with data from Oak Ridge National Laboratory (ORNL), it is clear that global CO 2 levels began rising before the Industrial Revolution, following a pulse of solar activity between 1700 and 1800. Solar activity briefly fell for about 30 years (1810-1840) and then increased again. The ORNL data shows that anthropogenic emissions did not significantly affect the rate of CO 2 increase until after World War II. As previous slides showed (click or scroll), the global temperature decreased (arrow) for over 30 years, from about 1940 to 1975, as CO 2 emissions and concentrations rose significantly (arrow), in utter contradiction to the assumption that CO 2 drives temperature change.
7. Marland, G., Boden, T. A., and Andres, R. J. (2007) Global, Regional, and National CO2 Emissions. In Trends: A Compendium of Data on Global Change. Carbon Dioxide Information Analysis Center, Oak Ridge National Laboratory, U.S. Department of Energy, Oak Ridge, TN, USA, http://cdiac.ornl.gov/trends/emis/tre_glob.htm
8. Soon, W. (2005) Geophysical Research Letters 32, 2005GL023429.
9. Hoyt, D. V. and Schatten, K. H. (1993) J. Geophysical Res. 98, 18895-18906.
Arctic surface temperature compared with total irradiance as measured by sunspot cycle amplitude, sunspot cycle length, solar equatorial rotation rate, fraction of penumbral spots, and decay rate of the 11-year sunspot cycle (8,9). Solar irradiance correlates well with Arctic temperature, while hydrocarbon use (7) does not correlate. As in the previous slide, it is clear, in the case of Arctic air temperatures, that changes are correlated to solar activity and not CO2. Do these and previous results indicate that CO2 has no effect on the temperature? Carbon dioxide is a greenhouse gas and should have some effect.
A recent paper on the lack of tropospheric warming. ABSTRACT: We examine tropospheric temperature trends of 67 runs from 22 ‘Climate of the 20th Century’ model simulations and try to reconcile them with the best available updated observations (in the tropics during the satellite era). Model results and observed temperature trends are in disagreement in most of the tropical troposphere, being separated by more than twice the uncertainty of the model mean. In layers near 5 km, the modeled trend is 100-300% higher than observed: above 8 km, modeled and observed trends have opposite signs. These conclusions contrast strongly with those of recent publications based on essentially the same data. This article addresses the disagreement between actual tropospheric temperature measurements and IPCC climate models which predict warming based on increased levels of CO 2 . The lack of observed warming seriously questions the validity of the models and their assumptions about the sensitivity of the atmosphere to increased levels of CO 2 . Most of the warming has apparently occurred at the ground, contrary to the models.
Update: (upper figure) The latest assessment of global temperatures drawn from microwave sounding unit (MSU) satellite measurements from the University of Alabama in Huntsville and surface temperature data from the UK's Hadley Centre HadCRUT3 surface temperature database, in comparison with CO2 levels. MSU measurements are considered to be a very accurate measure of temperature in the lower troposphere, the level of the atmosphere closest to the ground. (lower figure) Temperature trend of the last 5 years. NASA recently reported that GISS temperature data indicate that 2008 was the coolest year since 2000. http://data.giss.nasa.gov/gistemp/2008/
The CO2-driven climate models of the IPCC emphasize positive "feedbacks", the response of the climate to some force of change or "forcing", but considering the relative long-term stability of the Earth's climate, negative feedbacks must be present or the Earth would be in a perpetual freeze or roast.
A negative feedback response to a positive forcing (heating) is such that an offsetting cooling trend occurs. This could take the form of an increase in cooling cloud cover due to increased evaporation of water vapor from the oceans. A positive feedback response to heating could be an increase in heat trapping water vapor, warming the atmosphere and the oceans, causing more evaporation and a further increase in water vapor, amplifying the response.
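The amplifying or damping described above is often summarized with a simple gain formula: if a no-feedback warming dT0 is modified by a net feedback fraction f, the equilibrium response is dT0 / (1 - f). A minimal sketch; the dT0 value and the f values below are illustrative, not taken from any particular model:

```python
def equilibrium_warming(dT0, f):
    """Warming after feedbacks, for a net feedback fraction f < 1.

    f > 0: positive feedback amplifies the initial warming.
    f < 0: negative feedback (e.g. more low cloud cover) damps it.
    """
    return dT0 / (1.0 - f)

dT0 = 1.2  # °C, an illustrative no-feedback response to doubled CO2

print(equilibrium_warming(dT0, 0.5))   # strong positive feedback: 2.4 °C
print(equilibrium_warming(dT0, 0.0))   # no net feedback: 1.2 °C
print(equilibrium_warming(dT0, -0.2))  # net negative feedback: 1.0 °C
```

This makes the slide's point concrete: the sign and size of the net feedback, not the initial forcing alone, determines whether the response is amplified toward the high model estimates or damped toward the low ones.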
The self-appointed scientist Al Gore, in a recent interview with Newsweek magazine, stated that CO2 was not responsible for the majority of global temperature increase. This comes after years of Al Gore telling us that "the debate is over" and that "the science is settled". It also comes after the revelation that the global temperature has been constant or in slow decline for the last 10 years while CO2 has increased by about 4-5%. CO2 is now suddenly responsible for about 43% of the warming, but this is according to some possible model scenarios by Shindell (2009), which are only suggestions and have yet to be corroborated by any studies or data. (Shindell, D.T. et al. Science 326, 716-718 (2009).) http://www.tonightshowwithconanobrien.com/video/clips/al-gore-pt4-111209/1175411/ Gore appeared on The Tonight Show with Conan O'Brien on Nov. 12, 2009 to discuss his new book about climate change. While promoting geothermal power, Gore educated Conan and the audience about the Earth's mantle, claiming the temperature of the Earth's core is "several million degrees" and "the crust of the earth is hot". Scientist Gore doesn't have a clue about science. The Earth's temperature "2 kilometers or so down" is between 500 °C and 900 °C, and the core has a temperature of about 5700 °C. Ask any high school or freshman geology student!
Can Kyoto Protocol agreements and their successors (Copenhagen) significantly reduce CO2 emissions?
2008 US Senate Committee on Environment and Public Works Senate Majority Committee Members Barbara Boxer (Chairman) Max Baucus Joseph I. Lieberman Thomas R. Carper Hillary Rodham Clinton Frank R. Lautenberg Benjamin L. Cardin Bernard Sanders Amy Klobuchar Sheldon Whitehouse Senate Minority Committee Members James M. Inhofe John Warner George V. Voinovich Johnny Isakson David Vitter John Barrasso Larry E. Craig Lamar Alexander Christopher S. Bond http://epw.senate.gov/public/index.cfm?FuseAction=Minority.SenateReport#report "Kyoto" would result in a 0.07 °C decrease out of a possible 0.9-1.2 °C increase by 2050. Such a small decrease would not be meaningful or verifiable in light of the model uncertainties and natural variability. In 1998, during the Clinton administration, the Kyoto Protocol was signed by then Vice President Al Gore, but it was rejected by the Senate (the body that ratifies treaties) by a margin of 95-0, Democrats and Republicans alike.
Here’s a reality check. The UN and the IPCC seem to have misdirected their attention when it comes to future emissions, focusing on the US, where emission rates have been in decline, instead of on China, India, and the rest of Asia, where emission rates are increasing dramatically and dominating current increases. China's emissions, linked to its economy, are growing at such an enormous rate that the Kyoto Protocol and recent international agreements (Bali) are already rendered virtually meaningless. What happens in the future will be dictated by China, India, and other countries in Asia, with reductions in the West having little effect. http://online.wsj.com/public/article/SB116718773722060212-mNaUQDcxmDkPPEoj1XbxtV_MgCs_20070423.html
A bigger reality check. The US Department of Energy (DOE) Energy Information Administration (EIA) recently issued its forecast of global energy use and consumption. If the US, the EU, India, and the rest of the world stopped emitting CO2 altogether, CO2 levels would continue to rise due to China alone. This DOE website has some sobering facts that all Americans interested in reducing pollution, or climate change, should review. http://www.eia.doe.gov/oiaf/ieo/world.html
But the consensus of the IPCC is that the world is threatened with catastrophic climate change due to rising CO 2 levels. Doesn’t CO 2 have some effect?
The effects of CO 2 are likely overestimated as the previous slides showed.
Most of the observed warming of the atmosphere has occurred at the ground, contrary to warming models which emphasize CO 2 .
Ground warming has been affected by the urban heat island effect, “brown clouds” (smog), and soot.
Current temperature measurements are problematical due to poor placement of temperature monitors and instrumental biasing due to location and other factors (see surfacestations.org).
Mild warming is occurring, and is evident in the troposphere, in spite of errors at the ground, but only at 1/3 the level predicted by most models.
Once warming begins, the addition of small amounts of greenhouse gases is expected to weakly amplify the warming, but not to the extent predicted by the currently crude models which do not include feedback mechanisms between the clouds and the atmosphere (to be discussed shortly).
So, regardless of inaccurate temperature measurements and overestimates of CO 2 effects, what is causing the warming?
The models used by the IPCC characterize the Sun as a constant source of energy, using the last 25 years of solar data and concluding that the Sun shows only about 0.1% variability. This assumption ignores the fact that the Sun's intensity has increased by about 1% over the last 200 years (add the arrows on the right), coincident with the end of the mini-Ice Age. A recent paper that considers the pre-industrial variability of the Sun in relation to temperature changes shows that 50% or more of the temperature increase since 1900 (about 1 °F) is due to increased solar intensity ("irradiance"). There is much more to the Sun's influence than just solar irradiance (infrared, visible, and UV radiation). Phenomenological reconstructions of the solar signature in the Northern Hemisphere surface temperature records since 1600. N. Scafetta and B. J. West, Journal of Geophysical Research, Vol. 112, D24S03, doi:10.1029/2007JD008437, 2007. [Figure: Daly (2005)]
2008 Update. A new paper (N. Scafetta and B.J. West, J. Geophys. Res. 112, D24S03 (2007)), reviewed in the March issue of Physics Today, shows that nearly 69% of the temperature change over the last 150 years can be attributed to changes in total solar irradiance (TSI), or solar intensity. The article shows that the models used by the IPCC incorporate averaging techniques that mask the actual solar variability and the clear signature of solar variability in the global temperature record. It also emphasizes the fact that the global temperature hasn't changed since 1998, reflecting solar activity. Using recent satellite and standard proxy data, it is also shown that there has not been as much warming as the IPCC models indicate.
Trends in natural and human influences relevant to Earth's climate during the past four centuries (Lean and Rind, 1996). Compared are annual averages of (a) estimates of the total solar radiation; (b) variations in the amount of volcanic aerosols, derived from an index of known eruptions; and (c) the concentration of CO2 in the atmosphere. The decade-averaged estimate of the Earth's temperature, shown in (d), combines continuous instrumental records from the most recent 150 years with a less certain reconstruction based on various climate indicators. It is clear from comparing (a) and (d) that the surface temperature change follows changes in the Sun, not CO2.
So the Sun’s increased activity appears to explain much of the heating of the 20 th Century. But what about the Greenhouse Effect? Are there other factors?
The Greenhouse Effect
The Sun heats the Earth, and the Earth attempts to radiate that heat back into space. However, some molecules in the atmosphere (water vapor, carbon dioxide, ozone, methane, and others) absorb and re-emit infrared radiation, or heat, back to the ground, heat that would otherwise escape into space. This is the so-called greenhouse effect. Water vapor is by far the most abundant and most important of the greenhouse gases, with variable concentrations of 0.1-3%, or 1,000-30,000 parts per million (ppm), compared with carbon dioxide at 380 ppm, the next most abundant greenhouse gas. Water vapor has roughly 10 times the greenhouse effect of CO2. Without greenhouse gases and the greenhouse effect, the Earth would be frozen, with an average temperature of about -1 °F or -18 °C. However, without the cooling effect of clouds, the greenhouse effect would heat the Earth to an average temperature in excess of 140 °F (as hot as the Sahara desert in summer) instead of the moderate temperature that is presently the case. Everyone has experienced the cooling effect of a cloud on a sunny day when it blocks the Sun, in winter or summer. Clouds can also trap heat (think of how hot the air remains on a summer evening with clouds present, or how quickly the evening air cools in their absence). Clouds are the critical moderator of the greenhouse effect, giving the Earth a temperate climate.
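The "frozen Earth at about -18 °C" figure comes from a standard energy-balance calculation: absorbed sunlight, S(1 - a)/4, must equal the blackbody emission, sigma T^4. A quick check of the number (the solar constant and albedo values are the commonly quoted round figures):

```python
# Effective radiating temperature of an Earth with no greenhouse effect.
SIGMA = 5.67e-8    # Stefan-Boltzmann constant, W m^-2 K^-4
S = 1366.0         # solar constant, W m^-2 (commonly quoted value)
albedo = 0.30      # fraction of sunlight reflected back to space

# Energy balance: S * (1 - albedo) / 4 = SIGMA * T^4, solved for T.
T_eff = (S * (1 - albedo) / (4 * SIGMA)) ** 0.25
print(f"{T_eff:.0f} K = {T_eff - 273.15:.0f} °C")  # about 255 K, -18 °C
```

The factor of 4 is geometric: the Earth intercepts sunlight over its cross-section (pi R^2) but radiates over its whole surface (4 pi R^2). The ~33 °C gap between this value and the actual ~15 °C mean surface temperature is the greenhouse effect the slide describes.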
FIGURE SPM-2 (IPCC Fourth Assessment Report, FAR). Global-average radiative forcing (RF) estimates and ranges in 2005 for anthropogenic carbon dioxide (CO2), methane (CH4), nitrous oxide (N2O), and other important agents and mechanisms, together with the typical geographical extent (spatial scale) of the forcing and the assessed level of scientific understanding (LOSU). The net anthropogenic radiative forcing and its range are also shown. These require summing asymmetric uncertainty estimates from the component terms, and cannot be obtained by simple addition. The recent IPCC report points to CO2 as the main cause of warming, the other factors, both warming and cooling, apparently cancelling each other. One glaring omission is the estimated effect of water vapor, though the report says "Water vapour changes represent the largest feedback affecting climate sensitivity and are now better understood than in the TAR (the last report)". If so, where is the estimate? Also missing is commentary on IPCC greenhouse models that predict warming due to CO2 should show up most strongly in the mid-troposphere, between approximately 10,000 and 40,000 feet. This predicted warming has not appeared.
Feedbacks
Both water vapor and CO2 act as greenhouse gases (GHGs), water vapor having about 10 times the effect of CO2, with other GHGs being weaker than CO2. As the climate warms due to increases in GHGs, so do the oceans, and so does the rate of evaporation of water from the oceans, resulting in more water vapor in the air. This is an example of a "positive" feedback (warming = more vapor = more warming). If the increased water vapor forms more low-level clouds, this has a cooling effect, reducing the temperature, and possibly removing water vapor from the air as precipitation. This is an example of a negative feedback (warming = more vapor = more low clouds = cooling). The Earth's coupled land-ocean-atmosphere system has both positive and negative feedbacks, many of which are poorly understood and affect the accuracy of climate change models. From the IPCC report: "The average atmospheric water vapor content has increased since at least the 1980s over land and ocean as well as in the upper troposphere. The frequency of heavy precipitation events has increased over most land areas, consistent with warming and observed increases of water vapor. Water vapor changes represent the largest feedback affecting climate sensitivity. Cloud feedbacks remain the largest source of uncertainty."
Clouds play a critical role in the Earth’s climate, yet they are one of the most poorly characterized aspects of the atmosphere. Clouds have different effects depending on where they are in the atmosphere. Generally, low level clouds cool, mid-level clouds are neutral, and high clouds warm the atmosphere. The effects of clouds are, at best, crudely put into atmospheric models, without accounting for how they respond to, or drive climate change. Clouds and Climate So what factors affect cloud formation? The Earth provides particles and the Sun provides heat to evaporate water which then condenses on the particles to form cloud droplets. But recently it has become clear that the Sun also affects the formation of clouds by controlling ionization of the atmosphere, and that ionization assists in the formation of particles.
Cloud Condensation Nuclei (CCN)
Microscopic particles in the atmosphere are necessary to form cloud droplets in the Earth's atmosphere. These little particles are referred to as cloud condensation nuclei (CCN) because they attract water molecules. They are mostly microscopic particles of clay that get lofted by the wind, but can also include soot particles, liquid aerosols, and possibly tiny bits of organic material. Larger CCN are better at forming drops than smaller ones, so any process that helps particles grow or stick together makes them better CCN. An effective way to get little particles to stick together is to electrically charge them. Static charging of clothes in a dryer causes them to stick together, so it works for large objects too. [Micrograph of CCN next to a fiber: the just-visible dots are "ultra-giant" CCN (most are much smaller); the larger spots are residues left by evaporated droplets. Droplets scavenge particles and chemicals from the air, concentrating them. Diagram: water vapor condensing on CCN to form cloud droplets.]
The Sun: The Climate Driver in Unexpected Ways
Directly, ultraviolet (UV) radiation knocks electrons off of atoms and molecules, creating ions which populate the ionosphere and slowly drift downwards toward the surface. UV radiation also splits molecules, both creating and destroying ozone, which absorbs much of the UV radiation before it reaches the ground. Indirectly, the Sun has a magnetic field and a solar wind of energetic protons and electrons which affect the atmosphere. The Sun controls ionization of the atmosphere both directly and indirectly
Besides radiation, a solar wind (left figures), consisting mainly of electrons and protons, blows from the Sun like a gale. The Sun also has a magnetic field, and the Earth orbits within both of these (center figures). The Sun’s magnetic field and solar wind are variable, and their strength is indicated by sunspot activity (lower right) which has been monitored for almost 400 years since the invention of the telescope.
Sunspots are produced in a 22-year cycle by the Sun's magnetic field. Sunspots vary over an 11-year half-cycle: during an 11-year cycle, the average number of sunspots increases and decreases, with reversed magnetic polarities from one 11-year cycle to the next. Two such cycles make up the 22-year solar cycle.
Sunspots are regions of cooling on the Sun's surface that look dark to the human eye, but are actually ten times brighter than a full moon. However, the region around them is even brighter, hence they appear dark to the naked eye (the eye is easily fooled). The frequency of sunspots is a direct indicator of overall solar activity, specifically the amount of radiation emitted and the strength of the solar wind.
Solar activity has been very high in the 20th century compared to the previous 400 years during which sunspots have been observed. Sir William Herschel was the first to seriously consider the Sun as a source of climate variations, over two centuries ago. He noted a correlation between the price of wheat, which he presumed to be a climate proxy, and sunspot activity: "The result of this review of the foregoing five periods is, that, from the price of wheat, it seems probable that some temporary scarcity or defect of vegetation has generally taken place, when the sun has been without those appearances which we surmise to be symptoms of a copious emission of light and heat." Sir William Herschel, Phil. Trans. Roy. Soc. London, 91, 265 (1801). So how do sunspots and the solar magnetic field affect climate? By controlling the flux of cosmic rays that reach the Earth. [Figure: observed sunspots and simulated magnetic flux for the last dozen 11-year solar cycles. Modern measurements of magnetic flux have allowed scientists to gauge the relation between sunspots and magnetic flux in the past.]
Cosmic Rays: When massive stars die, they explode as supernovae, briefly releasing almost as much energy as all of the stars in the galaxy in which they reside. Many galaxies have active cores or nuclei, with massive black holes that regularly gobble up entire stars, releasing enormous amounts of energy. These processes fill space with a relatively steady stream of very high energy “cosmic rays”. Physicists distinguish between the relatively low energy cosmic rays from the Sun (the solar wind) and the more energetic galactic cosmic rays (GCRs). [Figures: a supernova explosion; an active galactic nucleus harboring a supermassive black hole]
Cosmic Rays and Earth: Most cosmic “rays” are the nuclei of atoms, ranging from the lightest to the heaviest elements in the periodic table. Some have energies a thousand to a million times larger than those of the most energetic particles produced in the world’s most powerful particle accelerators. When these particles encounter the Earth’s atmosphere, they collide with air molecules. They have such high energies that the nuclei of oxygen and nitrogen atoms are fragmented by the collisions, creating a “shower” of smaller particles. These fragments have so much energy that they go on to collide with other air molecules, creating an avalanche of particles (right), some of which penetrate the atmosphere all the way to the ground. You are constantly bombarded by these particles, whether outdoors, in your house, or even underground, with one passing through the area of your palm approximately every second.
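The "one particle through your palm per second" figure can be sanity-checked with a back-of-the-envelope calculation. The numbers below are rough textbook values assumed for illustration, not taken from this presentation: a sea-level muon flux of about 1 per cm² per minute and a palm area of roughly 70 cm².

```python
# Rough sanity check of the "one cosmic-ray particle through your palm
# per second" claim. Both input values are assumed, order-of-magnitude
# textbook figures, not data from the presentation.

MUON_FLUX_PER_CM2_PER_MIN = 1.0   # typical sea-level muon flux
PALM_AREA_CM2 = 70.0              # rough area of an adult palm

hits_per_second = MUON_FLUX_PER_CM2_PER_MIN * PALM_AREA_CM2 / 60.0
print(f"~{hits_per_second:.1f} particles per second")  # ~1.2
```

The result, on the order of one per second, is consistent with the claim in the slide.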
Cosmic Rays and Lightning Initiation: Lightning-producing clouds become electrically charged by collisions between cloud drops and ice crystals. This charging process creates an electric field. For decades scientists have studied these electric fields and found they are far too weak to initiate lightning spontaneously. How lightning gets started was a real mystery until recently. In the last ten years, it has become clear that cosmic rays are the trigger for lightning. When a cosmic ray passes through a well-charged cloud, it creates an ionized, conductive path. Like a wire, this briefly connects the charge to the ground or to another part of the cloud, and a lightning bolt is born. The cosmic rays reveal their presence by the emission of x-ray and gamma-ray radiation just before the strike. [Figure: the combined discharge arising from runaway breakdown triggered by a cosmic-ray extensive atmospheric shower (EAS), shown schematically during a thunderstorm at the Tien-Shang Mountain Scientific Station in Kazakhstan with its Y-shaped gamma-ray detectors. The discharge occurs where the cloud’s electric field exceeds a critical value Ec and produces radio bursts as well as gamma and other emissions. May 2005 Physics Today]
Cosmic Rays and CCN: Cosmic rays ionize the atmosphere, liberating electrons and ions. Experiments performed at the Danish National Space Center have found that the production of aerosols (tiny solid or liquid particles) in a sample atmosphere containing gases such as sulphuric acid and water vapor depends on the amount of ionization. In 1959, the late Edward Ney of the U. of Minnesota suggested that any climatic sensitivity to the density of tropospheric ions would immediately link solar activity to climate. This is because solar activity modulates the flux of high-energy particles coming from outside the solar system. These particles, the cosmic rays, are the dominant source of ionization in the troposphere. http://www.spacecenter.dk/research/sun-climate/experiments/the-sky-experiment
Clouds have a large cooling effect on the atmosphere that depends on where and how often they occur. Current climate models include only the effects of the small variations in the direct solar radiation, S, but during an 11-year solar cycle, variations in the cosmic ray flux can lead to a 3% variation in cloud cover. This is estimated to cause a variation in cooling of 0.8-1.7 W/m², about as large as the present estimated warming from CO2. The intensity of sunlight at the position of the Earth is about 1360 W/m², equal to about thirteen 100-watt light bulbs shining on each square meter (roughly a 3 ft x 3 ft tabletop). This energy is intercepted by the disk of the Earth with area πR², but it is distributed over the sphere of the Earth with area 4πR², so the average intensity is only 1/4 as much as the direct beam in space. About 1/3 of the Sun’s energy is reflected back into space due to the Earth’s overall reflectivity or albedo, resulting in an estimated 235 W/m² of net heating.
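The ~235 W/m² figure follows directly from the geometry just described. A minimal sketch of the arithmetic, where the albedo value of 0.31 is my assumption chosen to match the "about 1/3" in the text:

```python
S = 1360.0     # solar constant at Earth's orbit, W/m^2 (value from the text)
ALBEDO = 0.31  # assumed reflectivity, roughly the text's "about 1/3"

# Sunlight is intercepted by the Earth's disk (pi R^2) but spread over
# the whole sphere (4 pi R^2), so the mean incident intensity is S / 4.
mean_intensity = S / 4.0                  # 340 W/m^2
net_heating = mean_intensity * (1 - ALBEDO)

print(f"mean incident intensity: {mean_intensity:.0f} W/m^2")
print(f"net heating after albedo: {net_heating:.0f} W/m^2")  # ~235
```

The disk-versus-sphere factor of 4 is why the global-mean input is 340 W/m² rather than 1360 W/m².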
Cosmic Rays and Clouds: The amount of cosmic rays reaching Earth is largely controlled by the Sun. Many scientists believe the Sun's influence on the Earth's global climate has been underestimated, and that a significant part of the global warming recorded in the 20th century has its origin in changes in solar activity, not the increase in fossil-fuel-produced greenhouse gases. So how does the Sun control the cosmic rays reaching the Earth?
Cosmic Rays in the Solar System: Cosmic rays are deflected by the Sun’s magnetic field, and they lose energy by colliding with the solar wind, the stream of ions and electrons blowing from the solar corona at about 400 km/sec. The intensity of cosmic rays reaching the Earth decreases when the Sun is active and increases when it is quiet. Spacecraft venturing out towards the boundary of the solar system have found that the intensity of galactic cosmic rays increases with distance from the Sun, where the solar wind and magnetic field are weaker. As solar activity varies over the 11-year solar cycle, the intensity of cosmic rays hitting the Earth varies oppositely to the sunspot number, i.e., cosmic rays and solar activity are “anti-correlated”: more solar activity equals fewer cosmic rays. Cosmic rays affect the formation of clouds, most importantly low-level clouds. So low cloud cover, and with it cooling or warming, varies with the Sun’s activity and cosmic ray intensity.
The link between solar activity, sunspots, and cosmic rays is clearly revealed in the above figure. When the Sun is “active”, sunspot number is up, total irradiance (including UV) is up, cosmic rays are down, reduced by the Sun’s increased solar wind and stronger magnetic field. Low level clouds are reduced, and warming occurs. When the Sun is “quiet”, sunspots are down, irradiance is down, and cosmic rays are up. Low level clouds are up, and cooling occurs.
The cosmic ray link between solar activity and the terrestrial climate: The changing solar activity is responsible for a varying solar wind strength. A stronger wind reduces the flux of cosmic rays reaching the Earth, since they lose more energy propagating against it. Since cosmic rays dominate tropospheric ionization, increased solar activity translates into reduced ionization and, as shown above, into reduced low-altitude cloud cover. Since low-altitude clouds have a net cooling effect (their "whiteness" is more important than their "blanket" effect), increased solar activity implies a warmer climate, due to increased radiation and decreased low-level cloud cover. Cosmic rays have been directly monitored at the ground for many decades, and satellites have provided over 20 years of low cloud observations. A clear connection is seen when these are compared: when cosmic rays (red) are down, low cloud cover (blue) is down; when cosmic rays are up, low cloud cover is up. The atmosphere cooled sharply following the warm El Nino of 1998. http://physicaplus.org.il/zope/home/en/1105389911/1113511992_en
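"Anti-correlated" is a statistical statement: the Pearson correlation coefficient between the two series is negative. A toy illustration follows; the synthetic sunspot and cosmic-ray series below are made up for demonstration and are not the actual neutron-monitor or satellite cloud records.

```python
import math

# Synthetic 11-year "solar cycle", monthly resolution: sunspot number
# rises and falls, and the cosmic-ray flux is constructed to vary
# oppositely (a noise-free linear inversion, purely for illustration).
months = range(132)  # 11 years x 12 months
sunspots = [100 * (0.5 - 0.5 * math.cos(2 * math.pi * m / 132)) for m in months]
cosmic_rays = [6000 - 5 * s for s in sunspots]

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / math.sqrt(sxx * syy)

r = pearson(sunspots, cosmic_rays)
print(f"correlation: {r:.2f}")  # -1.00 for this noise-free toy example
```

Real records contain noise and lags, so the measured anti-correlation is strong but not perfect, unlike this idealized case.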
The Sun–cosmic ray–cloud connection is a new climate issue that was actually suggested several decades ago by physicists who understood the effects of cosmic rays on molecules and particles, and now there is extensive geological data that supports the hypothesis. We are gaining a better understanding of the Sun’s influence, but there is more to learn. It will take time for this new information to disseminate through the climate change community, and some will reject it in favor of outdated assumptions. It makes sense from physics that the Sun’s activity would affect the Earth in more ways than as just a heat source. The Sun’s dominant influence cannot legitimately be discounted.
Here is a new paper of interest just published in Physical Review Letters. Correlation between Cosmic Rays and Ozone Depletion Q.-B. Lu Department of Physics and Astronomy, University of Waterloo, Waterloo, ON, N2L 3G1, Canada Abstract: This Letter reports reliable satellite data in the period of 1980–2007 covering two full 11-yr cosmic ray (CR) cycles, clearly showing the correlation between CRs and ozone depletion, especially the polar ozone loss (hole) over Antarctica. The results provide strong evidence of the physical mechanism that the CR driven electron-induced reaction of halogenated molecules plays the dominant role in causing the ozone hole. Moreover, this mechanism predicts one of the severest ozone losses in 2008–2009 and probably another large hole around 2019–2020, according to the 11-yr CR cycle. Percentage variations of CR flux (solid magenta line) and annual mean total O3 (data points + lines) measured at two Antarctic stations during the period of 1990–2007.
The correlation between solar variability and temperature for more than a century, and the recent discovery of the link between cosmic rays, clouds, and solar activity, appear to explain most of the recent global temperature variation. But what about past and future climate? The correlation with past climate is relatively clear because cosmic rays leave a long term record of activity in ice cores and sediments that can be compared with other climate proxies.
When cosmic rays collide with atoms in the Earth’s atmosphere, they create unique isotopes of elements that do not occur naturally in the planet’s geology. These isotopes end up in glacial ice and in sea and lake sediments. Measurements of the variability of these unique “cosmogenic” isotopes give us a picture of past solar activity. Direct observations of sunspots dating back to the invention of the telescope confirm the solar activity/climate connection for almost 400 years. But the cosmogenic isotope record goes much farther back, and is used to infer temperatures and climate change in the distant past, ranging from thousands to millions of years. [Figure, reversed scales: beryllium (left) decreases upwards when sunspots (right) increase upwards, signaling a decrease in cosmic rays, and vice versa.]
Cosmic Rays as a Climate Driver? Scientists say that a ten-second burst of gamma rays from a massive stellar explosion within 6,000 light years of Earth could have triggered a mass extinction 450 million years ago (the Ordovician extinction), killing 60 percent of all marine invertebrates. In this artist's conception, we see the gamma rays hitting the Earth's atmosphere. The gamma rays initiate changes in the atmosphere that deplete ozone and create a brown smog of NO2, cooling the Earth. (Image: NASA) Many climate scientists who are convinced of the role of CO2 reject or are skeptical of the role of the Sun and cosmic rays in recent climate change. However, the role of cosmic rays in lightning is well established, and a high degree of correlation between past climate events and the record of cosmogenic isotopes is found in a number of independent geological proxies; these include ice cores, ocean and lake sediments, stalactites and stalagmites, and ancient ground water. The 11-year and longer-duration solar cycles often stand out clearly in the geologic record. However, even this record is punctuated by random events indicating sudden climate change, possibly due to nearby supernovas, including one as recent as 2 million years ago. http://www.sciencedaily.com/releases/2002/01/020109074626.htm http://www.sciencedaily.com/releases/2005/04/050411101721.htm
Can the Earth Really Be Severely Affected by Space “Weather”? Absolutely! On Dec. 27, 2004, a special variety of neutron star known as a “magnetar” emitted a blast of gamma rays so powerful that it briefly altered Earth's upper atmosphere. The magnetar did this from a distance of 50,000 light years from Earth, from the other side of the Milky Way galaxy. "Had this happened within 10 light-years of us, it would have severely damaged our atmosphere and possibly have triggered a mass extinction," said Bryan Gaensler of the Harvard-Smithsonian Center for Astrophysics (CfA). http://www.space.com/scienceastronomy/bright_flash_050218.html
Back to Earth: So what will happen in the near future or the next century? Solar activity (lower figure), increasing since around 1700, rose, declined, and then rose again in the second half of the 20th century, peaking around 1980 and remaining at that level since then (upper left). The temperature increase has slowed considerably since then (upper right), and since the strong El Nino event of 1998, the temperature has remained constant (even slightly declining), while CO2 levels continue to climb. If the temperature remains constant or declines while CO2 continues to rise, as occurred from 1945-1975, it will strongly support the assertion that CO2 makes only a minor contribution to warming.
In addition to the 11 and 22 year solar cycles, there are longer cycles which are approximately multiples of these. The Gleissberg cycle, about 77-84 years in length, appears to involve about 7 consecutive 11 year cycles and is evident in the lower figure ( ≈ 1740-1820, 1830-1910, 1930-2010). All solar cycles are approximate, varying in terms of intensity and duration. Solar physicists have recently come to understand that the cycles are not independent of each other. Intense and quiet cycles usually come in clusters, and if one wants to predict the next cycle, the last several cycles must be considered. Following several decades of solar observations with both satellite and ground based instruments, advances in computer models of the Sun have given some solar physicists hope that reasonably accurate predictions of solar activity might be possible. Right now there are two competing models. One predicts another strong cycle like the last one. The other model predicts a downturn in solar activity, with cooling expected. The new cycle was expected to start in mid to late 2007. So what will happen in the near future or the next century?
When the new solar cycle starts, it will take about 5 years to reach its peak. If this peak is less than previous ones, it may be a sign that the Sun’s activity is declining after several active cycles, i.e. the end of the current Gleissberg cycle. But then again, solar activity could remain high for another cycle. It should be remembered that around 1645, the beginning of the sunspot “Maunder Minimum”, the Sun’s output plummeted, and the Earth was suddenly in a deeper freeze than was already underway during the so called “Little Ice Age”. This was followed by another less intense cooling in the early 1800’s, when Napoleon’s forces froze in Russia. Both of these were “a second ago” in terms of geologic time, and solar and climate variability on this time scale is the norm, not the exception. However, a reduction in solar activity would not likely be indicative of a new Maunder Minimum. http://en.wikipedia.org/wiki/Maunder_minimum The Near Future: Warmer, Stable, or Cooler?
Update to The Near Future: Warmer, Stable, or Cooler? (January 2009) The current solar cycle is now confirmed to be considerably late (nearly 15 months or more) and weaker than expected. It appears that the lower estimate of solar activity is likely to be the case, and a global cooling trend may already be underway. The graphs above show the high and low predictions (red curves) of solar sunspot activity previously mentioned, in addition to radio emissions, another indicator of solar activity. Both predicted curves were expected to track the current solar activity beginning in mid to late 2007. The next 2 to 4 years may reveal that we are at the end of the current Gleissberg cycle, with solar activity declining along with the temperature, reconfirming the Sun’s dominant influence on climate. http://www.swpc.noaa.gov/SolarCycle/ (Update: it’s official; as of May 2009, NOAA has stated that the low prediction is correct.)
This is addressed in the DVD “The Great Global Warming Swindle”, produced in the UK in 2007, and in the book “Meltdown” by Patrick Michaels, which has the revealing subtitle “The Predictable Distortion of Global Warming by Scientists, Politicians, and the Media”. After three decades of cooling from 1945-1975, followed by warming into the 1980’s, a few scientists became convinced that CO2 might have something to do with it. Advances in computing and the emergence of modeling hinted at the possible role of CO2 and other factors in climate change. Starting in the 1990’s, research proposal announcements advertised that funding was available for the purpose of finding anthropogenic causes of global warming, and so the hunt began. One powerful motivation for scientists is continued funding for research, so a bias was built into the atmospheric community to find, or at least infer, that just about every change was possibly due to anthropogenic causes. This kind of speculation, especially if it involves dire predictions, is great material for the evening news; and politicians, who are usually scientifically ill informed, are now in a frantic race to see who can best appease the global warming alarmists and convince voters that they are the ones to save us all. If this sounds preposterous, then read the book or watch the DVD, and your opinion will change! http://video.google.fr/videoplay?docid=-4123082535546754758
On a geological scale, the 19th and 20th centuries saw relatively mild warming following nearly 400 years of mild cooling. The “extreme” nature of these temperature changes is a matter of human perception. Climate change is the norm, and these recent changes are not extraordinary.
The Earth has warmed over the last two centuries, apparently by about 1.0 °C (1.8 °F). However, the accuracy of this value is questionable when one considers the sparseness and quality of historical temperature data, the bias in modern instrumental measurements affected by the urban heat island effect, and the poor placement of temperature measuring stations. The global temperature increase has been averaged from data recorded mostly in the developed nations, which are significantly contaminated for the reasons just stated. Data from less technologically developed countries are even more suspect, often extending back less than 70 years, and many of these countries are experiencing enhanced ground warming due to “brown cloud” pollution (e.g. China and India). A comprehensive investigation and reassessment of global temperature measurements is needed in order to obtain the true temperature increase, which is probably smaller than currently claimed by the IPCC, NOAA, and NASA.
Survey of IPCC Climate Experts (DemandDebate.com, November 8, 2007). Steve Milloy of junkscience.com sent a questionnaire to 345 US contributors and reviewers of the IPCC’s 2007 report ( http://www.junkscience.com/ByTheJunkman/20071108.html ). The questions were right in line with IPCC reports and were not provocative, but only 54 responded, with reactions generally supporting the IPCC’s position. Their responses to three of the questions:

Question #2. Which best describes the role of manmade CO2 emissions in climate change?
- Manmade CO2 emissions are the principal driver of climate change: 17%
- Manmade CO2 emissions drive climate change, but other natural and human-related factors are also important: 70%
- Other natural and/or human-related factors drive climate change, but manmade CO2 emissions are important: 6%
- Other natural and/or human-related factors are the principal drivers of climate change: 6%
- No opinion: 2%

Question #5. The climatic impacts of a mean global temperature that is 1 degree Celsius warmer than today are:
- Undesirable: 48%
- Desirable: 4%
- Desirable for some, undesirable for others: 39%
- Too difficult to assess: 7%
- No opinion: 2%

Question #6. The ideal global climate is...
- Warmer than the present: 2%
- Cooler than the present: 13%
- Occurring today: 17%
- There is no such thing as an "ideal" global climate: 61%
- No opinion: 7%
The Sun has always been the main driver of climate and climate change, and there is strong evidence that the recent warming is largely due to two centuries of increasing solar activity. Along with the increase in solar heating, there has also likely been a decrease in the cooling effect of clouds, due to the known reduction in the cosmic rays that affect cloud formation. Cosmic rays have been reduced by an increased solar wind and magnetic field.
Warming, according to models emphasizing CO2, has not appeared where it is expected to be most pronounced: in the lower and middle troposphere. This calls into serious question the models’ assumptions concerning the sensitivity of the atmosphere to increases in CO2. For ten years, the temperature has been relatively constant while CO2 levels increased dramatically.
CO2 levels will continue to rise, and may even accelerate, due mainly to the annual increases in emissions from China (9% per year), but also from India and other developing nations. In 50 years or less, CO2 levels will have doubled from the historical average, and planned emission reductions by the US, Canada, and the EU will be dwarfed by increases in China and the rest of Asia.
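Growth at a fixed percentage compounds quickly. At the 9% per year quoted above for China's emissions, the doubling time is about eight years; a simple compound-growth sketch (this concerns the emissions rate, not the atmospheric concentration itself):

```python
import math

# Doubling time for a quantity growing at a constant 9% per year,
# the annual emissions growth rate for China quoted in the text.
growth_rate = 0.09
doubling_time = math.log(2) / math.log(1 + growth_rate)
print(f"doubling time at 9%/yr: {doubling_time:.1f} years")  # ~8.0
```

This is the usual "rule of 72" arithmetic: 72 / 9 ≈ 8 years per doubling.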
The next 15-20 years of monitoring solar activity, cosmic rays, and low cloud cover, should allow climate models to be tuned to reflect the contributions to global warming from the Sun versus CO 2 .
The controversies involving climate change discussed in this presentation in no way argue in favor of continuing the world’s almost unchecked use of fossil fuels. Whether CO2 is causing warming has yet to be established, but the detrimental effects of global pollution from the use of hydrocarbon fuels and biomass (wood and vegetation) burning are undeniable.
Carbon soot and the chemicals emitted by fossil fuel and biomass burning are pollutants with deleterious effects on human, plant, and animal health, both on land and in the oceans. The World Health Organization (WHO) estimates that around 4.5 million people die each year from hydrocarbon emissions, mainly from lung and heart disease, especially in the poor underdeveloped nations. Soot is a much bigger health problem than CO2.
Efforts to reduce carbon dioxide emissions are likely to be ineffective at reducing global warming. Even if the world stopped emitting today, it would take a couple of centuries for CO2 to return to pre-industrial levels.
But no scientist can credibly argue against the urgent need to reduce hydrocarbon emissions. China and India, the two largest contributors to increasing emissions, will not likely take significant steps to reduce them until it becomes a serious public health problem (it already is). Unfortunately, it will have to get much worse before action is taken, which won’t be long based on projections of future fossil fuel consumption.
One thing that will not happen is a “runaway greenhouse”. The physics of greenhouse gases tells us that only an exponential rise in CO2 would produce a constantly increasing or “runaway” warming (observed increases are approximately linear). Several models which assume observed or weaker increases in CO2 predict a temperature trend “damping” towards a finite amount of heating, as shown above. But these same models already over-predict the amount of currently observed warming in the mid to upper troposphere. More recent calculations and measurements indicate a much lower sensitivity of the atmosphere to increasing CO2.
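The damping behavior described above reflects the logarithmic dependence of CO2 radiative forcing on concentration. A commonly used simplified expression (Myhre et al., 1998, adopted in IPCC reports) is ΔF = 5.35 ln(C/C0) W/m²: each doubling of CO2 adds the same forcing, so a linear rise in concentration yields ever-smaller forcing increments. A short sketch:

```python
import math

def co2_forcing(c_ppm, c0_ppm=280.0):
    """Simplified CO2 radiative forcing in W/m^2 (Myhre et al., 1998),
    relative to a pre-industrial baseline of 280 ppm."""
    return 5.35 * math.log(c_ppm / c0_ppm)

# Each successive doubling contributes the same ~3.7 W/m^2:
for c in (280, 560, 1120):
    print(f"{c} ppm -> {co2_forcing(c):.2f} W/m^2")

# Equal linear increments of CO2 give shrinking forcing steps:
step_low = co2_forcing(420) - co2_forcing(380)   # 40 ppm step at low CO2
step_high = co2_forcing(820) - co2_forcing(780)  # same 40 ppm step at high CO2
print(f"{step_low:.2f} vs {step_high:.2f} W/m^2")
```

Note this formula describes forcing only; how much warming results per W/m² (the climate sensitivity) is exactly the quantity the text argues is overestimated.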
U.S. Pollutant Emissions are Declining (Total National Inventories of Criteria Pollutants) http://epa.gov/airtrends/econ-emissions.html [Figure: million tons/year for CO, NOx (x5), PM10 (x10), SO2 (x5), VOC (x5), and lead (x500)]
U.S. Greenhouse Gas Emissions are Continuing to Increase
- Electricity generation is the largest sector: 33% of total CO2 equivalent, mainly from coal combustion.
- The transportation sector is second largest: 27% of total CO2 equivalent, dominated by petroleum use.
This is older data. As stated previously, in 2006 China’s total CO2 emissions from fossil fuels increased by 9%, while US emissions decreased by 1.4% compared to 2005. US CO2 emissions continue to decline while those of the EU (signatories to the Kyoto Protocol) are essentially flat.
In case you didn’t catch this press release in the US media: “China now no. 1 in CO2 emissions; USA in second position”, press release, 19 June 2007, Netherlands Environmental Assessment Agency. China’s 2006 CO2 emissions surpassed those of the USA by 8%. This includes CO2 emissions from industrial processes (cement production). With this, China tops the list of CO2-emitting countries for the first time. In 2005, CO2 emissions from China were still 2% below those of the USA. These figures are based on a preliminary estimate by the Netherlands Environmental Assessment Agency (MNP), using recently published BP (British Petroleum) energy data and cement production data. China, USA & EU: In 2006, China’s total CO2 emissions from fossil fuels increased by 9%. In the USA in 2006, emissions decreased by 1.4% compared to 2005. In the European Union countries (the ‘EU 15’) in that same year, CO2 emissions from fossil fuels remained more or less constant; in 2005 there was a decrease of 0.8%, according to a recent report by the EEA compiling data from the member states. The European Union is the third-largest carbon dioxide emitter, but its levels are about half of China's. The next biggest emitters are Russia, India and Japan. The lack of change in EU emissions shows that they are not meeting their Kyoto Protocol targets. Except for the underdeveloped countries, most of the signatory countries will not meet their 2012 treaty goals, except by purchasing carbon emission credits from other countries whose own emissions have basically not changed, so no real reduction in CO2 emissions results.
EU INDUSTRY CARBON EMISSIONS FLAT IN 2007 (Michael Szabo, London, UK; April 4, 2008). European Union industry emissions were roughly flat in 2007, preliminary data from the EU executive Commission showed on Wednesday, with low gas prices and a mild winter slowing growth. As expected, emissions were less than industry’s quotas of permits to emit the greenhouse gas carbon dioxide (CO2) under an EU climate change scheme meant to drive emissions cuts through permit shortages. Brussels has addressed that flaw, and a resulting carbon price collapse, by cutting permit quotas in the second phase of its emissions trading scheme from 2008. The carbon market is supposed to put a price on carbon emissions, and therefore energy use, and so force businesses and individuals to trim their contribution to climate change, for example by being more energy efficient. The Commission cut the allocation of carbon emissions permits, called EU allowances (EUAs), by about 9 percent for 2008-12, and the fact that emissions were roughly unchanged last year has not undermined expectations of an EUA shortage in 2008. “I expect a shortage this year simply because allocations have drastically gone down,” said Fortis analyst Kris Voorspools, adding an estimate that EU industry emissions in the six largest countries rose some 1.2 percent last year. New Carbon Finance estimated that emissions fell 0.25 percent.
There is a lot of hype coming from misinformed politicians, environmentalists, and the media today suggesting that the world can clean up pollution and meet its energy demands with “renewable” energy sources (biomass), “alternative” sources like solar panels and windmills, and “green” technologies. This is another myth being sold to the public by those who either have an agenda or just don’t understand the numbers. In response to these claims, politicians, and more importantly agribusiness lobbyists, are lining up and putting on their “green” hats before enacting legislation to solve a problem they don’t understand with measures that will likely have little or no impact on climate change. Restricting the use of cheap fossil fuels will deny poor countries the resources they need to develop, and will degrade the economies of developed nations. While the rich industrialized countries of the world can afford to experiment with “boutique” sources of energy like solar and wind power, much of the rest of the world is in survival mode and will never be able to develop if its sources of energy are restricted to solar and wind. Famine, political instability, and the death and violence that result from them are a much bigger concern than global warming in poor countries.
In January 2007, President Bush endorsed ethanol as a “green” renewable energy source and as a solution to the US dependence on foreign oil. The price of corn worldwide immediately jumped, making this food source more expensive for the poor countries that depend on importing corn and other grains to feed their populations. Food prices in the US also went up during 2007 because of the many uses of corn products, from sweeteners to cattle feed. Corn is too important as a food source to be used to make alcohol, and biologists have yet to find bacteria that are efficient at digesting cruder plant materials (e.g. switch grass, waste wood, “scrub brush”, etc.). Then the atmospheric chemists chimed in and said “bad idea”. Biofuels, namely ethanol, while “renewable”, are not pollution free. They may represent a recycling of carbon from the atmosphere and a reduction of CO2 emissions, but ethanol combustion leads to ozone generation in the lower atmosphere. Ozone in the stratosphere is beneficial, but ozone at the ground is destructive to plants, animals, plastics, and people. The World Health Organization predicts ethanol fuels will cause hundreds of thousands of additional annual pollution deaths. Additionally, it currently takes about 120 gallons of fossil fuel to produce 100 gallons of corn ethanol, i.e., growing corn is greenhouse gas intensive. Bad idea!
Please take a step back, and stay cool for a moment. Shut your ears off from the deafening cries and shouts of the global warming advocates, and sharpen your mind. Try to collect the questions and explanations raised in this presentation, and be open to good reasoning: these pages might make you think again, and wonder if the motives of all these climate catastrophists are pure.... “Every great mistake has a halfway moment, a split second when it can be recalled and perhaps remedied.” Pearl Buck. Al Gore's new climate change initiative, a three-year, $300 million campaign set to launch Wednesday, is aimed at mobilizing Americans to push for aggressive reductions in greenhouse gas emissions, a move that ranks as one of the most ambitious and costly public advocacy campaigns in U.S. history. Since CO2 is not driving climate change, it will be an unprecedented waste of money. Every citizen should read the Congressional Research Service report to see what the informed opinion is concerning the Obama Administration’s climate change and energy policies, and more importantly their likelihood (or unlikelihood) of success. The CRS is the professional, nonpartisan research arm of Congress. When Congress suggests a bill or policy change, the CRS researches the actual costs and impacts of the bill and delivers a report to Congress. The President and the majority in Congress are ignoring the CRS’s report.
Some argue that because there is a possible risk from CO 2 , in spite of what the data is telling us, we should take stringent measures just in case, the “ Precautionary Principle”. Precaution requires justification, not speculation.
By most sensible assessments, solar and wind power can at best provide about 15% of global energy demands, and then only with enormous investment in these technologies. You can power some low-level electronics, or charge some batteries for later low-power consumption, using home solar panels (on sunny days) and windmills (when the wind blows). You cannot power a city of any size, or run heavy industry or mass transportation, off these relatively weak energy sources unless facilities are built on an enormous scale. Also, burning biofuels represents only a slight reduction in CO 2 emissions (via uptake by plants) that is utterly negligible when compared to fossil fuel emissions by China alone, not to mention up-and-coming emitters like India. James Hansen, a noted climate change alarmist and this year's winner of the Desert Research Institute's Nevada Medal Award, appeared on CNN's “Planet in Peril” series in December 2007 with Patrick Michaels (author of “Meltdown”). During a panel discussion of climate change issues, James Hansen used the buzzword of “alternative” energy sources as a solution to greenhouse gas pollution and climate change. Patrick Michaels pushed him to be specific on the issue, about exactly what alternative energy sources he was proposing that would meet the world's energy demands and clean up the environment. After some foot shuffling, and repeated pressure from Michaels, Hansen finally said “nuclear energy”. A cofounder of Greenpeace, Patrick Moore, agrees.
Real Solutions: Nuclear Energy. Nuclear power plants produce over 9.1 times more energy with less than one third (30%) of the number of facilities. Note, however, that “nameplate capacity” is the maximum rated capacity: windmills generally deliver about 1/3 of their nameplate capacity due to the intermittency of wind and to service and maintenance requirements, while nuclear power plants, originally designed to run at 50% capacity, now typically run at around 90% capacity due to improvements in design and efficiency.
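The capacity-factor point above can be checked with a few lines of arithmetic. This is a minimal sketch using the figures quoted in the text (roughly 1/3 for wind, roughly 90% for modern nuclear plants); the function name is illustrative, not from any source.

```python
# Nameplate capacity vs. average delivered power, using the capacity
# factors quoted in the text (~1/3 for wind, ~90% for nuclear).
def delivered_mw(nameplate_mw: float, capacity_factor: float) -> float:
    """Average power actually delivered over a year."""
    return nameplate_mw * capacity_factor

wind_mw = delivered_mw(1000, 1 / 3)     # a 1000 MW nameplate wind farm
nuclear_mw = delivered_mw(1000, 0.90)   # a 1000 MW nameplate nuclear plant

print(f"Wind delivers    ~{wind_mw:.0f} MW on average")     # ~333 MW
print(f"Nuclear delivers ~{nuclear_mw:.0f} MW on average")  # ~900 MW
print(f"Per unit of nameplate, nuclear delivers {nuclear_mw / wind_mw:.1f}x more")
```

So even at equal nameplate ratings, a nuclear plant delivers about 2.7 times the average power of a wind farm.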
According to 2006 data, Las Vegas consumes 23,003,806 MWh (megawatt-hours) of electricity each year, which corresponds to an average power demand of 2,626 MW. According to a Nevada PBS program on Las Vegas power consumption, Las Vegas gets 85% of its power from coal-burning power plants throughout the West, not from Hoover Dam as many Las Vegas residents mistakenly believe. A large coal-fired power plant produces 500 MW of power, but many produce less. Nevada residents are currently funding the construction of a “large” solar power plant in Las Vegas with a nameplate rating of 47 MW; it will provide only about 1.7% of the electricity needs of the city, while the population outside of Las Vegas also continues to grow. The average nuclear power plant in the US produces 864 MW of power. So, approximately 3 nuclear power plants could provide Las Vegas with nearly all of its power needs, a couple more would meet the needs of the entire state and then some, and the power would be greenhouse gas free (except for the small amount released during construction with concrete). And they could be built in one of the best places in the world to put them, the Nevada Test Site. India recently announced it will be building nuclear power farms in clusters of 8 reactors of uniform design and output, the sensible way to implement nuclear energy for the future.
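The arithmetic behind the Las Vegas figures can be reproduced directly. This is a back-of-envelope sketch using only the slide's own numbers (and 8,760 hours per year):

```python
# Back-of-envelope check of the Las Vegas figures quoted in the text.
HOURS_PER_YEAR = 8760

annual_consumption_mwh = 23_003_806           # 2006 consumption, MWh
avg_demand_mw = annual_consumption_mwh / HOURS_PER_YEAR
print(f"Average demand: {avg_demand_mw:,.0f} MW")          # ~2,626 MW

solar_nameplate_mw = 47                       # proposed solar plant
print(f"Solar share of demand: {solar_nameplate_mw / avg_demand_mw:.1%}")

avg_us_nuclear_mw = 864                       # average US nuclear plant
print(f"Nuclear plants needed: {avg_demand_mw / avg_us_nuclear_mw:.1f}")  # ~3
```

The solar share comes out near 1.8%, close to the slide's 1.7% figure, which presumably rounds slightly differently.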
From “The Week”, Feb. 20, 2009. Denmark has more wind power generators than any other country in Europe, yet it has not been able to shut down a single one of its coal-fired power plants. Denmark is planning to convert its power plants from coal to biomass (wood and grass) or a mix of coal and biomass fuels. This will reduce CO 2 emissions, but not significantly in terms of Danish or global emission levels, and will in no way “turn electricity into an entirely clean product”. http://www.climatebiz.com/news/2009/02/12/vattenfall-convert-danish-coal-fired-plants-biomass
Newsmax, Nov. 2007. $300 million to build a 1.5 square mile solar panel farm near a city of 70,000 that will deliver electricity at 42 cents per kilowatt-hour, compared with wind at 11 cents and nuclear at 6.4 cents per kilowatt-hour. With our economy in shambles, the Obama Administration and Congress plan to invest heavily and expensively in solar and wind power when nuclear power, which would require a similar level of initial investment, would produce safe, greenhouse gas free, reliable, cheap energy at a production level with which wind and solar can't even begin to compare. Energy density, the amount of energy produced per kilogram of fuel (or solar panel or windmill), and energy efficiency should be the guiding principles when investing in future energy sources.
Now's the Time. Patrick Moore, Chair and Chief Scientist, Greenspirit Strategies Ltd. When I helped found Greenpeace in 1971, my colleagues and I were firmly opposed to nuclear energy. But times have changed. Nuclear energy is the only non-greenhouse gas-emitting power source that can effectively replace fossil fuels and satisfy growing demand. Nuclear energy is affordable. The average cost of producing nuclear energy in the United States is less than 2 cents per kilowatt-hour, comparable to coal and hydroelectric. Nuclear energy is safe. In 1979, a partial reactor core meltdown at Three Mile Island frightened the country. No one noticed that Three Mile Island was a success; the containment structure prevented radiation from escaping and there was no injury among the public or workers. Spent nuclear fuel is not waste. Recycling spent fuel, which still contains 95 percent of its original energy, will greatly reduce the need for treatment and disposal. Nuclear power plants are not vulnerable to terrorist attack. The 5-ft.-thick reinforced concrete containment vessel protects contents from the outside as well as the inside. Nuclear weapons are no longer inextricably linked to power plants. Centrifuge technology now allows nations to produce weapons-grade uranium without a reactor. Iran's nuclear weapons threat, for instance, is distinct from peaceful nuclear energy. Nuclear reactors offer a practical path to the hydrogen economy. Excess heat from the plants, instead of fossil fuels, can be used for electrolysis. It also can address the increasing shortage of fresh water through desalinization. Together with a combination of solar, wind, geothermal and hydroelectric sources, nuclear energy can play a key role in producing safe, clean, reliable baseload electricity. http://www.icelandreview.com/features/politics%5Fand%5Fbusiness/?ew_news_onlyarea=&ew_news_onlyposition=0&cat_id=21123&ew_0_a_id=244252
The dangers of nuclear energy are grossly exaggerated: it's safe, robust, emits no CO 2 or soot, the spent fuel can be recycled, and the leftover waste can be safely stored.
It's time for the public to educate themselves about the advances in nuclear technology and the safe, robust energy supply that nuclear power offers to both developed and developing nations. A single 1000 megawatt nuclear power plant annually produces the same amount of energy as burning 3,500,000 tons (tons, not pounds) of coal. The waste can be handled safely, contrary to the hysterical warnings from the largely uninformed or knee-jerk “no nukes” crowd. And nuclear plants emit no greenhouse gases. New reactor designs are meltdown-proof (“Generation IV” reactors: see link below and following slides). South Africa, with little coal and oil resources, has contracted a German company to build two of these new state-of-the-art nuclear power plants. Booming nations like China are building nuclear reactors as fast as they can. Only the US and Europe (excluding France) are dragging their feet (or backtracking) on developing nuclear power in a sensible and serious manner. Our descendants will look back at us in amazement and wonder why we so ignorantly passed up the obvious solution that was right in front of us.
The Next Atomic Age America’s aging nuclear power plants will have to be replaced, but with what? Leaning over the rail of the metal catwalk, I peer down through 16 ft. of crystal-clear water at the cool, blue glow coming from the shapes at the bottom: partially spent uranium fuel rods. "Blue," says Joel Duling, my guide to America's most sophisticated nuclear test reactor, "not green like on The Simpsons." The narrow canal snakes under the catwalk and makes a dogleg through an opening in the wall into the reactor area, a cavernous room that feels like a jet hangar. The top of the Advanced Test Reactor (ATR) pokes unobtrusively above the concrete floor. Most of the 35-ft.-high steel cylinder housing the reactor core lies underground. The chain reaction occurring there produces 250 megawatts--enough to power 201,000 homes. But, the ATR does something more important than generate energy. The machine tests fuels and alloys against the extreme conditions expected in exotic new reactors--radical designs that could produce power in molten salt (see Hyperion design), snap together like LEGOs and operate without water, safely and affordably fulfilling the decades-old dream of clean, abundant nuclear power. http://www.popularmechanics.com/science/research/3760347.html
How it works: Generation II and III Reactors. All 103 nuclear power plants now operating in the United States employ light-water reactors, which use ordinary water as both a moderator and a coolant. The next wave of nuclear plants has taken these Generation II concepts to the next level, improving both safety and efficiency. Utilities plan to begin building Generation III reactors by the end of the decade. In a Gen II Pressurized Water Reactor, water circulates through the core, where it is heated by the fuel's chain reaction. The hot water is then piped to a steam generator, and the steam spins a turbine that produces electricity. The Gen III Evolutionary Pressurized Reactor improves upon this design primarily by enhancing safety features. Two separate 51-in.-thick concrete walls, the inner one lined with metal, are each strong enough to withstand the impact of a heavy commercial airplane. The reactor vessel sits on a 20-ft. slab of concrete with a leaktight “core catcher,” where the molten core would collect and cool in the event of a meltdown. There are also four safeguard buildings with independent pressurizers and steam generators, each capable of providing emergency cooling of the reactor core.
POWER BALL. Uranium in graphite “pebbles” (above). The pebbles are fireproof and almost impossible to use for weapons production. The spent fuel is easy to transport and store, though there still remains the long-term problem of where to store it. And the design of the nuclear reactor is inherently meltdown-proof. If the fuel gets too hot, it begins absorbing neutrons, shutting down the chain reaction. In 2004, the cooling gas and secondary safety controls were shut off at an experimental pebble-bed reactor in China--and no calamity followed, says MIT professor Andrew Kadak, who witnessed the test. How it works: Generation IV Reactors. Fourth-generation nuclear power plants differ radically from current reactors by replacing water coolants and moderators, reaching higher temperatures, and gaining the potential to create hydrogen, as well as electricity. One of the six Gen IV designs under consideration is the meltdown-proof pebble-bed reactor, which uses grains of uranium encased in balls of graphite as fuel. Helium gas is heated as it circulates through a vessel of these pebbles and then powers a turbine to generate electricity. A heat exchanger can transfer heat from the helium to adjacent facilities for the production of hydrogen. The plant relies on “passive safety”: if the cooling system fails, the nuclear reaction grinds to a halt on its own.
China. China has licensed the German technology and is actively developing a pebble bed reactor for power generation. The 10 megawatt prototype is called the HTR-10. It is a conventional helium-cooled, helium-turbine design. The program is at Tsinghua University in Beijing. The first 200 megawatt production plant is planned for 2007. There are firm plans for thirty such plants by 2020 (6 gigawatts). By 2050, China plans to deploy as much as 300 gigawatts of reactors, of which PBMRs will be a major component. If PBMRs are successful, there may be a substantial number of reactors deployed; this may be the largest planned nuclear power deployment in history. Tsinghua's program for Nuclear and New Energy technology also plans in 2006 to begin developing a system to use the high temperature gas of a pebble bed reactor to crack steam to produce hydrogen. The hydrogen could serve as fuel for hydrogen vehicles, reducing China's dependence on imported oil. Hydrogen can also be stored, and distribution by pipelines may be more efficient than conventional power lines.
104 nuclear power plants currently provide electricity for 20% of the nation. One of these reactors (1,000 Megawatt) provides electricity for about one million homes. This is great, but good things can come in small packages, too. For instance, Hyperion Power Generation, Inc. is looking to commercialize small nuclear reactors for remote locations as soon as 2013. The reactors, developed at the Los Alamos National Laboratory, one of the nation's leading nuclear laboratories, are about the size of a hot tub and are buried underground. According to the company offering the reactor, it is impossible for them to melt down or for the nuclear material to be diverted for weapons purposes. Furthermore, the amount of nuclear waste one of these reactors produces after about 5 years is about the size of a softball and could be recycled and used again. “Hyperion's plants, smaller than a garden shed and able to power 20,000 homes, will be on sale within five years,” say scientists at Los Alamos. And they're not the only small nuclear reactor game in town, either. Toshiba has been working on a 20-foot by 6-foot reactor called the 4S that would produce electricity at about half the price of regular grid electricity. They could become commercially viable in Japan very soon, and Toshiba hopes to expand to Europe and the U.S. within the next few years. Indeed, the town of Galena, Alaska has passed a resolution calling for the deployment of a 4S to bring affordable power to their remote location. A company called NuScale Power, Inc. is also commercializing a modular, 45 MW nuclear power plant. And there are others as well. http://www.hyperionpowergeneration.com/
An article in the Reno Gazette Journal from a few years ago.
Core debate. By Bruce Van Dyke. This article was published on 10.02.08 in the Reno News and Review. Over the years, this column has expressed opinions that could be labeled as “anti-nuke.” But my beef has never been with nuclear energy per se, but with the ferocious waste it creates. In her book Power to Save the World: The Truth About Nuclear Energy, Gwyneth Cravens presents updated information concerning nuclear waste, info with real potential to be a deal-changer. She describes a new kind of reactor, the integral fast reactor, or I.F.R. The one functional I.F.R. in existence here in the United States (now without funding) is at the Idaho National Laboratory near the town of Arco, and it's obvious there have been some serious improvements made in the atom-smashing game. “The I.F.R. efficiently recycles fuel, in the process reducing long-lived reactor waste by 99 percent, and is inherently meltdown-proof. Fast reactors could create enough fuel to power the entire country for more than 500 years while rendering weapons-grade nuclear material into fuel and rejuvenating spent fuel onsite.” Folks, we ain't talkin' about Chernobyl here (and it's absurd to use that disaster as a reason to be against nukes. Chernobyl was to modern nuclear technology what a '74 Pinto is to the Indy 500). The I.F.R., capable of re-burning and re-burning spent fuel, thereby greatly reducing the final amount of nuclear waste that ultimately must be disposed of, represents major progress. But, in the end, you still have some radioactive trash to deal with. With the I.F.R., you've done something one would normally think impossible. You've made plutonium palatable. Cravens: “You burn up the plutonium and make waste that has only about a 400 year half-life, as opposed to waste that has to be isolated for 10,000 years.” If the waste can be processed so it's no more dangerous than raw uranium in 400 years, it would now seem to be much more in our technological comfort zone.
When the assignment is safe storage for 10,000 years, well, that’s a little dicey, to put it mildly. Four hundred years, though, sounds downright doable. Would we want to do it at Yucca Mountain? Probably not. Us Nevadans have just become so bitchy about this whole nuke waste thing. We probably can’t even be bought off at this point, although a billion bucks a year from the feds might smell pretty good right about now to our state treasurer. There’s an alternative. A nuclear waste facility that’s been working efficiently, safely, and smoothly near Carlsbad, N.M., called the Waste Isolation Pilot Plant. WIPP was originally built to store only the nuke waste from military sites. One of WIPP’s designers says, “From just a technical point of view, the best place on dry land to store all nuclear waste—wherever it comes from—is at WIPP. Geologically and hydrologically, it’s the safest.”
What the International Atomic Energy Agency (IAEA) has to say: http://www.iaea.org/Publications/Booklets/Development/devnine.html NUCLEAR POWER ADVANTAGES: Limited Environmental Impacts. Energy density comparisons (fuel requirements): The quantity of fuel used to produce a given amount of energy - the energy density - determines in large measure the magnitude of environmental impacts, as it influences the fuel extraction activities, transport requirements, and the quantities of environmental releases and waste. The extraordinarily high energy density of nuclear fuel relative to fossil fuels is an advantageous physical characteristic. One kilogram (kg) of firewood can generate 1 kilowatt-hour (kW·h) of electricity. The values for the other solid fossil fuels and for nuclear power are:
1 kg coal: 3 kW·h
1 kg oil: 4 kW·h
1 kg uranium: 50,000 kW·h (3,500,000 kW·h with reprocessing!)
A 1000 MegaWatt (MW) plant requires the following amount of fuel annually (t = metric ton = 2200 lb):
2,600,000 t coal: 2000 train cars (1300 t each)
2,000,000 t oil: 10 supertankers
30 t uranium: reactor core (10 cubic meters, or about 10 cubic yards)
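The coal and oil tonnages follow directly from the energy densities quoted above. A rough check, assuming a ~90% capacity factor for the 1000 MW plant (my assumption; the booklet does not state one). The 30 t uranium figure depends on reactor-specific burnup, so it is not re-derived here.

```python
# Annual fuel tonnage for a 1000 MW plant, from the quoted energy
# densities, assuming a ~90% capacity factor (an assumption).
CAPACITY_FACTOR = 0.90
annual_kwh = 1000 * 1000 * 8760 * CAPACITY_FACTOR  # MW -> kW, times hours

coal_tonnes = annual_kwh / 3 / 1000   # 3 kWh per kg of coal
oil_tonnes = annual_kwh / 4 / 1000    # 4 kWh per kg of oil
uranium_vs_coal = 50_000 / 3          # energy-density ratio

print(f"Coal: ~{coal_tonnes:,.0f} t/yr")   # ~2.6 million tonnes
print(f"Oil:  ~{oil_tonnes:,.0f} t/yr")    # ~2.0 million tonnes
print(f"Uranium packs ~{uranium_vs_coal:,.0f}x more energy per kg than coal")
```

Both results land within a few percent of the IAEA's 2,600,000 t and 2,000,000 t figures.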
The energy density of fossil and of nuclear fuel allows relatively small power plant areas of some several square kilometers (km²). The low energy density of renewables, measured by land requirements per unit of energy produced, is demonstrated by the large land areas required for a 1000 MegaWatt (MW) system, with values determined by local requirements and climate conditions (solar and wind availability, operation, and maintenance factors range from 20 to 40%, i.e. a solar or wind power plant rated at 1000 MW actually delivers 200-400 MW):
Fossil and nuclear sites: 1–4 km²
Solar thermal or photovoltaic (PV) parks: 20–50 km² (a small city)
Wind fields: 50–150 km² (a medium-sized city)
Biomass plantations: 4000–6000 km² (a province)
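One way to read the land-area list above is as delivered power per square kilometer. The sketch below uses mid-range areas and illustrative availability factors (30% for solar and wind, per the 20-40% range quoted above; 90% for fossil/nuclear and biomass combustion is my assumption), so the exact numbers are indicative only.

```python
# Delivered power per km2 implied by the IAEA land-area ranges above.
# name: (nameplate MW, mid-range area km2, assumed availability factor)
sites = {
    "fossil/nuclear site": (1000, 2.5, 0.90),
    "solar PV park":       (1000, 35.0, 0.30),
    "wind field":          (1000, 100.0, 0.30),
    "biomass plantation":  (1000, 5000.0, 0.90),
}
for name, (mw, km2, avail) in sites.items():
    print(f"{name:20s} ~{mw * avail / km2:7.2f} MW delivered per km2")
```

The roughly two orders of magnitude between the nuclear and wind footprints is the point the land-area list is making.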
Global energy demand is projected to increase by around 40% by 2030, especially in Asia. Solar and wind power, even with heavy investment and advances in technology, can at best provide a minor portion of this. Their energy density is simply too low compared with coal, gas and oil, but especially with nuclear power.
Read what else the report has to say at, http://www.iaea.org/Publications/Booklets/Development/devnine.html
Great advances have been made, and continue to be made, in the field of nuclear fuel technology and reprocessing, resulting in significant reductions in the amount of currently non-reusable nuclear waste that must be stored until methods are developed for dealing with it, e.g. http://www.unr.edu/nevadanews/templates/details.aspx?articleid=4699&zoneid=40 . Unfortunately, misinformation and misconceptions concerning nuclear waste are prevalent. Here's what the IAEA says about plutonium. A misconception: Although ongoing exposure to fossil fuel related toxic pollutants through polluted air and contaminated water and food is a daily experience, there is a widely held public belief that nuclear power presents the greater health risk. Extreme concerns about radiation are demonstrated by a common conviction that plutonium - in spent fuel and from reprocessing - can be significantly more harmful than toxic pollutants, with some people believing it is the most hazardous substance on earth. Plutonium is not very radioactive - as a long-lived material with a half-life of more than 24,000 years, it decays very slowly. Its radiation cannot penetrate even a sheet of paper. As it is not highly soluble in most forms, it is not very hazardous when small quantities are ingested in liquids, where the major portion passes unabsorbed through the body. In fact, plutonium can be extremely hazardous to health only when finely dispersed in sufficient concentration and inhaled, when - as with very small particles of inhaled toxic pollutants - it passes through the lung tissue into the blood. Fortunately, a scenario to disperse sufficient amounts of plutonium, which is transported in strong structural containers, into the atmosphere to cause significant health effects in populations would be extremely difficult.
By contrast, many of today's energy related toxic pollutants, including easily inhaled particulates that are the main mortality factor due to fossil fuels, have high potential health effects. http://www.iaea.org/Publications/Booklets/Development/devsix.html
Radiation Health Effects: Facts versus Public Perception. There is usually a knee-jerk reaction from the mostly uninformed public over the “dangers” of nuclear energy, as opposed to its benefits. Many environmentalists and “anti-nuke” activists, often uninformed themselves, take advantage of this ignorance and perpetuate the fear and hysteria surrounding radiation and its effects on humans. The Radiation Effects Research Foundation (a joint Japan-US scientific organization), the World Health Organization (WHO), and other groups have gathered extensive data on human exposure incidents, especially Hiroshima, Nagasaki, and Chernobyl. The analysis of these worst case incidents has shown that radiation from nuclear power poses little risk to humans, in fact, far less risk than hydrocarbon fuels and biomass burning. There are naturally occurring radioactive elements in coal which are released upon burning; the population effective dose equivalent from coal plants is 100 times that from nuclear plants. Many wild claims are made about the extreme dangers of nuclear power and radiation, but these claims are not supported by the data. To paraphrase an earlier statement, never let the facts get in the way of a good scary story. France gets about 75% of its electricity, and Japan about 30% (41% planned by 2014), from nuclear power.
Radiation Health Effects: Facts versus Public Perception. World War II saw 100 million people die, essentially over the access to and control of oil, coal, and other resources. Since then, according to the WHO, several million have died every year, at an accelerating rate, from pollution generated by hydrocarbon fuels - not due to carbon dioxide, but from soot and other chemicals which cause lung and heart disease. The cumulative deaths from radiation since the inception of nuclear energy and weapons are truly negligible in comparison, approximately 50,000 or less over 60 years. Contrary to what is often reported on supposedly legitimate TV science and history programs, the victims at Hiroshima and Nagasaki were not “vaporized in an instant”. Most of the approximately 105,000 deaths in Hiroshima and Nagasaki were from blunt force trauma from the blast and from burns caused by heat, mostly from the fire storms ignited by the bombs, not from nuclear radiation, which accounted for about 15% of fatalities. So, a few million deaths per year are caused by hydrocarbon pollution, versus 800 or so per year from radiation, and even this number is overstated because it largely stems from the two bomb events. Contrary to popular myths and public perception, there were very low residual effects from these radiation exposure events. Life on Earth has always been bombarded by energetic radiation from space, and one of the wonders of DNA is its ability to repair itself. If it didn't have this quality, you probably wouldn't be reading this.
Radiation Health Effects: Facts versus Public Perception If you dig into the websites at the end of this section, you will find out that statistics were gathered on about 50,000 individuals who were irradiated at Hiroshima and Nagasaki in 1945. Of those individuals, about 4700 died of cancer, and out of those cancer deaths, when compared with the average cancer death rates in cities not irradiated, it was found that radiation was responsible for about 1-2% of those cancer deaths among men, and 2-3% among women. The vast majority of cancer deaths resulted from naturally occurring cancers, industrial pollution, and life style (smoking, drinking, etc.). Japan was heavily polluted due to post-WWII reconstruction and industrialization, and it is difficult to separate cancer deaths from radiation when there are so many other more potent sources. This is also the case with genetic damage and birth defects which are also known to be linked to pollution. By all rational assessments, radiation appears to have been a very minor contributor to cancer deaths and birth defects when compared with pollution and the natural occurrence rates, and this from a worst case scenario, the use of a nuclear bomb.
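Translating the percentages above into counts makes the scale concrete. This sketch uses only the figures quoted in the text (~4,700 cancer deaths in the studied cohort, of which 1-3% are attributed to radiation):

```python
# Radiation-attributable cancer deaths implied by the figures above.
cancer_deaths = 4700              # cancer deaths in the ~50,000-person cohort
attributable_range = (0.01, 0.03) # 1-3% attributed to radiation, per the text

low, high = (cancer_deaths * f for f in attributable_range)
print(f"Radiation-attributable cancer deaths: ~{low:.0f} to ~{high:.0f}")
```

That is, on the order of 50 to 140 deaths out of roughly 50,000 irradiated survivors, which is the comparison the slide is drawing.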
Radiation Health Effects: Facts versus Public Perception. Nuclear power plant accidents release far less radiation than nuclear weapons, and as previously noted, coal-fired power plants release far more radiation than nuclear power plants. Japanese bomb victims were exposed to between 0.2 and a few “sieverts” (Sv). The yearly average natural background dose is about 0.0024 Sv, and doses up to 0.01 Sv over a few hours are considered to be completely safe (no detectable effects or changes), though the nuclear power industry sets much lower limits for workers. The US Nuclear Regulatory Commission has set limits of 0.005 Sv per year (about 2x the natural background) for declared pregnant workers, who are considered most susceptible to radiation effects. Some claim that radiation is unsafe at any dose, but this ignores the fact that we are constantly exposed to natural sources. Natural background radiation comes from two primary sources: cosmic radiation from space (stars), and terrestrial sources. An average cubic yard of soil contains about 5 grams of uranium (about the weight of a nickel), while organic and inorganic material contains two naturally occurring radioactive isotopes, carbon-14 and potassium-40. Older coal-fired power plants without effective fly ash capture are one of the largest sources of human-caused background radiation exposure.
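The dose figures above are easier to compare on one scale. A small sketch, using only the values quoted in the text (all in sieverts):

```python
# Comparing the dose figures quoted in the text (all values in Sv).
natural_background_sv_yr = 0.0024  # average annual background dose
safe_short_term_sv = 0.01          # considered safe over a few hours
nrc_pregnant_limit_sv_yr = 0.005   # NRC annual limit, pregnant workers

print(f"Safe short-term dose: ~{safe_short_term_sv / natural_background_sv_yr:.1f}x "
      "annual background")
print(f"NRC pregnant-worker limit: ~{nrc_pregnant_limit_sv_yr / natural_background_sv_yr:.1f}x "
      "annual background")
```

The second ratio comes out near 2.1, matching the text's “about 2x the natural background”.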
Radiation Health Effects: Facts versus Public Perception. Nuclear reactors are incapable of exploding like nuclear bombs. There have been two major reactor incidents in the history of civil nuclear power, Three Mile Island and Chernobyl. One was contained without harm to anyone, and the other involved intense fire without containment (all US and Western reactors have an outer containment vessel that surrounds the main reactor, unlike the antiquated Soviet design used at Chernobyl). These are the only major incidents to have occurred in more than 12,700 cumulative reactor-years of commercial operation in 32 countries. With respect to Chernobyl, the WHO assessed cancer deaths at much lower rates than those in Japan from the nuclear bombs. For the roughly 600,000 people exposed to the highest doses by the Chernobyl event, about 4000 are expected to die of cancer (less than 1%). The additional 6.8 million people exposed received, on average, about 0.007 Sievert, about the yearly allowable dose for nuclear power industry workers that is considered totally safe. Globally, cancers account for about 13% of all deaths, varying considerably by country, degree of economic development, life style, health care system, and other factors.
Radiation Health Effects: Facts versus Public Perception. There has been no measurable increase in cancers from the Three Mile Island event, the lone U.S. nuclear accident resulting in a release of radiation in the more than 50-year history of the nuclear power industry. While some scientists and health officials claim that a tiny increase in cancers downwind of TMI was detected over the following decades, such an increase is extremely difficult to separate from the majority of cancers caused by local air, ground, and water pollution, and other causes. Around 4500 coal miners die every year in mining accidents (about 12 a day in China alone), and this doesn't include deaths from respiratory and heart disease caused by “black lung disease”, i.e. breathing coal dust. According to the WHO, over 4 million people die each year from air pollution (separate from ground and water pollution), mostly from fossil fuel and industrial emissions, a number projected to increase if ethanol fuel substitutes are used (ethanol combustion results in ozone formation at the ground which is more toxic than current auto emissions). About 40 percent of deaths worldwide are caused either directly or indirectly by water, air and soil pollution, concludes a Cornell researcher. http://www.sciencedaily.com/releases/2007/08/070813162438.htm
Radiation Health Effects: Facts versus Public Perception. While the statistics of death are not pleasant to consider, they are part of the science of risk management and analysis that is taken into account when weighing the costs versus benefits of things like energy production, food production, pharmaceuticals, and many other issues important to society. Discussions of future or alternative energy sources will require the assessment of all costs and benefits, including pollution generated during the manufacture, construction, operation, and disposal of such sources. Data and knowledge always trump myths, rumors, and opinions. While the present era can be referred to as the “Age of Information”, it is also the age of misinformation, with the internet awash in personal opinion, conspiracy theories, and simply erroneous “information”. However, if you search in depth, you can find legitimate authoritative sources with links to other sites. Below are just a few that address radiation effects.
http://www.rerf.or.jp/index_e.html (Hiroshima, Nagasaki)
http://www.nature.com/nature/journal/v440/n7087/full/440982a.html (Chernobyl)
http://www.worldwideschool.org/library/books/hst/northamerican/TheAtomicBombingsofHiroshimaandNagasaki/chap11.html (Hiroshima, Nagasaki)
Climate Change Websites
This PowerPoint is data intensive. A good introduction for those who have been frightened by An Inconvenient Truth is "The Great Global Warming Swindle" produced by Channel 4 in the UK, the science-based rebuttal to Al Gore's non-scientific propaganda film, as ruled by a judge in Britain: http://scottthong.wordpress.com/2007/10/10/official-british-court-finds-11-inaccuracies-in-al-gores-an-inconvenient-truth-labels-it-as-political-propaganda/
"The Great Global Warming Swindle" has extra features that go to the heart of the real science behind climate change and much of the nonsense behind the IPCC's alarmist reports and Al Gore's utter distortion of the known science. The video may be downloaded and viewed at http://motls.blogspot.com/2007/03/great-global-warming-swindle.html
If you are interested in politics and policy, read "Meltdown: The Predictable Distortion of Global Warming by Scientists, Politicians, and the Media" by Patrick Michaels.
Websites:
http://www.worldclimatereport.com/ (anti-alarmist)
http://www.co2science.org/ (anti-alarmist)
http://www.realclimate.org/ (alarmist)
http://www.ipcc.ch/ (alarmist)
http://www.junkscience.com/ (anti-alarmist)
http://www.canada.com/nationalpost/news/story.html?id=c6a32614-f906-4597-993d-f181196a6d71 (anti-alarmist)
http://climatesci.colorado.edu/ (anti-alarmist) See Categories and Archives
http://www.grist.org/news/maindish/2006/12/06/ADM/index.html (the blessings of biofuel?)