Wednesday, September 29, 2010

A paper published today in the Journal of Climate finds that in "one of the most variable [disrupted?] climates of the world" - the Northeastern US - "Although maximum temperature indices showed strong warming trends over the period 1893 – 1950, subsequent trends show little change and cooling."

Climate System Research Center, Department of Geosciences, University of Massachusetts, Amherst, Massachusetts

Abstract: The northeastern U.S. is one of the most variable climates in the world, and how climate extremes are changing is critical to populations, industries and the environment in this region. A long-term (1870 – 2005) temperature and precipitation dataset was compiled for the northeast U.S. to assess how the climate has changed. Adjustments were made to daily temperatures to account for changes in mean, variance and skewness resulting from inhomogeneities, but precipitation data were not adjusted. Trends in 17 temperature and 10 precipitation indices at 40 stations were evaluated over three time periods: 1893 – 2005, 1893 – 1950 and 1951 – 2005, and over 1870 – 2005 for a subset of longer term stations. Temperature indices indicate strong warming with increases in the frequency of warm events (e.g. warm nights and warm summer days) and decreases in the frequency of cold events (e.g. ice days, frost days and the cold spell duration indicator). The strongest warming is exhibited in the decrease in frost days and the increase in growing season length. Although maximum temperature indices showed strong warming trends over the period 1893 – 1950, subsequent trends show little change and cooling. Few significant trends were present in the precipitation indices; however, they displayed a tendency toward wetter conditions. A stepwise multiple linear regression analysis indicated some of the variability in the 27 indices from 1951 – 2002 was explained by the [naturally occurring] North Atlantic Oscillation, Pacific Decadal Oscillation and Pacific North American pattern. However, teleconnection patterns showed little influence on the 27 indices over a 103-year period.
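The stepwise multiple linear regression the abstract mentions can be sketched with a simple greedy forward-selection routine. The series below are synthetic stand-ins (the actual NAO/PDO/PNA index values and climate indices from the paper are not reproduced here), so this only illustrates the technique, not the paper's result:

```python
import numpy as np

def forward_stepwise(X, y, names, r2_gain=0.02):
    """Greedy forward selection: repeatedly add the predictor that most
    improves R^2, stopping when no candidate adds at least `r2_gain`."""
    selected, remaining, best_r2 = [], list(range(X.shape[1])), 0.0
    while remaining:
        scores = []
        for j in remaining:
            cols = selected + [j]
            A = np.column_stack([np.ones(len(y)), X[:, cols]])
            coef, *_ = np.linalg.lstsq(A, y, rcond=None)
            resid = y - A @ coef
            scores.append((1 - resid.var() / y.var(), j))
        r2, j = max(scores)
        if r2 - best_r2 < r2_gain:
            break
        best_r2, selected = r2, selected + [j]
        remaining.remove(j)
    return [names[j] for j in selected], best_r2

# Synthetic example: one annual value per year, 1951-2002, with a
# climate "index" driven mostly by an NAO-like series
rng = np.random.default_rng(0)
n = 52
nao, pdo, pna = rng.standard_normal((3, n))
index = 0.8 * nao + 0.3 * pdo + 0.5 * rng.standard_normal(n)
chosen, r2 = forward_stepwise(np.column_stack([nao, pdo, pna]),
                              index, ["NAO", "PDO", "PNA"])
print(chosen, round(r2, 2))
```

With these made-up coefficients the NAO-like series dominates, so forward selection picks it first, matching the abstract's finding that part of the variability is teleconnection-driven.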

More evidence of natural variability/disruption also published today in the Journal of Climate:

Abstract: Caribbean basin tropical cyclone activity shows significant variability on inter-annual as well as multi-decadal timescales. Comprehensive statistics for Caribbean hurricane activity are tabulated, and then large-scale climate features are examined for their impacts on this activity. The primary inter-annual driver of variability is found to be El Niño–Southern Oscillation, which alters levels of activity due to changes in levels of vertical wind shear as well as through column stability. Much more activity occurs in the Caribbean with La Niña conditions than with El Niño conditions. On the multi-decadal timescale, the Atlantic Multi-decadal Oscillation is shown to play a significant role in Caribbean hurricane activity, likely linked to its close relationship with multi-decadal alterations in the size of the Atlantic Warm Pool and the phase of the Atlantic Meridional Mode. When El Niño – Southern Oscillation and the Atlantic Multi-decadal Oscillation are examined in combination, even stronger relationships are found due to a combination of either favorable or unfavorable dynamic and thermodynamic factors. For example, 29 hurricanes tracked into the Caribbean in the ten strongest La Niña years in a positive Atlantic Multi-decadal Oscillation period compared with only two hurricanes tracking through the Caribbean in the ten strongest El Niño years in a negative Atlantic Multi-decadal Oscillation period.

Tuesday, September 28, 2010

From the "you can only get grant funding & publicity if you link it to climate change" department comes a press release on a new study noting the 400 million year old "living fossil" horseshoe crab is endangered due to climate change, such as sea levels currently rising 1.85 mm/yr [compared to the average 7 mm/yr since the last ice age] and water temperature increase [actually the oceans have been cooling since 2003]. Oh, and by the way, over-harvesting, destroyed habitats, use as fishing bait and in the pharmaceutical industry might play a minor role.

But first, a couple graphs for a historical perspective on sea temperatures and sea levels covering just a small fraction of the 400 million years horseshoe crabs have been around:

The sea dagger tail [a.k.a. horseshoe crab], which has existed for more than 400 million years, is threatened. Researchers from Gothenburg University can now show how vulnerable its populations are to climate change. The results of the study were recently published in the scientific journal Molecular Ecology. Dagger tails are often regarded as living fossils, having survived almost unchanged in body design and lifestyle for more than 400 million years. Animals similar to today's dagger tails were on earth long before the dinosaurs.

- By examining the genetic variation within populations of dagger tails along the U.S. east coast, we have been able to track changes in population size through time. We noticed a marked decline in the number of dagger tails during the final stage of the ice age, a period characterized by significant global warming, says researcher Matthias Obst of Zoology, Göteborg University, co-author of the study published in the journal Molecular Ecology.

- Our results show that the current, already severely shrunken stocks will likely continue to decline under future climate change.

All four species of dagger tails are endangered due to over-harvesting and the animals' use as fishing bait and in the pharmaceutical industry. Destruction of habitats around the beaches that serve as the animals' spawning grounds also contributes to the decline. Scientists now predict that dagger tails will be further reduced in number by future climate change. "The most decisive factor is the impending changes in sea level and water temperature. These environmental changes are likely to affect the animals' distribution and reproduction very negatively," says Matthias Obst.

Monday, September 27, 2010

A new "in science we trust" poll is out surveying the online readership of 'Scientific' American and Nature with some eye-opening results. The purpose was to find out whether the "public" still trusts scientists on controversial issues in the wake of climategate and other recent scientific scandals. Even amongst the "highly educated" online readership, many of whom are scientists themselves, the poll results show that trust in scientists to provide accurate information on climate change barely cleared the 'trust' mark and fell far short of 'highly trust.'

The 'Scientific' American article mentions the biased nature of the sample only once, yet everywhere else equates the results with those of the general public, in violation of statistics 101. The hate speech of the D-word is also utilized:

Numerous polls show a decline in the percentage of Americans who believe humans affect climate, but our survey suggests the nation is not among the worst deniers. (Those are France, Japan and Australia.)

Multiple polls worldwide now show only 30-40% of the public at large believe in human-induced climate change. Thus, 'Scientific' American effectively manages to insult many of its international online readers as 'deniers' and insults the majority of the international public at large as 'deniers.' Why don't they also label anyone who doubts the science on any of the other topics above as 'deniers?'

Dr. Judith Curry, IPCC scientist and Chair of the School of Earth and Atmospheric Sciences at the Georgia Institute of Technology, has a highly recommended post at her new blog today, No consensus on consensus. Selected quotes are below:

I think the IPCC consensus approach was valuable in the 1990’s and arguably through the third IPCC Assessment Report, in terms of sorting through and assessing the large amount of scientific research on the topic. The Nobel Peace Prize notwithstanding, questions regarding bias and the corruptibility of the IPCC’s consensus-based assessment process are of substantial concern.

while the IPCC consensus approach has been effective in communicating climate science to policy makers, it has marginalized dissenting voices.

...a complete characterization of uncertainty is more important than consensus. While I understand the policy makers’ desire for a clear message from the scientists, at this point the consensus approach being used by the IPCC doesn’t seem to be up to the challenge of an accurate portrayal of the complexities of the problem and the uncertainties.

...a more realistic portrayal of confidence levels would go a long way towards reducing the “noise” and animosity portrayed in the media that fuels the public distrust of climate science and acts to stymie the policy process.

The legal memo framework requires providing evidence for and against, and a full characterization of the uncertainties. Subsequent arguments based on the legal brief model would then stake out specific positions on policy options and their justification (based upon science, economics, politics, values, etc.), with extensive follow-on cross examination of all briefs that are presented. This framework would broaden the assessment scope and allow for a range of perspectives, providing a better informational basis for decision making on this complex issue.

The formal process of achieving consensus requires serious treatment of the considered opinion of each group member, noting that discussion of opposing views enhances the value of the ultimate consensus.

Sunday, September 26, 2010

A recommended essay by Swedish climatologist Dr. Hans Jelbring offers a high school through advanced level debunking of the so-called 'greenhouse effect.' Dr. Jelbring finds that basic scientific principles demonstrate that global temperatures are not controlled by human emissions of 'greenhouse gases' and the 'greenhouse effect' is explainable using only the physics of pressure, gravity, volume, and the adiabatic lapse rate.

Essay from source ilovemycarbondioxide.com has been edited to remove references to Swedish law and politics. Please visit the source for more recommended essays.

What has politics, a needed instrument to run a nation, to do with a scientific concept that tells the difference between the surface temperature of earth and the temperature of earth’s atmosphere as seen from space? This temperature difference of 33C has unfortunately and inadequately been named “The Greenhouse Effect” (GE) despite the absence of any relationship between this effect and the warm climate in a real greenhouse. The intention of this paper is to cover the title subject in a few pages in a way that is understandable to a high school student and, hence, to Swedish parliamentary members. Basic scientific principles demonstrate that the overall GE phenomenon is not a result of human emissions of “greenhouse gases”.

...The IPCC is biased from the start by its mandate. It only covers the impact on climate caused by man (anthropogenic or AGW) which is reductionism that does not conform to scientific methods. Furthermore, the IPCC has chosen not to investigate those types of local, regional and national global anthropogenic impacts which actually do exist. The IPCC has emphasized the importance of an unverified, simplistic model that predicts a particular surface temperature of earth as being caused by “greenhouse gases”. These are specifically identified as carbon dioxide and methane. Water vapor, the most abundant greenhouse gas, is wrongly assumed to be “a quantifiable feedback” to carbon dioxide, which is 50 times less abundant than water vapour in the atmosphere. Such a model is far too inexact and speculative to describe the complex climate system. This type of logic does not conform to accepted scientific methodology.

Saturday, September 25, 2010

A new paper utilizing "many" records never previously included in reconstructions of past temperatures finds that the Roman & Medieval Warming Period temperatures reached or exceeded those of the latter 20th century. Unfortunately, the paper still resorts to the old Michael Mann tactic of tacking on the instrumental thermometer record at the end for an apples and oranges comparison, but at least they are up-front about it.

Year AD on x axis, temp anomaly on y axis, added red line showing temp at year 2000, red dashed line is the tacked on thermometer record, MWP = Medieval Warming Period, RWP=Roman Warming Period

Abstract: A new temperature reconstruction with decadal resolution, covering the last two millennia, is presented for the extratropical Northern Hemisphere (90–30°N), utilizing many palaeo-temperature proxy records never previously included in any large-scale temperature reconstruction. The amplitude of the reconstructed temperature variability on centennial time-scales exceeds 0.6°C. This reconstruction is the first to show a distinct Roman Warm Period c. ad 1–300, reaching up to the 1961–1990 mean temperature level, followed by the Dark Age Cold Period c. ad 300–800. The Medieval Warm Period is seen c. ad 800–1300 and the Little Ice Age is clearly visible c. ad 1300–1900, followed by a rapid temperature increase in the twentieth century [doesn't look to me to be any more rapid than from 700 to 1000 AD]. The highest average temperatures in the reconstruction are encountered in the mid to late tenth century and the lowest in the late seventeenth century. Decadal mean temperatures seem to have reached or exceeded the 1961–1990 mean temperature level during substantial parts of the Roman Warm Period and the Medieval Warm Period. The temperature of the last two decades, however, is possibly higher than during any previous time in the past two millennia, although this is only seen in the instrumental temperature data and not in the multi-proxy reconstruction itself. Our temperature reconstruction agrees well with the reconstructions by Moberg et al. (2005) and Mann et al. (2008) with regard to the amplitude of the variability as well as the timing of warm and cold periods, except for the period c. ad 300–800, despite significant differences in both data coverage and methodology.

Abstract: The importance of stratospheric ozone depletion on the atmospheric circulation of the troposphere is studied with an atmospheric general circulation model, the Community Atmospheric Model, version 3 (CAM3), for the second half of the 20th century. In particular, the relative importance of ozone depletion is contrasted with that of increased greenhouse gases and accompanying sea surface temperature changes. By specifying ozone and greenhouse gas forcings independently, and performing long, time-slice integrations, it is shown that the impacts of ozone depletion are roughly two to three times larger than those associated with increased greenhouse gases, for the Southern Hemisphere tropospheric summer circulation. The formation of the ozone hole is shown to affect not only the polar tropopause and the latitudinal position of the midlatitude jet: it extends to the entire hemisphere, resulting in a broadening of the Hadley cell and a poleward extension of the subtropical dry zones. The CAM3 results are compared to and found to be in excellent agreement with those of the multi-model means of the recent Coupled Model Intercomparison Project (CMIP3) and Chemistry-Climate Model Validation (CCMVal2) simulations. This study, therefore, yields a direct attribution of most Southern Hemisphere tropospheric circulation changes, in the second half of the 20th century, to stratospheric ozone depletion.

Thursday, September 23, 2010

From a discussion board at NOAA comes this graph showing the developing La Niña of 2010 might be a record breaker. La Niña causes multiple global climate disruptions, including a drop in global temperatures and extreme weather events around the globe, which no doubt will still be conveniently blamed on mankind's evil ways. The last two record-contending La Niñas, in 1954 and 1973, occurred during the global cooling scare:

You are here (at "10+" data point)

Global temperatures drop about 6 months after a drop in the Pacific Decadal Oscillation (PDO), which is similar to but not the same as the El Niño-La Niña Index, as shown in the graph below. Look out for an extra cold winter and some cold water splashed on the claim "2010 is the hottest year ever."
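The claimed ~6-month lag between a PDO drop and a global temperature drop is the kind of relationship a lagged cross-correlation would pick out. Here is a minimal sketch using synthetic monthly series (a white-noise stand-in for the index, not real PDO or temperature data):

```python
import numpy as np

def best_lag(driver, response, max_lag=24):
    """Return the lag (in months) at which `response` correlates most
    strongly with `driver`; a positive lag means the response follows
    the driver."""
    corrs = [np.corrcoef(driver[: len(driver) - k] if k else driver,
                         response[k:])[0, 1]
             for k in range(max_lag + 1)]
    return int(np.argmax(corrs)), corrs

# Synthetic monthly series: "temperature" echoes the index 6 months later
rng = np.random.default_rng(1)
months = 600
pdo = rng.standard_normal(months)       # white-noise stand-in for the PDO index
temp = np.zeros(months)
temp[6:] = pdo[:-6] + 0.1 * rng.standard_normal(months - 6)
lag, corrs = best_lag(pdo, temp)
print("best lag (months):", lag)
```

With the lag built in at 6 months, the correlation peaks there; applied to real PDO and temperature series the peak would indicate the actual lead time.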

Wednesday, September 22, 2010

A peer-reviewed paper published in the Canadian Journal of Earth Sciences finds that western Arctic sea ice extent at the end of the 20th century was more extensive than most of the past 9000 years. The paper also finds that the western Arctic sea ice extent was on a declining trend over the past 9000 years, but recovered beginning sometime over the past 1000 years and has been relatively stable and extensive since. The paper also demonstrates that even though western annual sea ice extent has been less than the present throughout most of the last 9000 years, low sea ice has consistently failed to cause a planetary albedo 'tipping point' claimed by warmists.

Although it seems like a day doesn't go by without an alarmist headline or blog posting obsessing over the daily Arctic sea ice statistics (and never about Antarctic sea ice extent which reached a record high this year), this paleo-climate perspective takes all the wind out of alarmist sails. Satellite assessment of sea ice conditions is only available beginning in 1979 (around the time the global cooling scare ended), with only sparse data available prior to 1979. The alarmists at the NRDC fraudulently claim in a new video that due to "climate destruction," Arctic sea ice reached the lowest in history in 2010 (actually the low since 1979 was in 2007 and 2010 was the 3rd or 4th lowest depending on the source). Probably wouldn't bring in many donations if they mentioned the truth: the 21st century has some of the highest annual western Arctic sea ice extents over the past 9000 years.

The figure below comes from the paper, but has been modified with the red notations and rotated clockwise. The number of months the sea ice extent is greater than 50% is shown on the y axis. Time is on the x axis starting over 9000 years ago up to the present. Warming periods are shown in gray with the Roman and Medieval warming periods (RWP/MWP) notated, the Minoan Warming Period about 5000 years ago, and another older unnamed warming period. The last dot on the graph is the end of the 20th century and represents one of the highest annual sea ice extents.

Abstract: Cores from site HLY0501-05 on the Alaskan margin in the eastern Chukchi Sea were analyzed for their geochemical (organic carbon, δ13Corg, Corg/N, and CaCO3) and palynological (dinocyst, pollen, and spores) content to document oceanographic changes during the Holocene. The chronology of the cores was established from 210Pb dating of near-surface sediments and 14C dating of bivalve shells. The sediments span the last 9000 years, possibly more, but with a gap between the base of the trigger core and top of the piston core. Sedimentation rates are very high (~156 cm/ka), allowing analyses with a decadal to centennial resolution. The data suggest a shift from a dominantly terrigenous to marine input from the early to late Holocene. Dinocyst assemblages are characterized by relatively high concentrations (600–7200 cysts/cm3) and high species diversity, allowing the use of the modern analogue technique for the reconstruction of sea-ice cover, summer temperature, and salinity. Results indicate a decrease in sea-ice cover and a corresponding, albeit much smaller, increase in summer sea-surface temperature over the past 9000 years. Superimposed on these long-term trends are millennial-scale fluctuations characterized by periods of low sea-ice and high sea-surface temperature and salinity that appear quasi-cyclic with a frequency of about one every 2500–3000 years. The results of this study clearly show that sea-ice cover in the western Arctic Ocean has varied throughout the Holocene. More importantly, there have been times when sea-ice cover was less extensive than at the end of the 20th century.

Arctic summer sea surface temperatures are also currently lower than much of the past 9000 years

note this is version 2.0 of this post updated to repeatedly emphasize this drilling site was located in the western Arctic. see comments below for details.

The hoax moniker du jour changed from 'global warming' to 'climate change' and then to 'climate disruption' only a week ago. Now, according to the NRDC, "scientists have been warning us for years to expect this kind of climate destruction as a result of carbon pollution.”

It's hard to keep up with all the alarmist lies and distortions in this 'piece of work.'

Coming to a planet near you

Sept. 21, 2010 - Summer 2010 has provided vivid and staggering illustrations of the extreme weather events and dramatic impacts on people’s lives we can expect in the 21st century as global warming continues unabated. The extreme weather fits a pattern long expected by climate scientists as a consequence of climate change and is summarized in a recently released report of temperature data and video of world climate events from the Natural Resources Defense Council (NRDC).

Global warming is dangerously and permanently disrupting our climate. Because the atmosphere can hold more moisture as it warms, there is more rapid evaporation when it is dry and more intense rainfall when it is wet. The result is an increase in severe droughts and floods. As we have seen this year in Russia, Pakistan, China, and the United States, the results can be tragic.

Monsoon-induced floods in Pakistan displaced more than 6 million people and destroyed one million homes. In Russia, the worst heat and drought on record led to the loss of one-third of the wheat crop while rampant wildfires consumed whole villages. China was besieged by extreme rains leading to devastating mudslides, while floods swept through Iowa and Tennessee, killing 54, amidst searing, record-setting heat in other parts of the country.

Ten states experienced record-warm summers: Rhode Island, New Jersey, Delaware, Maryland, Virginia, North Carolina, Tennessee, South Carolina, Georgia, and Alabama. Maine, New Hampshire, Vermont, Rhode Island, Connecticut, and New Jersey each had their warmest year-to-date (January-August) period. As a result, U.S. temperature-related energy demand for Summer 2010 was the highest ever putting more pressure on the nation’s energy supply.

“Fall may be here but we should not forget the Summer of 2010 as a harbinger of things to come,” said Dan Lashof, NRDC’s climate center director. “And this should come as no surprise to anyone because scientists have been warning us for years to expect this kind of climate destruction as a result of carbon pollution.”

Tuesday, September 21, 2010

Quite a difference of opinion between Dr. Hubert Lamb and the subsequent director of Alarm Central* - a.k.a. the University of East Anglia Climate Research Unit (HADCRU) - the infamous post-whitewash-reinstated Dr. Phil Jones. Also curious, Dr. Lamb said in 1972 that the global temperature trend had been slowly dipping for the past 20 years. But a plot today of the 'same' HADCRU data shows an increasing trend from 1952-1972:

* in association with NASA/GISS

UPDATE: Dr. Lamb was also the source of the paleoclimate graph used in the 1990 & 1995 IPCC reports which shows an inconvenient hotter Medieval Warming Period than the present. The following IPCC report threw away Dr. Lamb's graph in favor of Michael Mann's hockey stick graph, which served to eliminate the Medieval Warming Period. This was the purpose of Mann's hockey stick, as stated in the climategate emails, to eliminate or "contain the MWP" according to Mann.

UPDATE 2: A 1974 newspaper article interview of Dr. Lamb says that the global temperature had dropped by 1/3 to 1/2 of a degree C in the last 30 years, followed by "The decline of prevailing temperatures since about 1945 appears to be the longest-continued downward trend since temperature records began," says Professor Hubert H Lamb of the University of East Anglia in Great Britain. However, a plot today of the 'same' HADCRU data from 1944-1974 also shows an increasing trend:

Climate scientist Roger Pielke Sr has posted today an in-press paper which demonstrates that ocean temperatures flattened in 2001-2002 and have been on a negative trend since. The ocean temperature trend is far more important than the hopelessly adjusted & flawed land temperature record to assess global warming, as noted by Dr. Pielke. During this period, CO2 levels have steadily climbed, which according to the IPCC should have caused a positive radiative imbalance resulting in about 0.16C warming. The fact that ocean temperatures have instead been cooling falsifies the entire anthropogenic global warming hypothesis.

"There is an excellent new paper by Bob Knox and David Douglas that provides further insight into the issue of the monitoring of global climate system heat changes. The paper is

ABSTRACT: A recently published estimate of Earth’s global warming trend is 0.63 ± 0.28 W/m2, as calculated from ocean heat content anomaly data spanning 1993–2008. This value is not representative of the recent (2003–2008) warming/cooling rate because of a “flattening” that occurred around 2001–2002. Using only 2003–2008 data from Argo floats, we find by four different algorithms that the recent trend ranges from –0.010 to –0.160 W/m2 with a typical error bar of ±0.2 W/m2. These results fail to support the existence of a frequently-cited large positive computed radiative imbalance.

Discussion and Summary

As many authors have noted, knowing FOHC [ocean heat content] is important because of its close relationship to FTOA, the net inward radiative flux at the top of the atmosphere. Wetherald et al. [13] and Hansen et al. [14] believe that this radiative imbalance in Earth’s climate system is positive, amounting recently [14] to approximately 0.9 W/m2. Pielke [15] has pointed out that at least 90% of the variable heat content of Earth resides in the upper ocean. Thus, to a good approximation, FOHC may be employed to infer the magnitude of FTOA, and the positive radiation imbalance should be directly reflected in FOHC (when adjusted for geothermal flux [9]; see Table 1 caption). The principal approximations involved in using this equality, which include the neglect of heat transfers to land masses and those associated with the melting and freezing of ice, estimated to be of the order of 0.04 W/m2 [14], have been discussed by the present authors [9].
In steady state, the state of radiative balance, both quantities FTOA and FOHC should be zero. If FTOA > FOHC, “missing energy” is being produced if no sink other than the ocean can be identified. We note that one recent deep-ocean analysis [16], based on a variety of time periods generally in the 1990s and 2000s, suggests that the deeper ocean contributes on the order of 0.09 W/m2. This is not sufficient to explain the discrepancy. Trenberth and Fasullo (TF) [2] believe that missing energy has been accumulating at a considerable rate since 2005. According to their rough graph, as of 2010 the missing energy production rate is about 1.0 W/m2, which represents the difference between FTOA ~ 1.4 and FOHC ~ 0.4 W/m2. It is clear that the TF [Trenberth & Fasullo] missing-energy problem is made much more severe if FOHC is negative or even zero. In our opinion, the missing energy problem is probably caused by a serious overestimate by TF of FTOA, which, they state, is most accurately determined by modeling.
In summary, we find that estimates of the recent (2003–2008) OHC rates of change are preponderantly negative. This does not support the existence of either a large positive radiative imbalance or a “missing energy.”
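The unit conversion behind flux figures like those the abstract quotes is straightforward: fit a linear trend to ocean heat content anomalies (J/yr), then divide by Earth's surface area and the seconds in a year. The OHC values below are invented for illustration only, not the Argo data Knox and Douglass analyzed:

```python
import numpy as np

# Illustrative (made-up) upper-ocean heat content anomalies, in units
# of 10^22 J, one value per year for 2003-2008, with a slight cooling drift
years = np.arange(2003, 2009)
ohc_1e22J = np.array([1.00, 0.96, 0.93, 0.90, 0.91, 0.86])

# Least-squares slope of heat content in J per year
slope_J_per_yr = np.polyfit(years, ohc_1e22J * 1e22, 1)[0]

# Convert to a planetary-average flux: divide by Earth's surface area
# and the number of seconds in a year
EARTH_AREA_M2 = 5.1e14
SECONDS_PER_YEAR = 3.156e7
flux_W_m2 = slope_J_per_yr / (EARTH_AREA_M2 * SECONDS_PER_YEAR)
print(round(flux_W_m2, 3))
```

A small negative OHC slope translates into a small negative implied radiative imbalance in W/m2, which is the quantity the paper compares against the ~0.9 W/m2 modeled imbalance.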

Monday, September 20, 2010

An essay by Richard Petschauer, Carbon Dioxide Heat Trapping: Merely a Bit Player in Global Warming, shows that IPCC estimates of climate sensitivity to CO2 are greatly exaggerated. The author utilizes the Spectracalc spectral analysis program to determine the ability of the 'greenhouse' gas CO2 to absorb the long wave infrared energy emitted from the Earth's surface (a stepwise spectral line-by-spectral line "integral" of the area under the curve). He finds that at present CO2 levels, only 7.4% of heat radiated from the Earth is absorbed by CO2 and that the vast majority of heat absorption is due to water vapor. Furthermore, a doubling of CO2 levels only results in the absorption by CO2 increasing to 8.0%, a 4x increase to 8.7%, and an 8x increase to 9.6%. As a first approximation to determine 'climate sensitivity' (change in temperature due to a doubling of CO2), one might assume the total so-called 'greenhouse effect' of 33C is correct, multiply that by the % change in absorption due to doubled CO2 (0.6%), to arrive at a climate sensitivity of only 0.198C. This is a far cry from the climate sensitivity figures of 2 - 4.5C claimed by the IPCC. The empirical data to date also supports a climate sensitivity of <1C (refs 1,2,3,4,5,6...)
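The back-of-the-envelope arithmetic above can be spelled out in a few lines. This only reproduces the post's own first approximation using Petschauer's percentages; it is not a radiative-transfer calculation:

```python
# First-approximation arithmetic from the post: scale the total 33 C
# 'greenhouse effect' by the change in CO2's share of absorbed surface
# radiation (7.4% at present vs 8.0% for doubled CO2, per Petschauer)
total_greenhouse_C = 33.0
share_now = 0.074
share_doubled = 0.080
sensitivity_C = total_greenhouse_C * (share_doubled - share_now)
print(round(sensitivity_C, 3))
```

The 0.6 percentage-point change in absorption share, scaled against 33 C, yields the ~0.2C sensitivity figure quoted in the post.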

Region of interest of CO2 LWIR (long wave infrared) absorption

Blowup of region of interest showing overlap between CO2 and water vapor; shaded area is absorbed by CO2 only

Blowup of above figure. Area with asterisk is absorbed by CO2 only.

ABSTRACT: New calculations show that doubling of carbon dioxide (CO2) will increase average global temperature by only about 1F (degrees Fahrenheit) or 0.55C (degrees Centigrade), much less than the range of 2C to 4.5C estimated by the United Nations Intergovernmental Panel on Climate Change (IPCC). These new calculations are based on NASA supported spectral calculations available on the Internet relating to greenhouse gases. The temperature increases are estimated to be somewhat more in winter in the colder climates because of reduced competing atmospheric water vapor, but smaller increases at other times and places. These calculations also estimate that a 10% increase of water vapor in the atmosphere, a stronger greenhouse gas than CO2, or a reduction in the average cloud cover of only about 2 percent, will increase global temperature about as much as doubling CO2 would. Each additional doubling of CO2 will cause further temperature increases about the same as that caused by the first doubling. Greenhouse gases, except water vapor, only trap heat at certain narrow wavelengths of infrared radiation related to their molecular structures. Data show that present concentrations of CO2, a strong absorber, are already well above the saturation value at its principal wavelength, so increases in it have a relatively small effect. These new calculations are based on atmospheric models of the energy absorption bandwidths of greenhouse gases coupled with Max Planck’s equations relating to infrared wavelength distributions. A new simple technique is also proposed in the appendix to measure actual trapped heat being radiated back from the atmosphere to the Earth. This can be used to evaluate and validate various estimating models. It also indicates that clouds and their height above the Earth may play a larger role than previously thought.
Since clouds operate as both powerful heat-trapping agents, overriding others, and a reflector of the sun’s energy, they may be the key factor in the regulation of the average global temperature. At the present time, they are one of the least measured parameters in the computer models predicting future climate changes. Weather and climate forecasting considering all factors is very complex, thus this paper does not cover that subject. However it is felt that the simple role of long-term heat rises due to only CO2 changes is a much simpler process and better estimated by basic models as used herein. Certain shortcomings in the IPCC data and estimates, as reported by others, are also summarized. Based on this new information, recommendations are made regarding future U.S. energy policy. While it does appear that the recent years show a warming trend, the role of CO2 in this is very small, and perhaps beneficial in moderating winter temperatures in colder climates.

h/t comment by Scottar alerting me to this essay

Every day we fail to take action, we export green jobs and our technological advantage to China. While some deniers claim there are no green jobs, just last week The New York Times pointed to this story:

“With erect posture and clear gray eyes, Chuck Provini still looks like the Marine who graduated from the Naval Academy in 1969 and was repeatedly decorated for bravery in Vietnam.”

“He fumes at strangers who call him a traitor for agreeing to manufacture in Zhuzhou, China, a new solar panel production device that his company developed in the United States.”

Bjorn Lomborg, author of The Skeptical Environmentalist, who maintains that he is not a bjorn-again climate "denier," is nonetheless highly critical of the Obama administration's climate policies. Lomborg states that the Obama policies have the wrong priorities, such as calling for reduced CO2 emissions to avoid extreme weather events and to avoid 'increasing' world hunger (both of which have been on a declining trend and would not be affected by CO2). Lomborg also notes the policies rest on four serious factual errors: 1) that famine is increasing, 2) that sea levels are rising dangerously, 3) that droughts are increasing, and 4) that storms are growing stronger. For the papers refuting each of these fallacies and a summary of Lomborg's arguments, click on the graphic below:

Thursday, September 16, 2010

ABSTRACT: The long-term trends in monthly minimum temperature from 34 California weather stations have been analyzed. These trends can be explained using a variable linear urban heat island effect superimposed on a baseline trend from the Pacific Decadal Oscillation (PDO). The majority of the prevailing California weather systems originate in the N. Pacific Ocean. The average minimum monthly temperature is a measure of the surface air temperature of these weather systems. Changes in minimum surface temperature are an indicator of changes in the temperature of the tropospheric air column, not the ground surface temperature. The PDO provides a baseline minimum temperature trend that defines the California climate variation. This allows urban heat island effects and other possible anomalous temperature measurement effects to be identified and investigated. Some of the rural weather stations showed no urban heat island effects. Stations located in urban areas showed heat island effects ranging from 0.01 to over 0.04 °C per year. The analysis of minimum temperature data using the PDO as a reference baseline has been demonstrated to be a powerful technique for climate trend evaluation. This technique may be extended to other regions using the appropriate local ocean surface temperature reference. The analysis found no evidence for CO2-induced warming trends in the California data. This confirms prior ‘Null Hypothesis’ work that it is impossible for a 100 ppm increase in atmospheric CO2 concentration to cause any climate change.

CONCLUSIONS: The dominant factor that determines the climate of the State of California is the variation in N. Pacific Ocean temperatures related to the PDO. This has been clearly demonstrated by an analysis of the long-term minimum temperature data from 34 widely spaced California weather stations. The PDO record provides a baseline that can be used to identify urban heat island effects and anomalous data in the station records. This provides a powerful technique for investigating climate change in California and may be extended to other Western States and other areas of the world where there is an ocean influence on the climate that may be used to provide a local reference. Unexplained ‘adjustments’ made to weather station records for use in climate trend analysis have now become a major concern.[7,8] This technique may also provide an independent reference for the analysis of climate trends in weather station data to detect such ‘adjustments’. This analysis used a simple linear fit to the station data. By combining the weather station data with other meteorological data and climate simulations, a more detailed analysis of the effect of the PDO and other factors on the climate of the State of California may be performed. However, this is not a ‘one size fits all’ approach, and each data set needs to be examined carefully on a case-by-case basis to evaluate all of the factors that may bias the data. These results also confirm earlier work which demonstrated that it was impossible for the observed changes in atmospheric CO2 concentration to cause any climate change.[2] There is no CO2 ‘signature’ in any of the temperature records that were analyzed. The recent decrease in the PDO with the triple peak ‘signature’ from 1985 onwards is clearly visible in most of the temperature data sets. Predictions for CO2-induced global warming indicate a monotonically increasing ‘equilibrium surface temperature’ for this period.
The empirical concept of CO2 induced global warming has no basis in the physical reality of climate change.

September 9, 2010 | A team of scientists led by NCAR’s Keith Oleson has incorporated urban areas into a global climate model. The development is important because most models used for predicting future climate change do not account for the urban “heat island” effect. The study will be published in the International Journal of Climatology.

Oleson and colleagues used the Community Climate System Model, an NCAR-based model that uses trillions of calculations to simulate the chemical and physical processes that drive Earth’s climate. After inserting a parameterization for urban surfaces into the CCSM’s land surface component, the researchers ran the model from present day to 2100 under the Intergovernmental Panel on Climate Change (IPCC) A2 emissions scenario, which assumes that global fossil fuel emissions will continue to rise at high levels over the coming century.

Results from the modeling experiment show that present-day annual mean urban air temperatures are up to 4°C warmer than temperatures for surrounding rural areas, a finding that is important for verifying the model’s accuracy since scientists already have observational evidence that urban areas are warmer than surrounding rural areas.

The study found that both urban and rural areas warm substantially by the end of this century as emissions rise, with rural areas warming slightly more than urban—resulting in a decrease in the urban-to-rural contrast. In addition, nighttime urban warming is much greater than daytime urban warming, resulting in a reduced diurnal range in temperature compared to rural areas.

“This study demonstrates that climate models need to begin to account for urban surfaces to more realistically evaluate the impact of climate change on people in the environments where they live,” Oleson says.

This pair of satellite images provides two views of Atlanta, with the city’s urban core at the center of both images. The top image is a photo-like view of the area, while the bottom image is a land surface temperature map with cooler temperatures in yellow and hotter temperatures in red. Because vegetation cools Earth’s surface through evaporation, the most densely vegetated areas (darkest green in top image) are the coolest areas (palest yellow in bottom image).

Tuesday, September 14, 2010

There is a large cognitive dissonance required to be a true AGW believer, hence the comparison to religious beliefs. Take, for instance, the ability to simultaneously acknowledge that CO2 levels have been 10 to 20 times higher than the present during multiple periods of Earth's history without causing a 'tipping point' of no return, while retaining the belief that CO2 levels 10 to 20 times lower are causing a 'tipping point' now. In fact, an entire ice age came and went with CO2 levels about 11 times higher than the present throughout the Ordovician period, shown in the graphic below. The latest eco-scare-alert notes that Antarctica abruptly transitioned from a warm, subtropical hothouse to the present solid ice sheet during a period when CO2 levels exceeded those of today by 10 times.

Recent Antarctica research may provide critical clues to understanding one of the most dramatic periods of climatic change in Earth's history - and a glimpse into what might lie far ahead in the planet's climatic future.

The giant ice sheets of Antarctica behave like mirrors, reflecting the sun's energy and moderating the world's temperatures. The waxing and waning of these ice sheets contribute to changes in sea level and affect ocean circulation, which regulates our climate by transporting heat around the planet.

Despite their present-day frigid temperatures, the poles were not always covered with ice. New climate records recovered from Antarctica during the recent Integrated Ocean Drilling Program "Wilkes Land Glacial History" Expedition show that approximately 53 million years ago, Antarctica was a warm, sub-tropical environment. During this same period, known as the "greenhouse" or "hothouse" world, atmospheric CO2 levels exceeded those of today by ten times.

Then suddenly, Antarctica's lush environment transitioned into its modern icy realm. In only 400,000 years concentrations of atmospheric carbon dioxide decreased. Global temperatures dropped. Ice sheets developed and Antarctica became ice-bound.

Dr. Ir. Noor van Andel, former head of research at Akzo Nobel, recently presented a talk at the Dutch Meteorological Institute KNMI, concluding there is

• No observational evidence for influence of CO2 on past or present climate

• A rise in outgoing long-wave radiation (OLR) of more than 3.7 W/m^2 per °C of SST cannot be the effect of rising CO2 or of the increase of other “greenhouse” gases. An OLR/SST slope of 8.6 W/m^2 per K means that the atmosphere has become more transparent to IR radiation over the past 60 years. The “greenhouse effect” has become weaker.

"It's like a time machine...the ice has not been this small for many, many centuries," said Lars Piloe, a Danish scientist heading a team of "snow patch archaeologists" on newly bare ground 1,850 metres (6,070 ft) above sea level in mid-Norway.

Specialised hunting sticks, bows and arrows and even a 3,400-year-old leather shoe have been among finds since 2006 from a melt in the Jotunheimen mountains, the home of the "Ice Giants" of Norse mythology.

As water streams off the Juvfonna ice field, Piloe and two other archaeologists -- working in a science opening up due to climate change -- collect "scare sticks" they reckon were set up 1,500 years ago in rows to drive reindeer towards archers.

But time is short as the Ice Giants' stronghold shrinks.

The heating & cooling of Greenland over the past 8000 years

"Our main focus is the rescue part," Piloe said on newly exposed rocks by the ice. "There are many ice patches. We can only cover a few...We know we are losing artefacts everywhere."

Freed from an ancient freeze, wood rots in a few years. And rarer feathers used on arrows, wool or leather crumble to dust in days unless taken to a laboratory and stored in a freezer.

Jotunheimen is unusual because so many finds are turning up at the same time -- 600 artefacts at Juvfonna alone.

Other finds have been made in glaciers or permafrost from Alaska to Siberia. Italy's iceman "Otzi", killed by an arrow wound 5,000 years ago, was found in an Alpine glacier in 1991. "Ice Mummies" have been discovered in the Andes.

Monday, September 13, 2010

Climate scientist Roger Pielke Sr. has an important post today by meteorologist William DiPuccio on the recent lack of ocean heat accumulation, which concludes, "It is evident that the AGW hypothesis, as it now stands, is either false or fundamentally inadequate. One may argue that projections for global warming are measured in decades rather than months or years, so not enough time has elapsed to falsify this hypothesis. This would be true if it were not for the enormous deficit of heat we have observed. In other words, no matter how much time has elapsed, if a projection misses its target by such a large magnitude (6x to 8x), we can safely assume that it is either false or seriously flawed."

"The current lapse in heat accumulation demonstrates a complete failure of the AGW hypothesis to account for natural climate variability, especially as it relates to ocean cycles (PDO, AMO, etc.). If anthropogenic forcing from GHG can be overwhelmed by natural fluctuations (which themselves are not fully understood), or even by other types of anthropogenic forcing, then it is not unreasonable to conclude that the IPCC models have little or no skill in projecting global and regional climate change on a multi-decadal scale. Dire warnings about “runaway warming” and climate “tipping points” cannot be taken seriously. A complete rejection of the hypothesis, in its current form, would certainly be warranted if the ocean continues to cool (or fails to warm) for the next few years."

"Open and honest debate has been marginalized by appeals to consensus. But as history has often shown, consensus is the last refuge of poor science."

Sunday, September 12, 2010

A paper in process examines the global warming alarmist movement and finds 26 other analogous false alarms that were endorsed by scientists, politicians, and the media. In each case, the analogous alarms were presented as “scientific,” but none were based on scientific forecasting procedures. Every alarming forecast proved to be false; the predicted adverse effects either did not occur or were minor. However, costly government policies remained in place long after the predicted disasters failed to materialize. The authors find the current global warming alarm is simply the latest example of a common social phenomenon: an alarm based on unscientific forecasts of a calamity. They conclude that the global warming alarm will fade, but not before much additional harm is done by governments and individuals making inferior decisions on the basis of unscientific forecasts.

Effects and outcomes of the global warming alarm: A forecasting project using the structured analogies method
Kesten C. Green
International Graduate School of Business, University of South Australia
J. Scott Armstrong
The Wharton School, University of Pennsylvania

ABSTRACT: We summarize evidence showing that the global warming alarm movement has more of the character of a political movement than that of a scientific controversy. We then make forecasts of the effects and outcomes of this movement using a structured analysis of analogous situations—a method that has been shown to produce accurate forecasts for conflict situations. This paper summarizes the current status of this “structured analogies project.”
We searched the literature and asked diverse experts to identify phenomena that could be characterized as alarms warning of future disasters that were endorsed by scientists, politicians, and the media, and that were accompanied by calls for strong action. The search yielded 71 possible analogies. We examined objective accounts to screen the possible analogies and found that 26 met all criteria. We coded each for forecasting procedures used, the accuracy of the forecasts, the types of actions called for, and the effects of actions implemented.
Our preliminary findings are that analogous alarms were presented as “scientific,” but none were based on scientific forecasting procedures. Every alarming forecast proved to be false; the predicted adverse effects either did not occur or were minor. Costly government policies remained in place long after the predicted disasters failed to materialize. The government policies failed to prevent ill effects.
The findings appear to be insensitive to which analogies are included. The structured analogies approach suggests that the current global warming alarm is simply the latest example of a common social phenomenon: an alarm based on unscientific forecasts of a calamity. We conclude that the global warming alarm will fade, but not before much additional harm is done by governments and individuals making inferior decisions on the basis of unscientific forecasts.

Analysis of temperature and salinity shakes view of global water flow.

A better understanding of how vast tracts of water move through the oceans could improve climate models.

The Ocean conveyor belt model questioned

The accepted picture of how a massive oceanic conveyor belt of water turns has been complicated by findings published today in Nature Geoscience. The results could help to boost the precision of climate-change models.

As tropical water from the Equator flows north in the Atlantic Ocean, it becomes cooler and denser. Evaporation along the way makes it saltier and further increases its density. In the frigid Arctic, the water sinks into the depths and then moves southward, returning to the surface once it has warmed up again.

But this simplified picture of what is known as meridional overturning circulation (MOC) has been brought into question by a paper suggesting that, in the past 50 years, ocean circulation closer to the Equator has grown weaker, whereas the northern waters have flowed more strongly.

"The more we look, the more complicated the ocean is," says Susan Lozier, an oceanographer at Duke University in Durham, North Carolina, and lead author of the study.

When the conveyor-belt model was conceptualized in the 1980s, researchers understood only a rough outline of overall marine currents, she says. Because it is difficult to take measurements in the depths of the ocean, MOC models couldn't reflect the intricacy of all the factors involved.

The idea that oceanic water turns over like a conveyor belt has been questioned.

Thursday, September 9, 2010

Another essay by James Nash, a climate scientist for an environmental organization specializing in carbon offsets. Does he still have a job?

Global Warming: Silencing The Critics

A recent poll of 530 climatologists in 27 countries showed 34.7 percent of interviewees endorsed the notion that a substantial part of the current global warming trend – which might see temperatures rise by a degree or two, on average, by century’s end – is caused by man’s industrial activities: driving cars and the like.

More than a fifth – 20.5 percent – rejected this “anthropogenic hypothesis.” Half were undecided. The skeptics now include the 85 climate experts who signed the 1995 Leipzig Declaration; the 4,000 scientists from around the world (including 70 Nobel laureates) who signed the Heidelberg Appeal, and the 17,000 American scientists who signed the Oregon Petition.

Danish statistician Bjorn Lomborg bought the sky-is-falling scenario until he bothered to check some of the numbers, which led him to do his own research, write the book “The Skeptical Environmentalist,” and become the Man The Greens Love to Hate. He reminded the folks at Tech Central Station last November that most economists believe the projected level of warming would either have no effect or be beneficial.

Cold weather kills people, Lomborg reminded us. “It is estimated that climate change by about 2050 will mean about 800,000 fewer deaths.” And that’s before we even get around to increased food production. (If you want a real climate catastrophe, let’s talk about the next Ice Age, which is due relatively soon.)

What’s more, scientists at Ohio State University announced Feb. 12 that Antarctic “temperatures during the late 20th century did not climb as had been predicted by many global climate models.” In fact, they went down. So why would one get the sense from the daily barrage of electronic news that “all experts now agree” the earth is heating catastrophically, and that mankind’s use of fossil fuels is at fault?

An editorial published in Science today, Farewell to Fossil Fuels?, offers "new insights into just how difficult it will be to say farewell to fossil fuels" to achieve the [unnecessary] reductions in CO2 emissions advocated by alarmists to meet the [fictitious & artificial] goal of less than 2 degrees global warming. That's putting it mildly, since as pointed out in an accompanying newspaper article, "The simple mathematics are that the world needs one nuclear-plant equivalent of carbon-free energy coming on line every day between now and midcentury to put global emissions on a trajectory that would meet the 2-degree goal." One new nuclear power plant coming online each and every single day until 2050 works out to about 14,600 give or take a few hundred. And nuclear is the only known, practical 'carbon free' energy source since,
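The plant count is easy to sanity-check: one plant-equivalent per day over the roughly 40 years from now (2010) to mid-century, ignoring leap days, lands right on the quoted figure. A minimal back-of-the-envelope sketch:

```python
# One nuclear-plant equivalent coming online per day, 2010 through 2050
years = 2050 - 2010
plants = years * 365  # ignoring leap days
print(plants)  # 14600 -- "about 14,600, give or take a few hundred"
```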

"alternative energy sources, such as solar and wind electricity, are not adequate to achieve "massive market penetration," which requires utility-scale systems that can store intermittent supplies of power until they are needed. While Denmark and Norway have developed methods for this type of storage, these aren't "widely feasible in the United States, and other approaches to store power are expensive and need substantial research and testing"

Yahoo answers: How many square miles of Windmills equal 1 Nuclear power plant? Or, how many square miles of Solar Panels? Where are we going to put them?

A typical nuclear power plant produces about 1,000 megawatts of electricity.

At 25 megawatts per 1,500 acres for a good-sized wind farm of 60 to 70 turbines, you would need 60,000 acres and 2,400 to 2,800 wind turbines to equal 1,000 megawatts. Of course, these wind turbines only produce that much power when the wind is blowing just right. That happens only about 25% of the time, so you really need four times as many wind turbines and four times as much space to produce 1,000 megawatts on average. So that's 240,000 acres and 9,600 to 11,200 turbines. 240,000 acres is 375 square miles.

At 5 acres of solar panels per megawatt, you need 5,000 acres of solar panels to equal 1,000 megawatts. Those solar panels only work at peak power levels during sunny periods, so on average they only put out about 25% of their rated capacity. That means you really need 20,000 acres of solar panels to generate 1,000 megawatts on average. 20,000 acres is 31.25 square miles.

We aren't going to put them anywhere. They are way too expensive and they don't provide a stable enough power supply to rely on. Anyplace with enough open spaces, enough wind or sun shine to be a good candidate is too far away from the east and west coasts where that power is needed most.

Doing the math, the wind turbine equivalent for 14,600 nuclear power plants would require ~5.475 million square miles containing ~146 million turbines. Or only 456,000 square miles of solar panels.
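The land-area arithmetic above can be reproduced in a few lines of Python, using the Yahoo answer's own assumptions (a 25% capacity factor for both wind and solar, 25 MW nameplate per 1,500 acres of wind farm, 5 acres per MW nameplate of solar, and 640 acres per square mile):

```python
ACRES_PER_SQ_MILE = 640
CAPACITY_FACTOR = 0.25  # both wind and solar assumed at 25% in the text

# Wind: 25 MW nameplate per 1,500 acres, scaled to 1,000 MW average output
wind_acres = 1500 * (1000 / 25) / CAPACITY_FACTOR
print(wind_acres / ACRES_PER_SQ_MILE)   # 375.0 square miles per plant-equivalent

# Solar: 5 acres per MW nameplate, scaled to 1,000 MW average output
solar_acres = 5 * 1000 / CAPACITY_FACTOR
print(solar_acres / ACRES_PER_SQ_MILE)  # 31.25 square miles per plant-equivalent

# Scale to 14,600 plant-equivalents
print(14600 * wind_acres / ACRES_PER_SQ_MILE)   # 5475000.0 square miles of wind
print(14600 * solar_acres / ACRES_PER_SQ_MILE)  # 456250.0 square miles of solar
```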

Abstract: One concrete goal adopted by some policy-makers is to reduce the risks associated with climate change by preventing the mean global temperature from rising by more than 2°C above preindustrial levels (1). Climate models indicate that achieving this goal will require limiting atmospheric carbon dioxide (CO2) concentrations to less than 450 parts per million (ppm), a level that implies substantial reductions in emissions from burning fossil fuels (2, 3). So far, however, efforts to curb emissions through regulation and international agreement haven't worked (4); emissions are rising faster than ever, and programs to scale up "carbon neutral" energy sources are moving slowly at best (5). On page 1330 of this issue, Davis et al. (6) offer new insights into just how difficult it will be to say farewell to fossil fuels.

The data show a global mean sea level rise of only 1.85 mm/year since January 2005, a deceleration of 44% from the prior rate of 3.3 mm/year. At this rate, sea levels will rise about 7 inches over the next 100 years. [Note also that the bump in 2009-2010 is due to temporary El Niño conditions.] Break out the life rafts!
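Both numbers in that paragraph follow directly from the quoted rates; a quick check (25.4 mm per inch):

```python
old_rate, new_rate = 3.3, 1.85      # mm/yr, from the post
deceleration = (old_rate - new_rate) / old_rate
print(round(deceleration * 100))    # 44 (percent)

century_rise_in = new_rate * 100 / 25.4
print(round(century_rise_in, 1))    # 7.3 inches over 100 years
```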

Rising sea level fantasy

A new assessment of global mean sea level from altimeters highlights a reduction of global trend from 2005 to 2008

Abstract. A new error budget assessment of the global Mean Sea Level (MSL) determined by TOPEX/Poseidon and Jason-1 altimeter satellites between January 1993 and June 2008 is presented. We discuss all potential errors affecting the calculation of the global MSL rate. We also compare altimetry-based sea level with tide gauge measurements over the altimetric period. This allows us to provide a realistic error budget of the MSL rise measured by satellite altimetry. These new calculations highlight a reduction in the rate of sea level rise since 2005, by ~2 mm/yr. This represents a 60% reduction compared to the 3.3 mm/yr sea level rise (glacial isostatic adjustment correction applied) measured between 1993 and 2005. Since November 2005, MSL is accurately measured by a single satellite, Jason-1. However the error analysis performed here indicates that the recent reduction in MSL rate is real.