Thursday, February 28, 2013

A forthcoming paper proposes the latest hare-brained geoengineering scheme to allegedly offset global warming. The authors suggest that transporting at least 1 trillion kilograms of dust into space to form a "Sun-pointing elliptical Earth ring" would shield the Earth from solar radiation. The paper joins other geoengineering schemes, including a pipe tethered to an airship to spew sulfuric acid into the upper atmosphere, all to "fix" the non-existent problem of man-made global warming. The authors are apparently unaware that clouds act as the Earth's negative-feedback cooling mechanism and have maintained relatively stable Earth temperatures for millions of years without any evidence of net positive feedback "tipping points." Not to mention, the scheme could not be undone if solar activity entered a lull such as the Maunder or Dalton minima, and the dust would pose an irreversible hazard to satellites and space travel.

Publication date: 1 April 2013 [any coincidence that this is April Fools' Day?]

Source: Advances in Space Research, Volume 51, Issue 7

This paper examines the concept of a Sun-pointing elliptical Earth ring comprised of dust grains to offset global warming. A new family of non-Keplerian periodic orbits, under the effects of solar radiation pressure and the Earth’s J2 oblateness perturbation, is used to increase the lifetime of the passive cloud of particles and, thus, increase the efficiency of this geoengineering strategy. An analytical model is used to predict the orbit evolution of the dust ring due to solar-radiation pressure and the J2 effect. The attenuation of the solar radiation can then be calculated from the ring model. In comparison to circular orbits, eccentric orbits yield a more stable environment for small grain sizes and therefore achieve higher efficiencies when the orbit decay of the material is considered. Moreover, the novel orbital dynamics experienced by high area-to-mass ratio objects, influenced by solar radiation pressure and the J2 effect, ensure the ring will maintain a permanent heliotropic shape, with dust spending the largest portion of time on the Sun-facing side of the orbit. It is envisaged that small dust grains can be released from a circular generator orbit with an initial impulse to enter an eccentric orbit with Sun-facing apogee. Finally, a lowest estimate of 1×10^12 kg of material is computed as the total mass required to offset the effects of global warming.

Highlights

► The feasibility of an Earth ring for climate engineering is analysed.
► Orbital dynamics of high area-to-mass ratio dust grains around Earth are modelled.
► The perturbations of solar radiation pressure and the J2 effect are included.
► Stable Sun-pointing elliptical orbits are utilised to form an Earth ring.

A paper published today in the Journal of Climate notes that "global dimming" from clouds/aerosols occurred from the 1950s to 1980s at a rate of -3.5 W m-2 per decade, followed by "global brightening" from 1992 to 2002 at a rate of 6.6 W m-2 per decade. By way of comparison, the IPCC claims* CO2 forcing since the 1950s was 1.18 W m-2, and 0.25 W m-2 from 1992 to 2002, roughly 1/26th of the effect of "global brightening" over the same period. The global temperature record shows that temperatures correspond to these periods of global dimming and brightening rather than to the slow, steady rise of CO2 levels. It is thus apparent that clouds/aerosols are the "control knob" of climate, not man-made CO2.
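
The 26-fold comparison above is simple arithmetic; here is a minimal sketch using the figures quoted in the post (treating 1992–2002 as one decade, and the variable names, are my assumptions):

```python
# Back-of-envelope check of the forcing comparison above. The brightening
# rate and CO2 forcing values are the figures quoted in the post; treating
# 1992-2002 as exactly one decade is a simplifying assumption.
brightening_rate = 6.6        # W/m^2 per decade, "global brightening" 1992-2002
period_decades = 1.0          # 1992-2002 spans roughly one decade
co2_forcing_1992_2002 = 0.25  # W/m^2, CO2 forcing over the same period

brightening_change = brightening_rate * period_decades
ratio = brightening_change / co2_forcing_1992_2002
print(f"brightening change: {brightening_change:.1f} W/m^2")  # 6.6 W/m^2
print(f"ratio to CO2 forcing: {ratio:.0f}x")                  # 26x
```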

The rate of warming increased by a factor of 3.8 from 1992 to 2002 corresponding to the period of "global brightening," and was followed by global cooling.

Surface incident solar radiation (G) determines our climate and environment. G has been widely observed with a single pyranometer since the late 1950s. Such observations have suggested a widespread decrease between the 1950s and 1980s (“global dimming”), i.e., at a rate of -3.5 W m−2 per decade (or -2% per decade) from 1960 to 1990. Since the early 1990s, the diffuse and direct components of G have been measured independently and a more accurate G was calculated by summing these two measurements. Data from this summation method have suggested that G has increased at a rate of 6.6 W m−2 per decade (3.6% per decade) from 1992 to 2002 (“brightening”) at selected sites. The brightening rates from these studies were also higher than those from a single pyranometer. In this paper, we used 17 years (1995-2011) of parallel measurements by the two methods from nearly 50 stations to test whether these two measurement methods of G provide similar long-term trends. Our results show that although measurements of G by the two methods agree very well on a monthly time scale, the long-term trend from 1995 to 2011 determined by the single pyranometer is 2-4 W m−2 per decade less than that from the summation method. This difference of trends in the observed G is statistically significant. The dependence of trends of G on measurement methods uncovered here has an important implication for the widely reported “global dimming” and “brightening” based on datasets collected by different measurement methods, i.e., the dimming might have been less if measured with current summation methods.

Abstract

A peat record from Quincan Crater (Queensland, Australia), spanning the past 200 years, was used to test if hydrogen isotope ratios of leaf wax long-chain n-alkanes derived from higher plants can be used to reconstruct past tropical cyclone activity. Queensland is frequently impacted by tropical cyclones, with on average 1–2 hits per year. The most abundant n-alkanes in the peat are C29 and C31. Possible sources for long chain n-alkanes in the peat core are ferns and grasses, which grow directly on the peat layer, and the tropical forest growing on the crater rim. Hydrogen isotope ratios of C27, C29 and C31 n-alkanes vary between −155 and −185‰ (VSMOW), with the largest variability in the upper 30 cm of the record. For the period 1950–2000 AD the variability in δD of C29 alkanes resembles a smoothed record of historical tropical cyclone frequency occurring within a 500 km radius from the site. This suggests that the high number of tropical cyclones occurring in this period strongly impacted the δD signal and on average resulted in more depleted values of precipitation. In the period before 1900 AD, the variability in the hydrogen isotope record is relatively small compared to the period 1950–2000 AD. This might be the result of lower variability of tropical cyclones during this time period. More likely, however, is that it results from the increasing age span per sampled interval resulting in a lower temporal resolution. Average δD values between 1900 and 2000 AD are around −167‰, which is similar to average values found for the period between 1800 and 1900 AD. This suggests that on average tropical cyclone frequency did not change during the past 200 years. This study demonstrates the potential of stable hydrogen isotope ratios of long chain n-alkanes for the reconstruction of past tropical cyclone frequency.

Highlights

► δD ratios of long chain sedimentary n-alkanes from a peat core in northern Queensland were analyzed.
► δD of the past 50 years resembles the historical record of tropical cyclone frequency in the area.
► The results suggest that tropical cyclone frequency did not increase in the past 200 years.

Sensitivity of Arctic clouds and radiation in the Community Atmospheric Model version 5 to the ice nucleation process is examined by testing a new physically based ice nucleation scheme that links the variation of ice nuclei (IN) number concentration to aerosol properties. The default scheme parameterizes the IN concentration simply as a function of ice supersaturation. The new scheme leads to a significant reduction in simulated IN number concentrations at all latitudes while changes in cloud amount and cloud properties are mainly seen in high latitudes and middle latitude storm tracks. In the Arctic, there is a considerable increase in mid-level clouds and a decrease in low clouds, which result from the complex interaction among the cloud macrophysics, microphysics, and the large-scale environment. The smaller IN concentrations result in an increase in liquid water path and a decrease in ice water path due to the slow-down of the Bergeron-Findeisen process in mixed-phase clouds. Overall, there is an increase in the optical depth of Arctic clouds, which leads to a stronger cloud radiative forcing (net cooling) at the top of the atmosphere.

The comparison with satellite data shows that the new scheme slightly improves low cloud simulations over most of the Arctic, but produces too many mid-level clouds. Considerable improvements are seen in the simulated low clouds and their properties when compared to Arctic ground-based measurements. Issues with the observations and the model-observation comparison in the Arctic region are discussed.

Wednesday, February 27, 2013

Increased Reliance on Wind, Solar Power Means Power Production Fluctuates

SAN FRANCISCO—California is weighing how to avoid a looming electricity crisis that could be brought on by its growing reliance on wind and solar power.

Regulators and energy companies met Tuesday, hoping to hash out a solution to the peculiar stresses placed on the state's network by sharp increases in wind and solar energy. Power production from renewable sources fluctuates wildly, depending on wind speeds and weather.

California has encouraged growth in solar and wind power to help reduce greenhouse-gas emissions. At the same time, the state is running low on conventional plants, such as those fueled by natural gas, that can adjust their output to keep the electric system stable. The amount of electricity being put on the grid must precisely match the amount being consumed or voltages sag, which could result in rolling blackouts.

At Tuesday's meeting, experts cautioned that the state could begin seeing problems with reliability as soon as 2015.

California isn't the only state having trouble coping with a growing share of renewables. Texas also needs more resources, such as gas-fired power plants, that can adjust output in response to unpredictable production from wind farms.

Renewable power has seen a boom in both states. On Feb. 9, wind farms in Texas set a record for output, providing nearly 28% of the state's supply for the day. Production hasn't hit that level yet in California, but the state's goal is to get one-third of its electricity from renewable resources by 2020.

"I think we're going to end up closer to 40%," said Robert Weisenmiller, chairman of the California Energy Commission, the state's policy and planning agency for electricity.

A decade ago, California was hit by an electricity crisis marked by price surges and rolling blackouts, stemming from market manipulation and tightening electricity supplies in a newly deregulated market. To prevent a recurrence, state regulators passed rules requiring utilities to line up enough energy to meet even high power demand, with a special emphasis on in-state renewable resources.

"California has been well served by the procurement process since the crisis," said Steve Berberich, chief executive of the California Independent System Operator, which runs the state's grid. "The problem is we have a system now that needs flexibility, not capacity."

Changes in California's market have attracted lots of new generation; the state expects to have 44% more generating capacity than it needs next year. Grid officials say they expect the surplus to fall to 20% by 2022, though it will remain high for about a decade.

However, the surplus generating capacity doesn't guarantee steady power flow. Even though California has a lot of plants, it doesn't have the right mix: Many of the solar and wind sources added in recent years have actually made the system more fragile, because they provide power intermittently.

Electricity systems need some surplus, so they can cover unexpected generator outages or transmission-line failures, but having too much can depress the prices generators can charge for electricity. In part because of low power prices, many gas-fired generation units aren't profitable enough to justify refurbishments required by pending federal regulations under the Clean Water Act. That means they are likely to be shut by 2020, adding to the state's power woes.

By July, state officials hope to have a plan in place addressing the problem. Turf issues among state and federal regulators could complicate the process.

Michael Peevey, president of the California Public Utilities Commission, which regulates utilities, said action is clearly needed, but he isn't sure whether the market needs "small adjustments or a major overhaul."

Utility executives are calling for immediate action, pointing to the risk of rolling blackouts. "We see the issue hitting as soon as 2013, 2014, 2015," said Todd Strauss, the head of planning and analysis for PG&E Corp., a big utility serving Northern California, who attended Tuesday's meeting. "If we thought it was far out, we wouldn't be here."

Tuesday, February 26, 2013

A new paper published in Geophysical Journal International finds that during the last interglacial, global sea levels rose more than twice as fast as the present rate, to more than 8 meters higher than the present. According to the authors, the maximum 1000-year-average rate of sea level rise during the last interglacial exceeded 6 mm/yr, which is double the rate claimed by the IPCC of 3.1 mm/yr, and 5 times the rate claimed by NOAA of ~1.2 mm/yr. The paper adds to many other peer-reviewed studies demonstrating there is nothing unusual, unnatural, or unprecedented regarding current sea level rise, and that there is no evidence of a human influence on sea levels. Full paper here
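
The rate comparison above reduces to a unit conversion, since 1 m/kyr equals exactly 1 mm/yr. A minimal sketch (the helper function and variable names are mine; the rates are those cited in the post):

```python
# The paper reports rates in m/kyr; modern rates are quoted in mm/yr.
# 1 m/kyr = 1 mm/yr, so the two compare directly after conversion.
def m_per_kyr_to_mm_per_yr(rate_m_per_kyr):
    return rate_m_per_kyr * 1000.0 / 1000.0  # 1000 mm per m / 1000 yr per kyr

lig_rate = m_per_kyr_to_mm_per_yr(6.0)  # the post's LIG estimate, mm/yr
ipcc_rate = 3.1                         # mm/yr, IPCC figure cited above
noaa_rate = 1.2                         # mm/yr, NOAA figure cited above

print(lig_rate / ipcc_rate)  # ~1.94, i.e. roughly double
print(lig_rate / noaa_rate)  # ~5, i.e. five times
```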

Note: highstand figures and the 1000-yr-average global sea level rise are derived from the text in section 3.1, using the average of the two most likely estimates.

The last interglacial stage (LIG; ca. 130–115 ka) provides a relatively recent example of a world with both poles characterized by greater-than-Holocene temperatures similar to those expected later in this century under a range of greenhouse gas emission scenarios. Previous analyses inferred that LIG mean global sea level (GSL) peaked 6–9 m higher than today. Here, we extend our earlier work to perform a probabilistic assessment of sea level variability within the LIG highstand. Using the terminology for probability employed in the Intergovernmental Panel on Climate Change assessment reports, we find it extremely likely (95 per cent probability) that the palaeo-sea level record allows resolution of at least two intra-LIG sea level peaks and likely (67 per cent probability) that the magnitude of low-to-high swings exceeded 4 m. Moreover, it is likely that there was a period during the LIG in which GSL rose at a 1000-yr average rate exceeding 3 m kyr−1, but unlikely (33 per cent probability) that the rate exceeded 7 m kyr−1 and extremely unlikely (5 per cent probability) that it exceeded 11 m kyr−1. These rate estimates can provide insight into rates of Greenland and/or Antarctic melt under climate conditions partially analogous to those expected in the 21st century.

Sequestration already appears to have significant positive effects for the economy, as the massive 2% cut in the EPA budget this year leads to the rogue agency warning that staff furloughs "could create slowdowns in some of the agency's ongoing projects at a time when Obama has signaled that federal agencies will play a leading role in carrying out his promise to respond to the threat of climate change."

WASHINGTON (Reuters) - The acting head of the U.S. Environmental Protection Agency (EPA) warned staff on Tuesday that it may place an unspecified number of jobs on temporary furlough if across-the-board federal budget cuts take effect at the end of this week.

Bob Perciasepe, acting administrator of the EPA, wrote in an email that despite taking early measures to cut agency spending on contracts, grants and administration in recent months, furloughs are inevitable.

"Even with these actions, the arbitrary nature of the required budget cuts of sequestration would force us to implement employee furloughs over the remainder of the fiscal year, ending on September 30, 2013," Perciasepe wrote.

President Barack Obama and lawmakers in Congress have yet to resolve how to avoid the deep [2% is "deep"?] automatic spending cuts due on March 1, known as "sequestration."

Perciasepe said the agency will provide employees with 30 days notice before any furlough process begins. The EPA will try to minimize the burden on staff while trying to meet its regulatory obligations, he added.

The agency is also meeting with its national unions to prepare a plan, Perciasepe said.

The Energy Department warned workers of furloughs on February 7.

Furloughs at the EPA could create slowdowns in some of the agency's ongoing projects at a time when Obama has signalled that federal agencies will play a leading role in carrying out his promise to respond to the threat of climate change.

Among other things, the EPA is due to finalize rules to reduce greenhouse gas emissions from new power plants within a few months.

Two wind turbines towering above the Cape Cod community of Falmouth, Mass., were intended to produce green energy and savings -- but they've created angst and division, and may now be removed at a high cost as neighbors complain of noise and illness.

"It gets to be jet-engine loud," said Falmouth resident Neil Andersen. He and his wife Betsy live just a quarter mile from one of the turbines. They say the impact on their health has been devastating. They're suffering headaches, dizziness and sleep deprivation and often seek to escape the property where they've lived for more than 20 years.

"Every time the blade has a downward motion it gives off a tremendous energy, gives off a pulse," said Andersen. "And that pulse, it gets into your tubular organs, chest cavity, mimics a heartbeat, gives you headaches. It's extremely disturbing and it gets to the point where you have to leave."

The first turbine went up in 2010 and by the time both were in place on the industrial site of the town's water treatment facility, the price was $10 million. Town officials say taking them down will cost an estimated $5 million to $15 million, but that is just what Falmouth's five selectmen have decided to move toward doing.

"The selectmen unanimously voted to remove them. We think it's the right thing to do, absolutely," Selectman David Braga said. "You can't put a monetary value on people's health and that's what's happened here. A lot of people are sick because of these."

Now the matter will go to a town meeting vote in April and could ultimately end up on the ballot during the municipal elections in May.

"It's highly likely that what the voters will be determining is are they willing to tax themselves at an appropriate amount to cover the cost and dismantle and shut down the turbines?" Falmouth Town Manager Julian Suso said.

In the meantime, the turbines are being run on a limited schedule as the selectmen respond to the concerns of nearby neighbors. The turbines only run during the day -- from 7 a.m. to 7 p.m. -- which means they're operating at a loss.

The dispute has been a bitter three-year battle in the seaside town where officials argue the project was thoroughly vetted, researched and put to public vote multiple times.

"To say 'let's let the voters decide' -- it sort of flies in the face of what we went through all these years," said Megan Amsler of the Falmouth Energy Committee.

2/26/13 - Today, the National Association of Manufacturers (NAM) released a study conducted by NERA Economic Consulting that shows a carbon tax would have a devastating impact on manufacturing and jobs. The report, titled Economic Outcomes of a U.S. Carbon Tax, found that levying such a tax would impact millions of jobs and result in higher prices for natural gas, electricity, gasoline and other energy commodities. Manufacturing output in energy-intensive sectors could drop by as much as 15.0 percent and in non-energy-intensive sectors by as much as 7.7 percent.

“The notion that some policymakers have in Washington that an economy-wide tax of this nature is a good idea is flatly wrong,” said NAM President and CEO Jay Timmons. “Our nation’s economy and family budgets can’t take it. As consumers of one-third of our nation’s energy supply, manufacturers and our employees will struggle with higher energy prices. A carbon tax will severely harm our ability to compete with other nations.”

Other key findings of the report include the following:

A carbon tax would lead to lower real wage rates because companies would have higher costs and lower labor productivity. Over time, workers’ incomes could decline relative to baseline levels by as much as 8.5 percent.

The impact on jobs would be substantial, with a loss of worker income equivalent to between 1.3 million and 1.5 million jobs in 2013 and between 3.8 million and 21 million by 2053.

Any revenue raised from the carbon tax would be far outweighed by the negative effects on the economy.

A carbon tax would have a negative effect on consumption, investment and jobs, resulting in lower federal revenue from taxes on capital and labor.

The increased costs of coal, natural gas and petroleum products due to a carbon tax would ripple throughout the economy, resulting in higher production costs and less spending on non-energy goods.

“For manufacturers, a carbon tax would cause a net negative impact on output and productivity as the higher energy costs it imposes would ripple through all their supply chains,” said NERA Senior Vice President Anne E. Smith who conducted the research for the NAM. “In turn, higher production costs and reduction in output would ripple through the rest of the economy, reducing household incomes and consumption. A carbon tax would negatively impact the U.S. economy as a whole under both scenarios examined in this study.”

The study looks at two carbon tax scenarios: one levied at $20 per ton and increasing at 4 percent, and the other designed to reduce carbon dioxide (CO2) emissions by 80 percent. Both cases would have a negative impact on the economy. Please click on the links for the executive summary and full report and for information on 10 hard-hit states.

Considered en masse, Vanuytrecht et al. determined that for an approximate 200-ppm increase in the air's CO2 concentration (the mean enhancement employed in the studies they analyzed), water productivity improved by 23% in the case of above-ground biomass production per unit of water lost to evapotranspiration, and by 27% in the case of above-ground yield produced per unit of water lost to evapotranspiration. These two productivity increases would roughly correspond to enhancements of 34% and 40% for a 300-ppm increase in the atmosphere's CO2 concentration.
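
The 300-ppm figures above follow from linearly scaling the ~200-ppm results; a minimal sketch of that scaling (linearity over this range is a simplifying assumption, matching the "roughly correspond" hedge, and the variable names are mine):

```python
# Linear scaling of the meta-analysis figures quoted above from a ~200-ppm
# CO2 enrichment to a 300-ppm enrichment.
base_enrichment = 200.0    # ppm, mean CO2 increase in the analyzed studies
target_enrichment = 300.0  # ppm

biomass_gain = 23.0  # % gain in biomass water productivity at +200 ppm
yield_gain = 27.0    # % gain in yield water productivity at +200 ppm

scale = target_enrichment / base_enrichment  # 1.5
print(biomass_gain * scale)  # 34.5 -> reported above as ~34%
print(yield_gain * scale)    # 40.5 -> reported above as ~40%
```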

It is also important to note in this regard that although "the FACE technique avoids the potential limitations of (semi-) closed systems by studying the influence of elevated CO2 on crop growth in the field without chamber enclosure," as the team of Belgian researchers write, other studies have demonstrated a significant problem caused by the rapid (sub-minute) fluctuations of CO2 concentration about a target mean that are common to most FACE experiments. As described by Bunce (2011, 2012), who found most recently that total shoot biomass of vegetative cotton plants in a typical FACE study averaged 30% less than in a constantly-elevated CO2 treatment at 27 days after planting, while wheat grain yields were 12% less in a fluctuating CO2 treatment than in a constant elevated-CO2 concentration treatment.

Looking toward the future, getting higher crop yields per unit of water used in the process of obtaining them will be a key element of mankind's struggle to feed our ever-increasing numbers over the next four decades, when our food needs are expected to double (Parry and Hawkesford, 2010); and with both land and water shortages looming on the horizon, we are going to need all of the help we can possibly get to grow the extra needed food. Fortunately, the results of this meta-analysis coming out of Belgium point to one important avenue by which such very substantial help can come, but it will only come if the air's CO2 content is allowed to rise unimpeded.

The realistic limits on wind power are probably much lower than scientists have suggested, according to new research, so much so that the ability of wind turbines to have any serious impact on energy policy may well be in doubt. Even if money were no object, the human race would hit Peak Wind output at a much lower level than has previously been thought.

This new and gloomy analysis for global wind power comes from Professor David Keith of the Harvard School of Engineering and Applied Sciences. The prof and his collaborator, Professor Amanda Adams of North Carolina uni, have weighed into a row which has been taking place for some years between crusading pro-wind physicists and their critics.

The pro-wind boffins, led by such figures as Harvard enviro-prof Michael McElroy and Mark Jacobson of Stanford, have long contended that if there is any upper limit on the amount of energy that could be extracted from the Earth's winds it is well above the amount the human race requires. They further contend that extracting these vast amounts of power from the atmosphere will not have any serious impact on the world's climate.

Both these assertions, however, have been called into doubt - and the first one, that there's plenty of wind power to meet all human demands, is particularly shaky as it ignores the thorny issue of cost. McElroy, Jacobson and their allies tend to make wild assumptions - for instance that it would be feasible to distribute massive wind turbines across most or even all of the planet's surface.

Professor Keith has some scathing criticism for these ideas. To start with, he says that most large-scale wind potential calculations thus far have simply ignored the problem that the possible massive wind farms of the future are going to result in much less powerful winds for long distances behind them. He and Professor Adams write:

Estimates of global wind resource that ignore the impact of wind turbines on slowing the winds may substantially overestimate the total resource. In particular, the results from three studies that estimated wind power capacities of 56, 72 and 148 TW respectively appear to be substantial overestimates given the comparison between model results and the assumptions these studies made about power production densities ... To cite a specific example, Archer and Jacobson assumed a power production density of 4.3 W m-2 ... production densities are not likely to substantially exceed 1 W m-2 implying that Archer and Jacobson may overestimate capacity by roughly a factor of four.

Peak Wind

Keith and Adams are referring to Archer and Jacobson's paper last year [1], in which they suggested that a "practical" windpower system of the future - employing 4 million wind towers spread all round the world to avoid damage to the environment (!) - might yield average output of 7.5 terawatts over time.

Professor David Keith.

As we pointed out at the time - we not being top physicists here at the Reg, but at least knowing what a Watt is - this is actually far less energy than the human race now requires, and wildly less than the amount of energy it would require if it were to build and maintain a colossal worldwide grid of enormous steel and carbon towers sunk into heavy concrete foundations along with the necessary associated world-spanning interconnectors, grid extensions, transport access into remote wilderness etc etc.

Harvard uni now informs us:

Keith’s research has shown that the generating capacity of very large wind power installations (larger than 100 square kilometers) may peak at between 0.5 and 1 watts per square meter.

As opposed to the 4+ watts assumed by Archer and Jacobson. In other words we'll be hitting Peak Wind a lot sooner than anyone thought. Archer and Jacobson's ridiculous unbuildable world wind project - which seemed likely to cost substantially more than the entire human race's economic output - would actually produce as little as one-eighth of what they think: and that was only a quarter of the amount of power that the human race might reasonably ask for (ie, say two-thirds of what a present-day European uses for everyone). So it would be able to provide about 3 per cent of global energy requirements, or well under a terawatt.
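
The "one-eighth" figure works out directly from the power-density ratio; a minimal sketch (variable names are mine; the densities and the 7.5 TW figure are those quoted above):

```python
# Scale Archer & Jacobson's claimed output by the ratio of Keith and Adams'
# realistic power densities to the density Archer & Jacobson assumed.
claimed_output_tw = 7.5  # TW, Archer & Jacobson's "practical" estimate
assumed_density = 4.3    # W/m^2, assumed by Archer & Jacobson
keith_low, keith_high = 0.5, 1.0  # W/m^2, Keith's range for large farms

low = claimed_output_tw * keith_low / assumed_density
high = claimed_output_tw * keith_high / assumed_density
print(f"realistic output: {low:.2f} to {high:.2f} TW")  # ~0.87 to ~1.74 TW
print(f"worst-case overestimate: {assumed_density / keith_low:.1f}x")  # 8.6x
```

At the low end of Keith's range, the project delivers roughly one-eighth of the claimed 7.5 TW.
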

And we have to bear in mind that in the real world things are much worse still for windy dreams. Professors Keith and Adams go on:

Total wind power capacity can — of course — be very large if one assumes that turbines are placed over the entire land surface or even over the land and ocean surface, but while these geophysical limits are scientifically interesting their relevance to energy policy is unclear.

More policy-driven wind power capacity estimates have restricted the area considered ... Yet these estimates have used power production densities that are several times larger than the wind power production limit of around 1 W m-2 ... It is therefore plausible that wind power capacity may be limited to an extent that is relevant to energy policy.

It should be made clear that Professor Keith is starting from the position that global warming is still definitely on (at some point, it has been stalled for well over a decade) and that humanity must go carbon-free or anyway carbon-very-low within a lifetime, generating "several tens of terawatts" of low to nil-carbon power on that timescale. The professor is merely pointing out that wind certainly can't do anything like the whole job in that scenario, and it may not actually be able to do very much at all.

“It’s worth asking about the scalability of each potential energy source," says the prof, "whether it can supply, say, 3 terawatts, which would be 10 percent of our global energy need, or whether it’s more like 0.3 terawatts and 1 percent."

It's definitely looking as though we would be hitting Peak Wind down at the low end of that range. And with wind very much the poster child of renewable power - it is cheap, scalable and practical compared to the other methods - that would seem to be the effective end for the dream of a renewables-powered future for humanity.

Monday, February 25, 2013

A new paper published in The Holocene finds the Antarctic interior was warmer than the present over a period lasting about 2000 years, from 4300 to 2250 years ago. The paper surveys 3 lakes within 10° of the South Pole and finds meltwater levels were up to 69.5 meters higher than the present, which the authors "interpreted as being the result of an increased number of meltwater events and/or degree-days above freezing, relative to the present." The paper adds to many other peer-reviewed papers demonstrating that Antarctica has warmed for prolonged periods to temperatures higher than the present, and that there is nothing unusual, unnatural, or unprecedented in regard to present day temperatures.

Abstract

We surveyed and dated the former shorelines of one lake in the Shackleton Range and two lakes in the Pensacola Mountains, situated inland of the Weddell Sea embayment, Antarctica, between 80° and 85°S. These are amongst the highest latitude lakes in the Antarctic and are located in areas where there is little or no Holocene climate and hydrological information. Surveys of the lake shorelines show that past water levels have been up to 15.7, 17.7 and 69.5 m higher than present in the three study lakes. AMS radiocarbon dating of lake-derived macrofossils showed that there was a sustained period of higher water levels from approximately 4300 until sometime after 2250 cal. yr before the present. This is interpreted as being the result of an increased number of meltwater events and/or degree-days above freezing, relative to the present. The closest comparable ice cores from the Dominion Range in the Transantarctic Mountains (85°S, 166°E) and the Plateau Remote ice core on the continental East Antarctic Ice Sheet (84°S, 43°E) also provide some evidence of a warmer period beginning at c. 4000–3500 yr BP and ending after 2000–1500 yr BP, as does a synthesis of oxygen isotope data from five Antarctic ice cores. This suggests that the well-documented mid- to late-Holocene warm period, measured in many lake and marine sediments around the coast of Antarctica, extended into these regions of the continental interior.

A multi-criteria score-based method is developed to assess General Circulation Model (GCM) performance at the regional scale. Application of the method, assessing 25 GCMs' simulations of monthly mean sea level pressure (MSLP) and air temperature, and monthly and annual rainfall over the southeastern Australia region for 1960/1–1999/2000, indicates that GCMs usually simulate monthly temperature better than monthly rainfall and mean sea level pressure. For example, the mean observed annual temperature for the study region is 16.7 °C while the median and mean values of the 25 GCMs are 16.8 and 16.9 °C respectively, and 24 GCMs (all except BCC:CM1) reproduce the annual cycle of temperature accurately, with a minimum correlation coefficient of 0.99. In contrast, the mean observed annual rainfall for the study region is 502 mm, whereas the GCM values vary from 195 to 807 mm, and 12 out of 25 GCMs produce a negative correlation coefficient for the annual cycle of monthly rainfall. Moreover, GCMs overestimate the trend magnitude for temperature but underestimate it for rainfall. The observed annual temperature trend is +0.007 °C/yr, while both the median and mean [modeled] values are +0.013 °C/yr, almost double the observed magnitude. The observed annual rainfall trend is +0.62 mm/yr, while the median and mean values of the 25 GCMs are 0.21 and 0.36 mm/yr, respectively. This demonstrates the advantages of using multiple criteria to assess GCM performance. The method developed in this study can easily be extended to different study regions, and the results can be used for better informed regional climate change impact analysis.
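Two of the criteria the abstract leans on, the annual-cycle correlation coefficient and the linear trend, are standard statistics. The sketch below illustrates how they might be computed; the monthly climatologies are made-up values for illustration, not the paper's data.

```python
# Sketch of two GCM-assessment criteria: annual-cycle correlation and
# least-squares trend. Data below are hypothetical, not from the paper.
import statistics

def pearson(xs, ys):
    # Pearson correlation coefficient between two equal-length series.
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def linear_trend(values):
    # Ordinary least-squares slope per time step (e.g. deg C per year).
    n = len(values)
    t = list(range(n))
    mt, mv = statistics.mean(t), statistics.mean(values)
    num = sum((ti - mt) * (vi - mv) for ti, vi in zip(t, values))
    den = sum((ti - mt) ** 2 for ti in t)
    return num / den

# Hypothetical observed vs. modeled monthly temperature climatology (deg C):
obs = [22, 21, 19, 16, 13, 10, 9, 10, 12, 15, 18, 21]
mod = [23, 22, 20, 17, 13, 11, 10, 11, 13, 16, 19, 22]
print(pearson(obs, mod))  # near 1.0 when the model captures the annual cycle
```

A correlation near 0.99 or higher, as reported for 24 of the 25 GCMs' temperature cycles, means the shape of the seasonal cycle is captured even when the mean or trend is off, which is why the paper scores several criteria rather than one.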

A guest post today at a Dutch climate blog by Dr. John Christy notes that climate "models, on average, depict the last 34 years as warming about 1.5 times what actually occurred" and that the model predictions diverge from observations by even more [2.5 times] in the atmospheric layers most affected by greenhouse gases [the missing 'hot spot']. According to Dr. Christy, "Since this increased warming in the upper layers is a signature of greenhouse gas forcing in models, and it is not observed, this raises questions about the ability of models to represent the true vertical heat flux processes of the atmosphere and thus to represent the climate impact of the extra greenhouse gases we are putting into the atmosphere" and "models, on average, have been overly sensitive" to the effect of greenhouse gases.

Of equal importance here are the magnitudes of the actual trends of the surface and troposphere. The average global surface trend for 90 model simulations for 1979-2012 (Climate Model Intercomparison Project 5 or CMIP-5 used for IPCC AR5) is +0.232 °C/decade. The average of the observations is +0.157 °C/decade. Therefore models, on average, depict the last 34 years as warming about 1.5 times what actually occurred. Santer et al. 2012 (for 1979-2011 model output) noted that a subset of CMIP-5 models produce warming in LT that is 1.9 times observed, and for a deeper layer of the atmosphere (mid-troposphere, surface to about 18 km) the models warm the air 2.5 times that of observations. These are significant differences, implying the climate sensitivity of models is too high.

Signature

All of the above addresses the two issues mentioned at the beginning. First, global climate models on average depict a relationship between the surface and upper air that is different than that observed, i.e. models depict an amplifying factor into the upper air that is greater than observed. Secondly, the average climate model depicts the warming rate since 1979 as much higher than observed with increasing discrepancies as the altitude increases (which is consistent with the first issue).

Since this increased warming in the upper layers is a signature of greenhouse gas forcing in models, and it is not observed, this raises questions about the ability of models to represent the true vertical heat flux processes of the atmosphere and thus to represent the climate impact of the extra greenhouse gases we are putting into the atmosphere. It is not hard to imagine that as the atmosphere is warmed by whatever means (i.e. extra greenhouse gases), existing processes which naturally expel heat from the Earth (i.e. negative feedbacks) can be more vigorously engaged and counteract the direct warming of the forcing. This result is related to the idea of climate sensitivity, i.e. how sensitive the surface temperature is to higher greenhouse forcing, for which several recent publications suggest models, on average, have been overly sensitive.