Based upon 14 field surveys conducted between 2003 and 2008, the authors find that "most areas of the South China Sea proper served as weak to moderate sources of the atmospheric CO2" and that "overall the four [regions of the South China Sea] contributed (18 ± 10) × 1012 g C yr−1 to the atmospheric CO2."

Based upon 14 field surveys conducted between 2003 and 2008, we showed that the seasonal pattern of sea surface partial pressure of CO2 (pCO2) and sea–air CO2 fluxes differed among four different physical–biogeochemical domains in the South China Sea (SCS) proper. The four domains were located between 7 and 23° N and 110 and 121° E, covering a surface area of 1344 × 103 km2 and accounting for ~ 54% of the SCS proper. In the area off the Pearl River estuary, relatively low pCO2 values of 320 to 390 μatm were observed in all four seasons and both the biological productivity and CO2 uptake were enhanced in summer in the Pearl River plume waters. In the northern SCS slope/basin area, a typical seasonal cycle of relatively high pCO2 in the warm seasons and relatively low pCO2 in the cold seasons was revealed. In the central/southern SCS area, moderately high sea surface pCO2 values of 360 to 425 μatm were observed throughout the year. In the area west of the Luzon Strait, a major exchange pathway between the SCS and the Pacific Ocean, pCO2 was particularly dynamic in winter, when the northeast monsoon induced upwelling events and strong outgassing of CO2. These episodic events might have dominated the annual sea–air CO2 flux in this particular area. The estimate of annual sea–air CO2 fluxes showed that most areas of the SCS proper served as weak to moderate sources of the atmospheric CO2, with sea–air CO2 flux values of 0.46 ± 0.43 mol m−2 yr−1 in the northern SCS slope/basin, 1.37 ± 0.55 mol m−2 yr−1 in the central/southern SCS, and 1.21 ± 1.48 mol m−2 yr−1 in the area west of the Luzon Strait. However, the annual sea–air CO2 exchange was nearly in equilibrium (−0.44 ± 0.65 mol m−2 yr−1) in the area off the Pearl River estuary. Overall the four domains contributed (18 ± 10) × 1012 g C yr−1 to the atmospheric CO2.
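The headline figure can be sanity-checked with a unit conversion: a flux in mol C m−2 yr−1, times the surface area, times the molar mass of carbon, gives grams of carbon per year. A minimal sketch, assuming an area-weighted mean flux of roughly 1.1 mol m−2 yr−1 over the stated 1344 × 103 km2 (the per-domain areas are not given in the abstract, so this mean is an assumption):

```python
# Sanity check: convert a sea-air CO2 flux to an annual carbon mass.
MOLAR_MASS_C = 12.011        # g of carbon per mol
area_m2 = 1344e3 * 1e6       # 1344 x 10^3 km^2 expressed in m^2
mean_flux = 1.1              # mol m^-2 yr^-1 (assumed area-weighted mean)

total_g_c_per_yr = mean_flux * area_m2 * MOLAR_MASS_C
print(f"{total_g_c_per_yr:.2e} g C/yr")  # close to the reported 18 x 10^12
```

The result lands within about 2% of the reported (18 ± 10) × 1012 g C yr−1, comfortably inside the stated uncertainty.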

Thursday, November 28, 2013

A new paper published in Geophysical Research Letters demonstrates the pathetic state of climate modelling of clouds. According to the paper, the only way for the modellers to reproduce cloud effects on observed temperatures was to use microphysical properties of clouds that were the most divergent from satellite observations. In other words, to "tune" the model to reproduce one observation [temperature] results in detuning and non-reproduction of another observation [cloud microphysical properties]. The models thus remain in a state of infancy regarding clouds, one of the most important parameters required for climate projections. A mere 1 to 2% cloud modelling error can alone account for global warming or cooling, and these computer modelling games are nowhere close to achieving such a level of accuracy.

According to Suzuki et al. (2013), "climate models contain various uncertain parameters in the formulations of parameterizations for physical processes," and they say that "these parameters represent 'tunable knobs' that are typically adjusted to let the models reproduce realistic values of key-observed climate variables."

Against this backdrop, Suzuki et al. examined "the validity of a tunable cloud parameter, the threshold particle radius triggering the warm rain formation in a climate model." And the model they chose for this purpose was the Geophysical Fluid Dynamics Laboratory (GFDL) Coupled Climate Model version 3 (CM3), because it is known that alternate values of that model's tunable cloud parameter that fall within its real-world range of uncertainty "have been shown to produce severely different historical temperature trends due to differing magnitudes of aerosol [cloud] indirect forcing."

The results of the three researchers' analysis indicated that "the simulated temperature trend best matches [the] observed trend when the model adopts the threshold radius that worst reproduces satellite-observed microphysical statistics and vice versa." Of this finding the three researchers state, "this inconsistency between the 'bottom-up' process-based constraint and the 'top-down' temperature trend constraint implies the presence of compensating errors in the model." And they note that "if this behavior is not a peculiarity of the GFDL CM3, the contradiction may be occurring in other climate models as well," which is not what one would want to find.

Wednesday, November 27, 2013

Nov. 27, 2013 — The study, published in Geophysical Research Letters, reports the discovery of two subglacial lakes 800 metres below the surface of the Greenland Ice Sheet. The two lakes are each roughly 8-10 km2 in area, and at one point may have been up to three times larger than their current size.

Subglacial lakes are likely to influence the flow of the ice sheet, impacting global sea level change. The discovery of the lakes in Greenland will also help researchers to understand how the ice will respond to changing environmental conditions.

The study, conducted at the Scott Polar Research Institute (SPRI) at the University of Cambridge, used airborne radar measurements to reveal the lakes underneath the ice sheet.

Lead author Dr Steven Palmer, formerly of SPRI and now at the University of Exeter, stated: "Our results show that subglacial lakes exist in Greenland, and that they form an important part of the ice sheet's plumbing system. Because the way in which water moves beneath ice sheets strongly affects ice flow speeds, improved understanding of these lakes will allow us to predict more accurately how the ice sheet will respond to anticipated future warming."

The lakes are unusual compared with those detected beneath Antarctic ice sheets, suggesting that they formed in a different manner. The researchers propose that, unlike in Antarctica where surface temperatures remain below freezing all year round, the newly discovered lakes are most likely fed by melting surface water draining through cracks in the ice. A surface lake situated nearby may also replenish the subglacial lakes during warm summers. This means that the lakes are part of an open system and are connected to the surface, which is different from Antarctic lakes that are most often isolated ecosystems.

While nearly 400 lakes have been detected beneath the Antarctic ice sheets, these are the first to be identified in Greenland. The apparent absence of lakes in Greenland had previously been explained by the fact that the steeper ice surface in Greenland leads to any water below the ice being 'squeezed out' to the margin. The ice in Greenland is also thinner than that in Antarctica, resulting in colder temperatures at the base of the ice sheet. This means that any lakes that may have previously existed would have frozen relatively quickly. The thicker Antarctic ice can act like an insulating blanket, preventing the freezing of water trapped underneath the surface.

As many surface melt-water lakes form each summer around the Greenland ice sheet, the possibility exists that similar subglacial lakes may be found elsewhere in Greenland. The way in which water flows beneath the ice sheet strongly influences the speed of ice flow, so the existence of other lakes will have implications for the future of the ice sheet.

The US's good luck with respect to hurricane landfalls -- yes, good luck -- continues. The graph below shows total US hurricane landfalls from 1900 through 2013.

The five-year period ending 2013 has seen 2 hurricane landfalls. That is a record low since 1900. Two other five-year periods have seen 3 landfalls (years ending in 1984 and 1994). Prior to 1970 the fewest landfalls over a five-year period was 6. From 1940 to 1957, every 5-year period had more than 10 hurricane landfalls (1904-1920 was almost as active).
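The five-year totals quoted here are simple rolling sums over the annual landfall counts; a minimal sketch (the counts below are hypothetical, for illustration only, not the real record):

```python
def five_year_totals(annual_counts):
    """Rolling 5-year landfall totals for a list of annual counts
    ordered by year; entry i covers the window ending at year i."""
    return [sum(annual_counts[i - 4:i + 1]) for i in range(4, len(annual_counts))]

# Hypothetical annual landfall counts for eight consecutive years:
print(five_year_totals([3, 1, 0, 2, 0, 0, 1, 1]))  # → [6, 3, 3, 4]
```

Each entry is the total over the five years ending in that year, which is how record-low windows like the one ending in 2013 are identified.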

The red line in the graph above shows a decrease in the number of US landfalls of more than 25% since 1900 (which, given variability, may just be an artifact and not reflect a secular change). There is no evidence to support more frequent or more intense US hurricanes. The data actually suggest much the opposite.

If you are interested in a global perspective, Ryan Maue keeps excellent data. Here is his latest graph on global ACE (accumulated cyclone energy, an overall measure of storm intensity).
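For reference, ACE is conventionally computed as 10⁻⁴ times the sum of the squares of a storm's 6-hourly maximum sustained winds (in knots), counted while the system is at tropical-storm strength or greater; a minimal sketch with a hypothetical wind history:

```python
def ace(six_hourly_winds_kt):
    """Accumulated cyclone energy: 1e-4 times the sum of squared 6-hourly
    maximum sustained winds (knots), counted only while the system is at
    tropical-storm strength (>= 35 kt)."""
    return 1e-4 * sum(v ** 2 for v in six_hourly_winds_kt if v >= 35)

# Hypothetical storm: two days of 6-hourly wind observations
print(ace([40, 55, 70, 80, 80, 70, 55, 40]))  # → 3.185
```

Seasonal or global ACE is just this quantity summed over all storms, which is what makes it a combined measure of number, intensity, and duration.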

New Report Concludes That Extreme Weather Events Are Not Increasing

A Review Of The State Of Science

A new report published today by the Global Warming Policy Foundation concludes that there has been no increase in extreme weather events in recent decades.

Whenever an extreme weather event (such as a heat-wave, a flood, a drought or a tropical storm) is widely reported by the news media, a heated debate about its possible link with global warming is set off.

The latest example of this kind of speculation was triggered by the disastrous typhoon Haiyan that killed thousands of people in the Philippines in early November.

In his report The Global Warming-Extreme Weather Link: A Review Of The State Of Science, Dr Madhav Khandekar, a former meteorologist with Environment Canada, examines several recent extreme weather events and discusses them in the context of the ongoing climate debate.

Tuesday, November 26, 2013

Meteorologists’ views about global warming: A survey of American Meteorological Society professional members

Meteorologists and other atmospheric science experts are playing important roles in helping society respond to climate change. Members of this professional community are not unanimous in their views of climate change, and there has been tension among members of the American Meteorological Society (AMS) who hold different views on the topic.

In January 2012, the AMS surveyed its members via email and found 52 percent believe global warming is happening and is mostly human-caused, while 48 percent do not. The survey also found that scientists with professed liberal political views were far more likely to believe global warming is human-caused than others.

Authors of the survey recommended that the AMS should “acknowledge and explore the uncomfortable fact that political ideology influences the climate change views of meteorology professionals; refute the idea that those who do hold non-majority views just need to be “educated” about climate change; [and] continue to deal with the conflict among members of the meteorology community.”

After confidently predicting that the 2013 hurricane season had only a 5% chance of being below normal, NOAA admitted in a press release yesterday that 2013 was much below normal, with the fewest hurricanes since 1982 and zero major hurricanes forming in the Atlantic basin.

No major hurricanes formed in the Atlantic basin - first time since 1994

The 2013 Atlantic hurricane season, which officially ends on Saturday, Nov. 30, had the fewest number of hurricanes since 1982, thanks in large part to persistent, unfavorable atmospheric conditions over the Gulf of Mexico, Caribbean Sea, and tropical Atlantic Ocean. This year is expected to rank as the sixth-least-active Atlantic hurricane season since 1950, in terms of the collective strength and duration of named storms and hurricanes.

“A combination of conditions acted to offset several climate patterns that historically have produced active hurricane seasons,” said Gerry Bell, Ph.D., lead seasonal hurricane forecaster at NOAA’s Climate Prediction Center, a division of the National Weather Service. “As a result, we did not see the large numbers of hurricanes that typically accompany these climate patterns.”

Thirteen named storms formed in the Atlantic basin this year. Two, Ingrid and Humberto, became hurricanes, but neither became major hurricanes. Although the number of named storms was above the average of 12, the numbers of hurricanes and major hurricanes were well below their averages of six and three, respectively. Major hurricanes are categories 3 and above.

Suomi NPP satellite peers into Tropical Storm Andrea, the first storm of the season.

Tropical storm Andrea, the first of the season, was the only named storm to make landfall in the United States this year. Andrea brought tornadoes, heavy rain, and minor flooding to portions of Florida, eastern Georgia and eastern South Carolina, causing one fatality.

The 2013 hurricane season was only the third below-normal season in the last 19 years, since 1995, when the current high-activity era for Atlantic hurricanes began.

“This unexpectedly low activity is linked to an unpredictable atmospheric pattern that prevented the growth of storms by producing exceptionally dry, sinking air and strong vertical wind shear in much of the main hurricane formation region, which spans the tropical Atlantic Ocean and Caribbean Sea,” said Bell. “Also detrimental to some tropical cyclones this year were several strong outbreaks of dry and stable air that originated over Africa.”

Unlike the U.S., which was largely spared this year, Mexico was battered by eight storms, including three from the Atlantic basin and five from the eastern North Pacific. Of these eight landfalling systems, five struck as tropical storms and three as hurricanes.

NOAA and the U.S. Air Force Reserve flew 45 hurricane hunter aircraft reconnaissance missions over the Atlantic basin this season, totaling 435 hours--the fewest number of flight hours since at least 1966.

NOAA will issue its 2014 Atlantic Hurricane Outlook in late May, prior to the start of the season on June 1.

Flashback from NOAA 5/23/13:

2013 Atlantic Hurricane Season Outlook: Summary

NOAA’s 2013 Atlantic Hurricane Season Outlook indicates that an above-normal season is most likely, with the possibility that the season could be very active. The outlook calls for a 70% chance of an above-normal season, a 25% chance of a near-normal season, and only a 5% chance of a below-normal season. See NOAA definitions of above-, near-, and below-normal seasons, which have been slightly modified from previous years. The Atlantic hurricane region includes the North Atlantic Ocean, Caribbean Sea, and Gulf of Mexico.

This combination of climate factors historically produces above-normal Atlantic hurricane seasons. The 2013 hurricane season could see activity comparable to some of the very active seasons since 1995.

Based on the current and expected conditions, combined with model forecasts, we estimate a 70% probability for each of the following ranges of activity during 2013:

13-20 Named Storms

7-11 Hurricanes

3-6 Major Hurricanes

Accumulated Cyclone Energy (ACE) range of 120%-205%

The seasonal activity is expected to fall within these ranges in 70% of seasons with similar climate conditions and uncertainties to those expected this year. These ranges do not represent the total possible ranges of activity seen in past similar years.

Note that the expected ranges are centered well above the official NHC 1981-2010 seasonal averages of 12 named storms, 6 hurricanes, and 3 major hurricanes.

NOAA predicts active 2013 Atlantic hurricane season

Era of high activity for Atlantic hurricanes continues

May 23, 2013

Hurricane Sandy as seen from NOAA's GOES-13 satellite on October 28, 2012.

For the six-month hurricane season, which begins June 1, NOAA’s Atlantic Hurricane Season Outlook says there is a 70 percent likelihood of 13 to 20 named storms (winds of 39 mph or higher), of which 7 to 11 could become hurricanes (winds of 74 mph or higher), including 3 to 6 major hurricanes (Category 3, 4 or 5; winds of 111 mph or higher).

These ranges are well above the seasonal average of 12 named storms, 6 hurricanes and 3 major hurricanes.

“With the devastation of Sandy fresh in our minds, and another active season predicted, everyone at NOAA is committed to providing life-saving forecasts in the face of these storms and ensuring that Americans are prepared and ready ahead of time.” said Kathryn Sullivan, Ph.D., NOAA acting administrator. “As we saw first-hand with Sandy, it’s important to remember that tropical storm and hurricane impacts are not limited to the coastline. Strong winds, torrential rain, flooding, and tornadoes often threaten inland areas far from where the storm first makes landfall.”

Three climate factors that strongly control Atlantic hurricane activity are expected to come together to produce an active or extremely active 2013 hurricane season. These are:

A continuation of the atmospheric climate pattern, which includes a strong west African monsoon, that is responsible for the ongoing era of high activity for Atlantic hurricanes that began in 1995;

Warmer-than-average water temperatures in the tropical Atlantic Ocean and Caribbean Sea; and

El Niño is not expected to develop and suppress hurricane formation.

Hurricanes used to hit the US almost every year, but in recent years hurricanes have been much less common. The US was hit by at least one hurricane every year from 1938 to 1950, and again from 1952 to 1961. By contrast, there were no US hurricane strikes in 2006, 2009, 2010 or 2013.

Monday, November 25, 2013

A new post by Dan Pangburn shows that solar activity explains 95% of global temperature change over the 403 years since 1610, including the recovery from the Little Ice Age. Change to the level of atmospheric carbon dioxide was found to have no significant influence.

This monograph is a clarification and further refinement of Reference 10 which also considers only average global temperature. It does not discuss weather, which is a complex study of energy moving about the planet. It does not even address local climate, which includes precipitation. It does, however, consider the issue of Global Warming and the mistaken perception that human activity has a significant influence on it.

The word ‘trend’ is used here for measured temperatures in two different contexts. To differentiate, α-trend applies to averaging-out the uncertainties in reported average global temperature measurements to produce the average global temperature oscillation resulting from the net ocean surface oscillation. The term β-trend applies to averaging-out the average global temperature oscillation to produce the slower average temperature change of the planet which is associated with change to the temperature of the bulk volume of the water involved.

The first paper to suggest the hypothesis that the sunspot number time-integral is a proxy for a substantial driver of average global temperature change was made public 6/1/2009. The discovery started with application of the first law of thermodynamics, conservation of energy, and the hypothesis that the energy acquired, above or below breakeven (appropriately accounting for energy radiated from the planet), is proportional to the time-integral of sunspot numbers. The derived equation revealed a rapid and sustained global energy rise starting in about 1941. The average global temperature anomaly change β-trend is proportional to global energy change.
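The exact form of Pangburn's equation is not reproduced in this excerpt, but the time-integral idea itself is simple: accumulate each year's sunspot number minus a long-term average, then scale the running sum. A minimal sketch of the mechanism, assuming an annual series and the coefficients quoted later in the text (B = 0.003503, average sunspot number 34, offset D = −0.4475); this illustrates the approach, not the author's actual Equation (1):

```python
def sunspot_integral_anomaly(annual_ssn, avg=34.0, B=0.003503, D=-0.4475):
    """Running time-integral of (sunspot number - long-term average),
    scaled by B and offset by D, as a temperature-anomaly proxy."""
    total, anomalies = 0.0, []
    for ssn in annual_ssn:
        total += ssn - avg          # energy gained above/below break-even
        anomalies.append(B * total + D)
    return anomalies

# Years exactly at the average accumulate nothing: the anomaly stays at D.
print(sunspot_integral_anomaly([34, 34, 34]))  # → [-0.4475, -0.4475, -0.4475]
```

A sustained run of above-average sunspot numbers makes the running sum (and hence the calculated anomaly) climb, which is how the method produces the "rapid and sustained global energy rise starting in about 1941."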

Measured temperature anomaly α-trends oscillate above and below the temperature anomaly trend calculated using only the sunspot number time-integral. The existence of ocean oscillations, especially the Pacific Decadal Oscillation, led to the perception that there must be an effective net surface temperature oscillation with all named and unnamed ocean oscillations as participants. Plots of measured average global temperatures indicate that the net surface temperature oscillation must have a period of 64 years with the most recent maximum in 2005.

Combining the two effects shows that the ocean-surface-oscillation (α-trend) decline of 1941-1973 was slightly stronger than the rapid rise from sunspots (β-trend), resulting in a slight decline in the trend of reported average global temperatures. The steep rise of 1973-2005 occurred because the two effects added. A high coefficient of determination, R2, demonstrates that the hypothesis is true.

Several refinements to this work slightly improved the accuracy and led to the equations and figures in this paper.

Prior work

The law of conservation of energy is applied as described in Reference 1 in the development of the equations that calculate temperature anomalies.

Change to the level of atmospheric carbon dioxide has no significant effect on average global temperature. This was demonstrated in 2008 at Reference 6 and is corroborated at Reference 2.

Global Warming ended more than a decade ago as shown in Reference 4 and Reference 2.

Average global temperature is very sensitive to cloud change as shown in Reference 5.

The parameter for average sunspot number was 43.97 (average 1850-1940) in Ref. 1, 42 (average 1895-1940) in Ref. 9, and 40 (average 1610-2012) in Ref. 10. It is set at 34 (average 1610-1940) in this paper. The procession of values for average sunspot number produces slight but steady improvement in R2 for the period of measured temperatures and progressively greater credibility of average global temperature estimates for the period prior to direct measurements becoming available. A graph of R2 vs. average sunspot number indicates that further lowering of the number would not significantly increase R2 and might even reduce it.

It is axiomatic that change to the average temperature trend of the planet is due to change from break-even to the net energy retained by the planet.

Table 1 in Reference 2 shows the influence of atmospheric CO2 to be insignificant (a tiny change in R2 whether or not CO2 is considered), so it can be removed from the equation by setting coefficient ‘C’ to zero. With ‘C’ set to zero, Equation 1 in Reference 2 calculates average global temperature anomalies (AGT) since 1895 with 89.82% accuracy (R2 = 0.898220).

The current analysis determined that 34, the approximate average of sunspot numbers from 1610-1940, provides a slightly better fit to the measured temperature data than did 43.97 and other values (Refs. 9, 10). The approximate AGT during 1610-1940 is 286.2 K. With these refinements to Equation (1) in Reference 1 the coefficients become A = 0.3596, B = 0.003503 and D = −0.4475. R2 increases slightly to 0.904839 and the calculated anomaly in 2005 is 0.5046 K. Also with these refinements the equation calculates lower early anomalies and projects slightly higher future anomalies. The excellent match of the up and down trends since before 1900 of calculated and measured anomalies, shown here in Figure 1, corroborates the usefulness and validity of the calculations.

Projections until 2020 use the expected sunspot number trend for the remainder of solar cycle 24 as provided by NASA (Ref. 11). After 2020 the limiting cases are either assuming sunspots like those from 1925 to 1941, or the case of no sunspots, which is similar to the Maunder Minimum.

Some noteworthy volcanos and the year they occurred are also shown on Figure 1. No consistent AGT response is observed to be associated with these. Any global temperature perturbation that might have been caused by volcanos of this size is lost in the temperature measurement uncertainty. Much larger volcanoes can cause significant temporary global cooling from the added reflectivity of aerosols and airborne particulates. The Tambora eruption, which started on April 10, 1815 and continued to erupt for at least 6 months, was approximately ten times the magnitude of the next largest in recorded history and led to 1816 which has been referred to as ‘the year without a summer’. The cooling effect of that volcano exacerbated the already cool temperatures associated with the Dalton Minimum.


Figure 1: Measured average global temperature anomalies with calculated prior and future trends using 34 as the average daily sunspot number.

As discussed in Reference 2, ocean oscillations produce oscillations of the surface temperature with no significant change to the average temperature of the bulk volume of water involved. The effect on AGT of the full range of surface temperature oscillation is given by the coefficient ‘A’.

The influence of ocean surface temperature oscillations can be removed from the equation by setting ‘A’ to zero. To use all regularly recorded sunspot numbers, the integration starts in 1610. The offset ‘D’ must be changed to −0.2223 to account for the different integration start point and for setting ‘A’ to zero. Setting ‘A’ to zero requires that the anomaly in 2005 be 0.5046 − 0.3596/2 = 0.3248 K. The result, Equation (1) here, then calculates the trend 1610-2012 resulting from just the sunspot number time-integral.
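The adjustment described here is simple arithmetic: removing the ocean-oscillation term (coefficient ‘A’) lowers the 2005 anomaly by half of A's full range, since the oscillation is centered on zero:

```python
A = 0.3596             # full range of the ocean-oscillation effect, K
anomaly_2005 = 0.5046  # calculated 2005 anomaly with the oscillation included, K

# Dropping the oscillation removes half of its full range:
anomaly_without_oscillation = anomaly_2005 - A / 2
print(round(anomaly_without_oscillation, 4))  # → 0.3248
```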

The value −0.2223 is merely an offset that shifts the calculated trajectory vertically on the graph, without changing its shape, so that the calculated temperature anomaly in 2005 is 0.3248 K, the calculated anomaly for 2005 when the ocean oscillation is not included.

Figure 2: Anomaly trend from just the sunspot number time-integral using Equation (1).

Combined Sunspot Effect and Ocean Oscillation Effect

Average global temperatures were not directly measured in 1610 (thermometers had not yet been invented), nor can they be estimated very accurately for that time using proxies. The anomaly trend that Equation (1) calculates for that time is roughly consistent with other estimates but cannot be verified. Also, there is no way to determine for sure how much, and in which direction, the ocean surface temperature cycles would influence the values.

As a possibility, the period and amplitude of the oscillations attributed to ocean cycles, demonstrated to be valid after 1895, are assumed to hold back to 1610. Equation (1) is modified as shown in Equation (2) to account for the effects of ocean oscillations. Since the expression for the oscillations calculates values from zero to the full range, but the oscillations must be centered on zero, it must be reduced by half the oscillation range.

(2)

The ocean oscillation factor, (0.3596,y) – 0.1798, is applied prior to the start of temperature measurements as a possibility.

Applying Equation (2) to the sunspot numbers from Figure 2 of Reference 1 produces the trend shown in Figure 3 next below. Available measured average global temperatures from Reference 3 are superimposed on the calculated values.

Figure 3: Trend from the sunspot number time-integral plus ocean oscillation using Equation (2) with superimposed available measured data.

Figure 3 shows that temperature anomalies calculated using Equation (2) estimate possible trends since 1610 and actual trends of reported temperatures since they have been accurately measured worldwide. The match from 1895 on has R2 = 0.9048, which means that 90.48% of the variance in measured average global temperature anomalies is explained. All factors not explicitly considered must find room in the unexplained 9.52%. Note that a coefficient of determination of R2 = 0.9048 corresponds to a correlation coefficient of 0.95.
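The closing remark is just the square-root relationship between the coefficient of determination and the correlation coefficient:

```python
import math

r_squared = 0.9048
r = math.sqrt(r_squared)   # correlation coefficient is the square root of R^2
print(round(r, 2))         # → 0.95
```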

Calculated anomalies look reasonable back to 1700 but indicate higher temperatures prior to that than most proxy estimates. They qualitatively agree with Vostok, Antarctica ice core data but decidedly differ from Sargasso Sea estimates during that time (see the graph for the last 1000 years in Reference 6). Credible, accurate assessments of average global temperature that far back were not found. Perhaps solar output was a bit lower for a period prior to 1700, which would allow lower average global temperatures in spite of more sunspots. Ocean oscillations might also have been different from what was assumed.

Other assessments

Other assessments are discussed in Reference 1.

Conclusions

Others who have looked at only the amplitude or only the time factors of solar cycles got poor correlations with average global temperature. The good correlation comes from combining the two, which is what the time-integral of sunspot numbers does. As shown in Figure 2, the anomaly trend determined using the sunspot number time-integral has experienced substantial change over the recorded period. Prediction of future sunspot numbers more than a decade or so into the future has not yet been confidently done, although assessments using planetary synodic periods appear to be relevant (Refs. 7, 8).

If the temperature of the bulk volume of water participating in the ocean oscillation is used in place of the surface temperature of the water, the time-integral of sunspot numbers alone appears to correlate with the estimated true average global temperature trend after approximately 1700.

The net effect of ocean oscillations is to cause the surface temperature trend to oscillate above and below the trend calculated using only the sunspot number time-integral. Equation (2) accounts for both and also, since it matches measurements so well, shows that rational change to the level of atmospheric carbon dioxide can have no significant influence.

A paper published today in Atmospheric Science Letters effectively rules out volcanoes as the main cause of the 'pause' in global warming over the past 17 years. According to the authors, "We deduce a global mean cooling of around −0.02 to −0.03 K over the period 2008–2012. Thus while these eruptions do cause a cooling of the Earth and may therefore contribute to the slow-down in global warming, they do not appear to be the sole or primary cause."

The authors find volcanic eruptions were responsible for only ~0.025°C of cooling over the five-year period 2008-2012, a rate of 0.005°C per year. By contrast, the globe warmed at a rate of 0.017°C per year from 1979-2000, before the 'pause' in global warming, a rate 3.4 times higher than the cooling effect of volcanic eruptions calculated in this new paper. In addition, volcanic activity was much greater before the 'pause' than after.
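The 3.4x figure follows directly from the two rates:

```python
warming_rate = 0.017               # K/yr, 1979-2000 trend
volcanic_cooling_rate = 0.025 / 5  # K/yr: ~0.025 K spread over 2008-2012

ratio = warming_rate / volcanic_cooling_rate
print(round(ratio, 1))  # → 3.4
```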

Thus, volcanic activity can effectively be ruled out as an IPCC excuse for no global warming over the past 17 years.

IPCC AR5: ‘There is medium confidence that this difference between models and observations is to a substantial degree caused by unpredictable climate variability, with possible contributions from inadequacies in the solar, volcanic, and aerosol forcings used by the models and, in some models, from too strong a response to increasing greenhouse-gas forcing.’

What this means: The IPCC knows the pause is real, but has no idea what is causing it. It could be natural climate variability, the sun, volcanoes – and, crucially, the computers may have been allowed to give too much weight to the effect carbon dioxide emissions (greenhouse gases) have on temperature change.

The slow-down in global warming over the last decade has led to significant debate about whether the causes are of natural or anthropogenic origin. Using an ensemble of HadGEM2-ES coupled climate model simulations we investigate the impact of overlooked modest volcanic eruptions. We deduce a global mean cooling of around −0.02 to −0.03 K over the period 2008–2012. Thus while these eruptions do cause a cooling of the Earth and may therefore contribute to the slow-down in global warming, they do not appear to be the sole or primary cause.

Friday, November 22, 2013

In 2012, the mainstream media breathlessly reported that a brief 4-day surface melt over the Greenland ice sheet represented evidence of man-made global warming and an impending rise in sea levels due to melting and sliding of the Greenland ice sheet. However, a new paper published in PNAS finds the "Greenland ice sheet motion is insensitive to exceptional meltwater forcing." The paper adds to other peer-reviewed publications demonstrating that Greenland is resistant to thaw and that sea level rise projections are greatly exaggerated.

Nov. 22, 2013 — Predictions of sea level rise could become more accurate, thanks to new insight into how glacier movement is affected by melting ice in summer.

Studies of the Greenland ice sheet, including during a record warm summer, are helping scientists better understand how summer conditions affect its flow. This is important for predicting the future contribution made by melting glaciers to sea level rise.

Ice flows slowly from the centre of the Greenland Ice Sheet towards its margins, where it eventually melts or calves into the ocean as icebergs. Knowing how fast this movement occurs is essential for predicting the contribution of the ice sheet to sea level rise.

In summer, ice from the surface of a glacier melts and drains to the bed of the ice sheet, initially raising water pressure at the base and enabling the glacier to slide more quickly. Researchers found that it can, at times, move more than twice as fast in summer as in winter.

In 2012, an exceptionally warm summer caused the Greenland Ice Sheet to undergo unprecedented rates of melting. However, researchers have found that fast summer ice flow caused by significant melting is cancelled out by slower motion the following winter.

Scientists found that this is because large drainage channels, formed beneath the ice by the meltwater, helped to lower the water pressure, ultimately reducing the sliding speed.

The discovery suggests that movement in the parts of the ice sheet that terminate on land is insensitive to surface melt rates. It improves scientists' understanding of how the ice sheet behaves and curbs error in estimating its contribution to sea level rise in a warming world.

Scientists led by the University of Edinburgh gathered detailed GPS ice flow data and ice surface melt rates along a 115 km transect in west Greenland and compared ice motion from an average melt year, 2009, with the exceptionally warm year of 2012.

The study, carried out in collaboration with the Universities of Sheffield, Aberdeen, Tasmania and Newcastle, was published in Proceedings of the National Academy of Sciences and supported by the Natural Environment Research Council.

Professor Peter Nienow of the University of Edinburgh's School of GeoSciences, who led the study, said: "Although the record summer melt did not intensify ice motion, warmer summers will still lead to more rapid melting of the ice sheet. Furthermore, it is important that we continue to investigate how glaciers that end in the ocean are responding to climate change."

Edited by Mark H. Thiemens, University of California at San Diego, La Jolla, CA, and approved October 23, 2013 (received for review August 24, 2013)

Significance

During summer, meltwater generated on the Greenland ice sheet surface accesses the ice sheet bed, lubricating basal motion and resulting in periods of faster ice flow. However, the net impact of varying meltwater volumes upon seasonal and annual ice flow, and thus sea level rise, remains unclear. In 2012, despite record ice sheet runoff, including two extreme melt events, ice at a land-terminating margin flowed more slowly than in the average melt year of 2009, due principally to slower winter flow following faster summer flow. Our findings suggest that annual motion of land-terminating margins of the ice sheet, and thus the projected dynamic contribution of these margins to sea level rise, is insensitive to melt volumes commensurate with temperature projections for 2100.

WARSAW, Nov 21 (Reuters) - Governments are shying away from their own warnings that the world has only a fast-shrinking budget of carbon emissions left to use to avoid damaging global warming, frightened off at U.N. climate talks by the radical cuts it would require.

To keep warming to a level that most have defined as manageable, developed nations would have to roughly halve their greenhouse gas emissions by 2030 from 2010 levels, far beyond most governments' plans, said Niklas Hoehne of research group Ecofys.

The global action required, by both rich and poor nations, "is very, very ambitious" compared to current pledges, Hoehne said.

Just two months ago, 110 governments endorsed findings by the U.N.'s Intergovernmental Panel on Climate Change (IPCC) in Stockholm saying that the world had emitted about 515 billion tonnes of carbon since the Industrial Revolution.

The IPCC estimated the total cumulative budget could not exceed a trillion tonnes to allow a good chance of keeping a rise in temperatures to 2 degrees Celsius (3.6 Fahrenheit) in order to avoid ever more frequent and intense heatwaves, floods, droughts and rising sea levels.[false]

Once in the atmosphere, many greenhouse gases remain there for decades or even centuries [false], and on current trends of rising emissions, the trillion tonnes will be reached in a few decades.
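The budget arithmetic in the article can be sketched in a few lines. The cumulative cap (~1 trillion tonnes of carbon) and the amount emitted to date (~515 billion tonnes) come from the text; the annual emission rate of ~10 billion tonnes of carbon per year is an assumed round figure for the early 2010s, used here only to illustrate why "a few decades" follows.

```python
# Back-of-the-envelope check on the IPCC carbon budget figures quoted above.
# BUDGET and EMITTED are from the article; RATE is an assumed approximate
# global emission rate (~10 GtC/yr in the early 2010s), not from the text.

BUDGET_GTC = 1000      # total cumulative budget, billion tonnes of carbon (GtC)
EMITTED_GTC = 515      # emitted since the Industrial Revolution, GtC
RATE_GTC_PER_YR = 10   # assumed current emission rate, GtC per year

remaining = BUDGET_GTC - EMITTED_GTC          # 485 GtC left in the budget
years_left = remaining / RATE_GTC_PER_YR      # decades remaining at a flat rate

print(f"Remaining budget: {remaining} GtC")
print(f"Exhausted in about {years_left:.1f} years at a constant rate")
```

At a constant rate this gives roughly 48 years; since emissions were still rising when the article was written, the actual figure would be shorter, which is consistent with the article's "a few decades."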

But now, at the first meeting since Stockholm working on the outlines of a U.N. deal to slow global warming due to be agreed in 2015, the budget has barely been mentioned.

Myles Allen, of Oxford University, said the implications were just too radical for governments already struggling to cut emissions.

MORE RADICAL MEASURES

Working towards the budget would require a shift towards radical ways to suck carbon dioxide from the air, ensuring that carbon emitted by power plants was captured and stored, and setting out to leave oil and gas in the ground, he said.

But many developed nations are more focused on spurring their sluggish economic growth than on fighting climate change.

And whatever sense of urgency they may have had has been alleviated by the side effects of financial crisis and economic slowdown in the West - which have helped to brake a rise in emissions - and by a probably short-lived slowdown in the pace of warming this century.

A related idea by developing nations to ask the IPCC to examine historical responsibility for causing global warming, as a guide to future action in sharing out emissions, is also a minefield at the Warsaw talks.

Developing nations see it as a way to underline the fact that the rich have burnt most fossil fuels since the Industrial Revolution. Rich nations say it would take too long to figure out, and that blame is constantly shifting.

The proposal also raises awkward political questions.

Is Britain, for instance, responsible for India's emissions before independence in 1947? Should heat-trapping methane from rice paddies in China be measured along with Europe's industrial emissions in the 19th century?

Robert Stavins, director of the Harvard Environmental Economics Program, said it would be disastrous to try to apportion historical blame. He also said the IPCC's carbon budget had not been intended as a policy guide.

But these are anyway areas where the Warsaw conference is not going to go.

"The mood here is 'don't tread on anybody's toes'," said Hans Joachim Schellnhuber, founding director of the Potsdam Institute for Climate Impact Research.

"But I am sure that on the Titanic, many people were treading on each other's toes."