Tuesday, January 31, 2012

Another tenet of AGW theory bites the dust in the face of real-world data: AGW theory proposes that increased CO2 levels lead to increased water vapor in the atmosphere (despite empirical data showing the opposite) and therefore supposedly lead to increased rainfall in most regions. A paper published today in the February 2012 edition of Nature Climate Change studies rainfall over the Indian subcontinent from 1813 to 2006 and finds that rainfall has decreased since the 1930s, even as CO2 emissions markedly increased with industrialization. The data instead show a natural, cyclical variability in mean annual rainfall that peaked in the 1870s and 1930s, with absolutely no correlation to levels of CO2.

Moving 30-year average of mean annual rainfall over the Indian subcontinent shows no correlation to CO2 levels
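The smoothing behind the figure can be sketched in a few lines. This is a toy illustration on synthetic data (the cyclical rainfall shape and the CO2 curve below are invented for demonstration, not the paper's series):

```python
import numpy as np

def moving_average(x, window=30):
    """Centered moving average; output has len(x) - window + 1 points."""
    kernel = np.ones(window) / window
    return np.convolve(x, kernel, mode="valid")

# Synthetic stand-ins for the study period 1813-2006: a roughly 60-year
# rainfall cycle (peaks near the 1870s and 1930s) and steadily rising CO2.
years = np.arange(1813, 2007)
rainfall = 1000 + 50 * np.sin(2 * np.pi * (years - 1855) / 60)
co2 = 280 + 0.004 * (years - 1813) ** 2

smoothed = moving_average(rainfall)   # 30-year mean rainfall
co2_smoothed = moving_average(co2)    # CO2 averaged over the same windows

# Pearson correlation between the two smoothed series
r = np.corrcoef(smoothed, co2_smoothed)[0, 1]
print(f"correlation of 30-year mean rainfall with CO2: r = {r:.2f}")
```

A near-zero correlation on series like these is what the paper's figure suggests; the real test, of course, uses the observed rainfall record.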

Atmospheric water vapor has declined in direct opposition to AGW theory

Now this is what we mean by a command economy. In a free market economy
market forces --- the hundreds of millions of decisions made on a daily basis by
hundreds of millions of consumers --- determine market winners and losers and
the direction of innovation and business expansion. Do you remember Microsoft
locked in battle with some other computer technology giant for domination of the
market for operating systems and Internet surfing? Frankly, I can’t even
remember the name of Microsoft’s principal rival right now ... they’re pretty
much gone. Welcome OS X Lion and Firefox. Where did they come from? Free market
innovation.

Now under Obama’s economic game plan things are completely different. The
government leads and the consumers follow. Obama has decided that the future of
energy is “green,” as they say. Now maybe Obama is motivated to move in this
direction by massive campaign contributions. We do know he isn’t motivated by
market forces ... it’s his intention to replace market forces with political
forces. So ... how’s that working out for us?
Here’s how ....

In just the past week we’ve learned that three so-called “green energy”
companies have either gone out of business or are laying off massive numbers of
employees.

Evergreen Energy: This company received $5.3 million in government
“stimulus.” That’s either your money or money we had to borrow from China.
Evergreen has filed for bankruptcy.

Ener1: They manufacture batteries for electric vehicles. Bankrupt.
Millions more in “stimulus” money.

Amonix, Inc.: They make solar panels. $5.9 million of your money --- now laying
off two-thirds of its workforce, about 200 people.

And of course you remember Solyndra. About a half-billion shot to hell
there.

That’s the command economy for you ... and that’s what you’ll get more of with
Obama in office. Government is so much smarter than the consumers. Let’s see
how that works out for us.

Climate: Global warming alarmists won't give up their
campaign to spread fear and backward thinking until an ice bridge stretches from
New York to Paris. Science, though, says they should.

Al Gore, who invented global warming hysteria, has most recently been found
planning a trip to Antarctica where he will surely find evidence that man is
overheating the planet.

This clearly insecure man who so desperately needs an audience that approves
of his world-saving efforts says he will be taking with him "a large number of
civic and business leaders, activists and concerned citizens from many
countries."

He expects them "to see firsthand and in real time how the climate crisis is
unfolding in Antarctica."

For Gore's reading material on this trip, we suggest he look at some data
released by Great Britain's Met Office. He would find himself meeting head-on a
terribly inconvenient truth.

According to the data, there's been no warming for more than a decade. The
global temperature that Gore and the rest of the alarmist tribe are so concerned
about was about one full degree cooler (as measured in Celsius) last year than
it was when temperatures peaked in 1997.

Of course 2012 could be warmer than 2011 just as 2010 was warmer than 2008
and 2009.

Or it could be cooler. Who knows?

Our space program thinks it does. NASA physicist David Hathaway believes the
next solar cycle, Cycle 25, "could be one of the weakest in
centuries."

The Daily Mail, which, unlike America's mainstream media, isn't afraid to
report news that goes against the global warming narrative, says the British
government agrees with that assessment.

The Mail says a Met Office research paper notes that "there is a 92% chance
that both Cycle 25 and those taking place in the following decades will be as
weak as, or weaker than, the 'Dalton minimum' of 1790 to 1830."

But "it is also possible," continues the Mail, "that the new solar energy
slump could be as deep as the 'Maunder minimum,'" which occurred "between 1645
and 1715 in the coldest part of the 'Little Ice Age' when, as well as the Thames
frost fairs, the canals of Holland froze solid."

OK, so frozen Dutch waterways are not the same as an ice bridge linking Fifth
Avenue to Avenue des Champs-Elysees.

But predictions that a man-made global warming catastrophe is imminent look
foolish in light of the data and the solar cycle forecasts.

In fact, they've looked foolish for quite some time.

The warmer temperatures the alarmists were predicting decades ago have never
arrived. Nearly five years back, Kevin Trenberth, a climate scientist who
believes in global warming, had to admit that "none of the climate states in the
models correspond even remotely to the current observed climate."

We have no models, but even without one we think we can safely predict that
the alarmist community and its sphere of influence will continue to shrink.

Thursday, January 26, 2012

The U.S. Department of Agriculture released a new map of the nation's growing zones Wednesday that confirms what many gardeners and farmers already know: Winters are warmer.

The map—widely used as a guide to what areas are suitable for a plant—is divided into zones based on the average lowest winter temperature, in 10-degree increments. Many of the areas that shifted into a warmer zone are in the Northeast, and many were on the edge of a warmer zone in the previous map, released in 1990, said Kim Kaplan, an Agriculture Department spokeswoman.
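The zone arithmetic the article describes, average lowest winter temperature bucketed in 10-degree bands, can be sketched as a small lookup. The band edges below are an assumption based on the published map's ranges, with Zone 1 below -50°F and Zone 13 above 60°F:

```python
def hardiness_zone(avg_annual_min_f: float) -> int:
    """Map an average annual minimum temperature (degrees F) to a
    USDA-style zone number: 10-degree bands, clipped to zones 1-13."""
    zone = int((avg_annual_min_f + 60) // 10) + 1
    return max(1, min(13, zone))

print(hardiness_zone(-5))   # a -5 F average low falls in Zone 6
print(hardiness_zone(65))   # warm-climate sites clip to Zone 13
```

The real map further splits each zone into 5-degree "a" and "b" half zones, which this sketch omits.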

Large cities in particular are having warmer winters, which the department said may be due to more "heat islands" created by asphalt and concrete. Much of Boston and New York is one zone warmer in the new map.

The Agriculture Department made clear that it doesn't ascribe the trend to climate change. The 30 years of weather data used to create the map weren't sufficient to smooth out weather cycles and determine if there is any underlying climate change, Ms. Kaplan said.

In addition, the agency said the methodology used to build the new map was more sophisticated than that for the 1990 version, so the maps aren't directly comparable. The new map relies on data from 8,000 weather stations and also takes into account topography, prevailing wind, elevation, proximity to large water bodies and other factors not used to create the 1990 map.

"The map is simply not a good instrument to demonstrate [climate change]," Ms. Kaplan said in a news conference.

David Wolfe, a Cornell University professor who studies climate change, said he "would not be so cautious as they were in their statements." In isolation, the new map "doesn't prove climate change" but when combined with other observations, including shifts in animal migration patterns, changes in snow cover and other temperature readings, it "corroborates evidence" of such a change, he said.

In 2003, an updated growing-zone map sponsored by the Agriculture Department similarly showed winters getting warmer. The agency pulled back the map amid a debate about whether it reflected climate change; the agency denied it did. On Wednesday, Ms. Kaplan said the map was rejected because it used outdated methodology and wasn't suited for Web access.

The map, which is consulted by gardeners, plant wholesalers and farmers as well as crop insurers and scientists, is based on the average annual coldest temperature in each of 13 zones—up from 11 zones in the 1990 map due to the addition of several warm-climate zones. The horticulture industry rates plants according to what zone they will survive in over winter.

The map, at www.planthardiness.ars.usda.gov, allows users to find the zone for an area representing as little as half a mile, a much finer gradation than that available on the 1990 map.

Warmer weather already is affecting agriculture. Cotton, traditionally a Southern crop, is moving north into Kansas. The Midwest corn-growing region has expanded north and west into South Dakota and North Dakota, and even into Manitoba, displacing less profitable crops such as wheat. The shift is boosting seed companies such as Monsanto Co. and the Pioneer Hi-Bred unit of DuPont Co. because it allows them to sell more corn seed, their most profitable type.

Tuesday, January 24, 2012

The president is trapped by his own rhetoric amid America's energy boom.

By HOLMAN W. JENKINS, JR.

Barack Obama may believe a lot of things, but he probably doesn't believe the Sierra Club is key to his re-election. His decision to nix the Keystone XL pipeline will cost him votes but he did it anyway.

We'll admit that Mr. Obama's global warming talk has often seemed to us perfunctory. Perhaps we mistook his lack of heat for a lack of conviction. He just released his first 2012 campaign ad and it's a paean to green energy. Maybe he's no less a believer than Al Gore, for all the problems this might seem to pose for what we thought we knew about our president.

For one thing, he's not given to unrealistic goals. He knows China and India are opening a new coal plant every week. He knows the huge amounts of fossil energy lying at humanity's feet won't be abandoned just because an American president says so. He can't fail to notice that Canada's oil sands won't remain undeveloped; the oil will go to the Far East.

Mr. Obama also seems enough of a free thinker to entertain the possibility at least that global warming theory may be wrong. In a telling exchange with interviewer Charlie Rose a few years ago, Al Gore was asked to describe the evidence of man's role in climate change. Each time Mr. Gore recurred to some version of a "consensus of scientists" or "the most respected scientists whose judgment I think is the best."

The truth is, the theory may be popular, but the evidence has thus far eluded the tens of billions spent on climate science. The temperature data are so noisy that they reveal no pattern connecting rising CO2 in the industrial age with temperature trends. Some say because CO2 is a "greenhouse" gas, shut up, case closed. But the known relationship between carbon and climate doesn't actually indicate a big reason to worry.

To produce worrisome scenarios, climate models must posit "feedbacks" that magnify the impact of CO2 by 300% to 500%. A cynic notices that these models became especially popular in the '90s, when measured warming exceeded what could be attributed to CO2, so new fudge was needed to preserve CO2 as the culprit.
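The arithmetic behind that claim is simple to lay out. The roughly 1.1°C of direct warming per CO2 doubling used below is a standard no-feedback textbook figure, not a number from the column, and "300% to 500%" is read here as a 3x-5x multiplier:

```python
# Direct (no-feedback) warming per CO2 doubling: ~1.1 C (assumed figure)
no_feedback_c = 1.1

# The quoted feedback magnifications turn that into the model range
for mag in (3.0, 5.0):
    total = no_feedback_c * mag
    print(f"{mag:.0f}x feedback -> {total:.2f} C per doubling")
```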

Mr. Gore is not smart (no matter what the Nobel committee thinks), whereas Mr. Obama is, and all these things have likely occurred to him. But he's also a political operator and an acolyte of radical theorist Saul Alinsky. He understands politics as a matter of power, and democratic politics as a matter of powerful coalitions cultivated and maintained with self-interest (aka money, money, money).

Oil, in Mr. Obama's world, is a "Republican" interest group; anything that's good for the oil industry is bad for the alternate power structure he's been trying to build with handouts and mandates for green energy.

Mr. Obama's relationship with global warming may indeed be perfunctory, but he understands the necessity of shibboleths to rationalize and justify the "investments" he's dishing out to manufacture a support base whose need for subsidies and regulatory favors jibes with the Democratic Party's need for donations. Oil sands are the "dirtiest" fossil energy, requiring great releases of CO2. To approve Keystone, then, not only would undermine his side's crucial shibboleths. It would compromise his own credibility as a leader who can be trusted to deny advantage to "Republican" industries and deliver it to "Democratic" ones.

Not for nothing did Canadian Resources Minister Joe Oliver, after Mr. Obama's Keystone decision, gripe about the influence of "billionaire socialists from the United States." Not for nothing did Mr. Obama's own supporters crow about Mr. Obama's ruling as a triumph over the industrialist Koch brothers, an allusion to whom even opens the new Obama campaign spot.

Presidents make traps for themselves: Signature initiatives cannot fail; they can only be doubled down on, as Mr. Obama was expected to do in Tuesday's State of the Union even as he also tried to make peace with the natural-gas fracking boom. Only fresh waves of rhetoric praising electric cars will suffice when taxpayers are figuring out that Obama policy has them subsidizing electric playthings for the affluent. Solyndra must be defended all the more fiercely now that solar is collapsing globally as countries repent of foolish subsidies. Green energy must be hugged to Mr. Obama's breast all the more tightly as the shale revolution renders hopeless any chance of wind and solar becoming cost-competitive with fossil fuels.

Mr. Obama is engaged in a "long game," says Andrew Sullivan, writing in Newsweek, making a point that no one doubted. But there's a difference between playing the long game and playing it well. The Obama long game is exactly how green energy metamorphosed from a policy notion into a political strategy and then into a dead weight his campaign must lug to November.

Still, let us admire the high-rolling political risk Mr. Obama takes in spurning affordable, strategically convenient energy from Canada. That risk includes, between now and Election Day, looking like a chump if oil prices surge because of the world's vulnerability to the narrowness of the Strait of Hormuz.

Monday, January 23, 2012

Federal forecasters are expected to confirm on Monday what the energy industry already knows: Oil production is surging in the U.S.

The U.S. Energy Information Administration is likely to raise by a substantial amount its existing estimate that U.S. oil production will grow by 550,000 barrels per day by 2020, to just over six million barrels daily.

The forecast will include new production data from developing oil fields, including the Bakken shale area in North Dakota, which could hold as much as 4.3 billion barrels of recoverable oil. North Dakota's output of oil and related liquids topped 500,000 barrels per day in November, meaning that the state pumped more oil than Ecuador. In fact, U.S. oil production grew faster than in any other country over the last three years and will continue to surge as drillers move away from natural gas due to a growing gas glut, experts say. The glut has sent natural-gas prices to a 10-year low.

The combination of techniques that fueled the recent rise in natural-gas production—horizontal drilling and hydraulic fracturing, or "fracking"—has been expanded to U.S. oil fields.

This rising tide of oil and related liquids such as condensate that also are used as fuel could reduce U.S. dependence on oil imports and help ease the country's trade deficit. But it may have limited impact on U.S. gasoline prices, which increasingly are set by global supply-and-demand trends.

The increased domestic production also isn't enough to help the U.S. achieve the elusive ideal of energy independence—the country is expected to consume more than 19 million barrels of oil and liquids a day by 2020.

From 2008 through 2011, U.S. production of a broader category of oil and related liquids grew by 1.3 million barrels per day, or more than 17 percent, to 8.9 million barrels, according to the research firm IHS-CERA. That outpaced Russia, which saw production grow about 480,000 barrels per day; China, where it grew about 380,000 barrels per day; and Brazil, where output was up by more than 340,000 barrels daily.

IHS-CERA predicts that U.S. production could grow by another 1.3 million barrels per day by 2020, to 10.2 million barrels.

"I don't think it's widely appreciated how dramatic it's been," Jim Burkhard, managing director of IHS CERA's Global Oil Group, said of U.S. growth. "Deep-water production has contributed to the growth in recent years, and more biofuels has helped, but the really dramatic improvement has been in onshore oil and liquids—and that is what will continue to drive growth in coming years."

The surge is a big reversal from just a few years ago. U.S. production of oil and other liquids peaked at 11.3 million barrels a day in 1970 and began to decline. The decline bottomed out at 7.6 million barrels a day in 2008 as the new drilling techniques emerged.

Thursday, January 19, 2012

This story appeared in the January 16, 2012 issue of Forbes Magazine.
Photos by Chris Leschinsky/Getty Images for Forbes.

By Todd Woody

Drive out of California’s smoggy San Joaquin Valley, past
the oil rigs planted helter-skelter in citrus groves, climb into the Tehachapi
Mountains, and the future suddenly comes into view. Hundreds of gleaming white
wind turbines generating carbon-free electricity carpet chaparral-covered ridges
and march down into the valleys of Joshua trees that lead to the Mojave
Desert.

Here in Kern County, a bastion of Big Oil and Big Agriculture, green energy
has become big business. In the past 36 months the wind industry has attracted
$3.2 billion in investment to a region with an unemployment rate 64% higher than
the U.S. average. A multibillion-dollar transmission line under construction in
the Tehachapi will carry as much as 4,500 megawatts of renewable energy, most of
it from wind farms, to coastal cities. At peak output that’s the equivalent of
four or five big nuclear power plants and a linchpin of California’s mandate to
obtain a third of its electricity from renewable sources by 2020. With a
crucial federal tax credit set to expire at the end of 2012, developers are
racing to put steel into the ground and secure a spot on the wire.

“The hotels are now full, the people who work in the restaurants now have
someone to wait on,” says Lorelei Oviatt, Kern County’s planning director in Bakersfield, the
honky-tonk hometown of Buck Owens and Merle Haggard. “If you were laying
concrete for a house, now you’re laying concrete for a turbine.”

A shadow, however, is falling on the Tehachapi, cast by the
nine-and-a-half-foot wingspan of a Pleistocene-born bird of uncommon
intelligence and longevity. With the investment of tens of millions of dollars
and extraordinary effort by scientists, North America’s largest bird, the
California condor, is staging a spectacular comeback after verging on extinction
25 years ago. The 200 birds in the wild today (out of 400 total) are rapidly
reinhabiting their historic range in one of the nation’s great achievements of
conservation biology. Naturalists can once again marvel at a bird that
manipulates hot winds to soar hundreds of miles without flapping its wings.

It’s a flight path that is taking the condor perilously closer to the
spinning blades of Tehachapi wind turbines that depend on those same thermal
currents to generate power; biologists fear it’s only a matter of time before
the condor begins hitting the 500-foot-high machines. A single death could be
catastrophic for the wind industry, the regional economy and, not least, the
condor. The loss of an alpha bird could disrupt breeding patterns and an
intricate avian hierarchy, according to biologists. “It would be a major
disaster,” says Mark Tholke, an executive with wind developer enXco, which is
building several projects in the Tehachapi.

Under the federal and California endangered species acts, it’s illegal for
anyone to kill a condor without first securing a permit to do so. Given that the
government has not issued such an “incidental take” permit and has no intention
of doing so, if a turbine kills a condor, the operator could be charged
criminally. Environmentalists could also ask a judge to shut down a wind farm
where a condor died. “If we as an industry don’t come up with a plan that is
clear and reliable,” says Tholke, “the uncertainty is going to drive some
investors away and drive up the cost of renewable energy.”

Already, state regulators have scuttled a huge Pacific Gas & Electric
wind project in part because of the financial risks of a potential condor-caused
cut to electricity production. Last June the Tehachapi’s biggest developer,
Terra-Gen Power, abruptly pulled a planned 411-megawatt farm after Oviatt says
she told executives that condor concerns and opposition from local residents
would likely doom the project. Then in October the Sierra Club and two other
environmental groups sued Kern County over its approval of a 300-megawatt NextEra Energy
Resources wind farm that state and federal officials warn poses a high risk to
condors.

Wednesday, January 18, 2012

A paper published last week in the Journal of Climate attributes warming of the North Atlantic ocean surface temperatures in the mid-1990s to a prolonged positive phase of the North Atlantic Oscillation (NAO), a natural ocean oscillation with no relation to 'greenhouse gas' concentrations.

In the mid-1990s the subpolar gyre of the North Atlantic underwent a remarkable rapid warming, with sea surface temperatures increasing by around 1C in just 2 years. This rapid warming followed a prolonged positive phase of the North Atlantic Oscillation (NAO), but also coincided with an unusually negative NAO index in the winter of 1995/96. By comparing ocean analyses and carefully designed model experiments we show that this rapid warming can be understood as a delayed response to the prolonged positive phase of the NAO, and not simply an instantaneous response to the negative NAO index of 1995/96. Furthermore, we infer that the warming was partly caused by a surge, and subsequent decline, in the Meridional Overturning Circulation and northward heat transport of the Atlantic Ocean. Our results provide persuasive evidence of significant oceanic memory on multi-annual timescales, and are therefore encouraging for the prospects of developing skillful predictions.

Nearly four decades ago, British economist E.F. Schumacher stated the essence of environmental protection in three words: Small is beautiful. As Schumacher argued in his famous book by that title, man-made disturbances of the natural world—farms, for example, and power plants—should have the smallest possible footprints.
But how can that ideal be realized in a world that must produce more and more food and energy for its growing population?

The answer, in just one word, is density.
Over the course of the last century, human beings have found ways to concentrate crops and energy production within smaller and smaller areas, conserving land while meeting the ever-growing global demand for calories and watts. But this approach runs counter to the entrenched beliefs of many environmental activists and politicians, whose "organic" and "renewable" policies, as nature-friendly as they sound, squander land and other resources.
Food cultivation exemplifies the virtues of density. During the second half of the 20th century, hybrid seeds and synthetic fertilizers, along with better methods of planting and harvesting, produced stunning increases in agricultural productivity. Between the mid-1960s and mid-2000s, global production of all cereal crops doubled, according to U.N. data, even though the amount of cultivated acreage remained about the same.
Indur Goklany, a policy analyst for the U.S. Department of the Interior, estimates that if agriculture had remained at its early 1960s level of productivity, feeding the world's population in 1998 would have required nearly eight billion acres of farmland, instead of the 3.7 billion acres that were actually under cultivation. Where in the world—literally—would we have found an extra 4.3 billion acres, an area slightly smaller than South America?
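Goklany's estimate can be checked with back-of-envelope arithmetic using the numbers quoted above (an illustration of the quoted figures, not his actual calculation):

```python
acres_needed_1960s_yields = 8.0e9   # acres needed at early-1960s productivity
acres_actual_1998 = 3.7e9           # acres actually under cultivation

implied_yield_gain = acres_needed_1960s_yields / acres_actual_1998
extra_acres = acres_needed_1960s_yields - acres_actual_1998

print(f"implied productivity gain: {implied_yield_gain:.2f}x")
print(f"land spared: {extra_acres / 1e9:.1f} billion acres")
```

The roughly 2.2x implied gain squares with the U.N. figure of cereal output doubling on constant acreage, and the 4.3 billion spared acres is the "area slightly smaller than South America."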

Meanwhile, a recent analysis of U.S. Department of Agriculture data, by plant pathologist Steve Savage, found that land devoted to organic farming produces about 29% less corn and 38% less winter wheat than the same acreage conventionally farmed. Since world population is growing and food prices are already at near-record highs, mandates for organic farming could be disastrous. For example, low-density agriculture could increase deforestation as farmers desperately seek more farmland—a result that should disturb environmentalists.

Now consider biofuels, which are supposed to reduce carbon-dioxide emissions. The domestic biofuel craze began in 1976, when Amory Lovins, co-founder of the Rocky Mountain Institute and a darling of the greens, declared that "developments in the conversion of agricultural, forestry and urban wastes to methanol and other liquid and gaseous fuels now offer practical, economically interesting technologies sufficient to run an efficient U.S. transport sector."
Today, Mr. Lovins still promotes this mirage—and unfortunately so do many others, including Secretary of Energy Steven Chu. But a bit of elementary math shows that large-scale biofuels production is a fool's errand.
Assume you wanted to replace one-tenth of U.S. oil consumption with fuel derived from switch grass, a plant often mentioned during discussions of cellulosic ethanol. That would require cultivating some 37 million acres of land—an area roughly the size of Illinois—in nothing but switch grass.

The problem with biofuels is low power density, a term that refers to the amount of energy flow that can be harnessed from a given area, volume or mass. The power density of plants such as corn or switch grass is fractions of a watt per square meter. Some energy analysts estimate the power density of corn ethanol to be as low as 0.05 watts per square meter of farmland. By comparison, a relatively small natural-gas well that produces just 60,000 cubic feet of gas per day has a power density of 28 watts per square meter.
Wind turbines have a power density of about one watt per square meter. Compare that with the two nuclear reactors at Indian Point, which provide as much as 30% of New York City's electricity. Even if you include the entire footprint of the Indian Point project—about 250 acres—the site's power density exceeds 2,000 watts per square meter. To generate as much electricity as Indian Point does, you'd need to cover about 770 square miles of land with wind turbines, an area slightly smaller than Rhode Island.
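The Indian Point comparison can be reproduced from the power densities quoted above (unit conversions only; the 2,000 W/m² and 1 W/m² figures are taken from the text):

```python
ACRE_TO_M2 = 4046.86       # square meters per acre
SQMI_TO_M2 = 2.59e6        # square meters per square mile

site_area_m2 = 250 * ACRE_TO_M2        # Indian Point footprint
plant_output_w = 2000 * site_area_m2   # at ~2,000 W per square meter

# Area of 1 W/m^2 wind turbines needed to match that output
wind_area_sqmi = plant_output_w / 1.0 / SQMI_TO_M2

print(f"plant output: {plant_output_w / 1e9:.1f} GW")
print(f"equivalent wind area: {wind_area_sqmi:.0f} square miles")
```

That works out to roughly 2 GW and about 780 square miles, in line with the ~770 square miles cited.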

The virtues of density can also be seen in nuclear waste, a leading bugaboo of groups like Greenpeace and the Sierra Club. According to the Nuclear Energy Institute, an industry group, the American commercial nuclear-power industry, over its entire history, has produced about 62,000 tons of high-level waste. Stacked to a depth of about 20 feet, that would cover a single football field. Coal-fired power plants in the United States, by contrast, generate about 130 million tons of coal ash in a single year.
True, radioactive waste is toxic and long-lived, but it can be stored safely. France produces about 80% of its electricity from nuclear fission, and all of its high-level waste is stored in a single building about the size of a soccer field.
The greenness of density leads to two conclusions. First, those who make environmental policy should consider density a desirable goal in nearly all the issues that they confront. And second, the real environmentalists aren't the headline-seeking advocacy groups. They're the farmers, urban planners, agronomists—and yes, even natural-gas drillers and nuclear engineers.

Mr. Bryce is a senior fellow at the Manhattan Institute. This article is adapted from the Winter 2012 issue of City Journal.

Thursday, January 12, 2012

A paper published today in Geophysical Research Letters finds no significant change in Antarctic snowmelt over the entire 31 year period of satellite observations 1979-2010. The paper actually shows a declining trend in snowmelt over the past 31 years, although not statistically significant. Of note, the abstract states, "other than atmospheric processes likely determine long-term ice shelf stability." Translation: increased CO2 and other 'greenhouse gases' do not threaten stability of the Antarctic ice shelf.

Meltwater volume for the Antarctic continent (top graph) shows a declining (statistically insignificant) trend since satellite observations began in 1979.

Surface snowmelt is widespread in coastal Antarctica. Satellite-based microwave sensors have been observing melt area and duration for over three decades. However, these observations do not reveal the total volume of meltwater produced on the ice sheet. Here we present an Antarctic melt volume climatology for the period 1979–2010, obtained using a regional climate model equipped with realistic snow physics. We find that mean continent-wide meltwater volume (1979–2010) amounts to 89 Gt y−1 with large interannual variability (σ = 41 Gt y−1). Of this amount, 57 Gt y−1 (64%) is produced on the floating ice shelves extending from the grounded ice sheet, and 71 Gt y−1 in West-Antarctica, including the Antarctic Peninsula. We find no statistically significant trend in either continent-wide or regional meltwater volume for the 31-year period 1979–2010.
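The kind of significance test behind a "no statistically significant trend" finding can be sketched as an ordinary least-squares slope with a t-test. The series below is synthetic noise with the paper's reported mean (89 Gt/yr) and variability (σ = 41 Gt/yr), not the actual model output:

```python
import numpy as np

def trend_and_significance(x, y, t_crit=2.04):
    """OLS slope of y on x, and whether it differs from zero at ~95%
    confidence (t_crit approximates t_0.975 for ~30 degrees of freedom)."""
    slope, intercept = np.polyfit(x, y, 1)
    resid = y - (slope * x + intercept)
    sxx = np.sum((x - x.mean()) ** 2)
    se = np.sqrt(resid @ resid / (x.size - 2) / sxx)
    return slope, abs(slope / se) > t_crit

rng = np.random.default_rng(42)
years = np.arange(1979, 2011, dtype=float)      # the 1979-2010 record
melt = rng.normal(89.0, 41.0, size=years.size)  # synthetic Gt/yr series

slope, significant = trend_and_significance(years, melt)
print(f"slope = {slope:+.2f} Gt/yr per year, significant at 95%: {significant}")
```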

Monday, January 9, 2012

A paper published today in Geophysical Research Letters predicts less 21st century 'greenhouse' warming than the IPCC [transient climate response of 1.3-1.8C with a midpoint of about 1.55C vs. IPCC's 1-3C with a midpoint of 2C].

Estimates of TCR and 21st century warming are sensitive to the analysis period

Using 1851-2010 observations gives lower and less uncertain projected warming

The influence of GHGs, aerosols and natural forcings on temperature is detected

N. P. Gillett, V. K. Arora, G. M. Flato, J. F. Scinocca, and K. von Salzen

Canadian Centre for Climate Modelling and Analysis, Environment Canada, Victoria, British Columbia, Canada

Projections of 21st century warming may be derived by using regression-based methods to scale a model's projected warming up or down according to whether it under- or over-predicts the response to anthropogenic forcings over the historical period. Here we apply such a method using near surface air temperature observations over the 1851–2010 period, historical simulations of the response to changing greenhouse gases, aerosols and natural forcings, and simulations of future climate change under the Representative Concentration Pathways from the second generation Canadian Earth System Model (CanESM2). Consistent with previous studies, we detect the influence of greenhouse gases, aerosols and natural forcings in the observed temperature record. Our estimate of greenhouse-gas-attributable warming is lower than that derived using only 1900–1999 observations. Our analysis also leads to a relatively low and tightly-constrained estimate of Transient Climate Response of 1.3–1.8°C, and relatively low projections of 21st-century warming under the Representative Concentration Pathways. Repeating our attribution analysis with a second model (CNRM-CM5) gives consistent results, albeit with somewhat larger uncertainties.
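The regression-based scaling the abstract opens with can be sketched in its simplest one-signal form. The actual study uses multi-signal optimal detection with noise estimated from control runs; the series and the 2.0°C raw projection below are invented for illustration:

```python
import numpy as np

def scaling_factor(obs, model_response):
    """Least-squares beta minimizing |obs - beta * model_response|^2."""
    return float(model_response @ obs / (model_response @ model_response))

# Toy historical series in which the model over-predicts warming by 25%
model_response = np.linspace(0.0, 1.0, 160)   # modeled 1851-2010 warming
obs = 0.8 * model_response                    # "observed" warming

beta = scaling_factor(obs, model_response)
raw_projection_c = 2.0                        # hypothetical raw 21st-century warming
scaled_projection_c = beta * raw_projection_c
print(f"beta = {beta:.2f}, scaled projection = {scaled_projection_c:.1f} C")
```

A beta below 1 says the model over-predicted the historical response, so its future warming is scaled down accordingly.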

Friday, January 6, 2012

A paper published today in the Journal of Geophysical Research analyzes sea level change from 10 New Zealand tide gauges and finds the sea level rise over the past 50 years to be only 1.7 mm/yr [i.e. less than 7 inches per century] with no evidence of acceleration. [Figure 3 on abstract page shows no evidence of acceleration]

New analysis of decadal and interdecadal sea level variability in New Zealand

New data on the spatial variability of sea level change in New Zealand

John Hannah

School of Surveying, University of Otago, Dunedin, New Zealand

Robert G. Bell

National Institute of Water and Atmospheric Research, Hamilton, New Zealand

In terms of sea level data sets able to be used for long-term sea level trend analysis, the Southern Hemisphere is a data sparse region of the world. New Zealand lies in this region, presently having four (major port) data sets used for such trend analysis. This paper describes the process followed to compute new sea level trends at another six ports, each with very discontinuous tide gauge records. In each case the tide gauge has previously only been used for precisely defining an historical local Mean Sea Level (MSL) datum. The process used involved a comparison of the old MSL datum with a newly defined datum obtained from sea level data covering the last decade. A simple linear trend was fitted between the two data points. Efforts were then made to assess possible bias in the results due to oceanographic factors such as the El Niño–Southern Oscillation (ENSO) cycle, and the Interdecadal Pacific Oscillation (IPO). This was done by taking the longer time series from the four major ports and assessing the spatially coherent variability in annual sea level using the dominant principal component from an empirical orthogonal function (EOF) analysis. The average relative sea level rise calculated from these six newly derived trends was 1.7 ± 0.1 mm yr−1, a result that is completely consistent with the analysis of the long-term gauge records. Most importantly, it offers a relatively simple method of improving our knowledge of relative sea level trends in data sparse regions of the world.
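The core of the method described above, a simple linear trend fitted between an old MSL datum and a newly defined one, amounts to a two-point slope. The datum values and years below are invented for illustration; only the arithmetic mirrors the paper:

```python
def two_point_trend_mm_per_yr(datum_old_m, year_old, datum_new_m, year_new):
    """Relative sea-level trend implied by two MSL datum determinations,
    converted from meters over the interval to mm per year."""
    return (datum_new_m - datum_old_m) * 1000.0 / (year_new - year_old)

# A hypothetical datum 0.085 m higher after 50 years implies 1.7 mm/yr,
# matching the scale of the paper's 1.7 +/- 0.1 mm/yr result.
trend = two_point_trend_mm_per_yr(1.200, 1955, 1.285, 2005)
print(f"{trend:.1f} mm/yr")
```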