Thursday, May 31, 2012

A paper published today in The Holocene finds that the elevation of the treeline in the Rocky Mountains of Wyoming was higher than at present from ~9000 to 6000 years ago, indicating the climate was warmer during that period than it is today. The paper adds to thousands of others indicating that the current warming period is neither unprecedented nor unusual compared to natural warming in the past.

Abstract

Future climate projections predict warming at high elevations that will impact treeline species, but complex topographic relief in mountains complicates ecologic response, and we have a limited number of long-term studies examining vegetation change related to climate. In this study, pollen and conifer stomata were analyzed from a 2.3 m sediment core extending to 15,330 cal. yr BP recovered from a treeline lake in the Rocky Mountains of Wyoming. Both pollen and stomata record a sequence of vegetation and climate change similar in most respects to other regional studies, with sagebrush steppe and lowered treeline during the Late Pleistocene, rapid upward movement of treeline beginning about 11,500 cal. yr BP, treeline above modern between ~9000 and 6000 cal. yr BP, and then moving downslope ~5000 cal. yr BP, reaching modern limits by ~3000 cal. yr BP. Between 6000 and 5000 cal. yr BP sediments become increasingly organic and sedimentation rates increase. We interpret this as evidence for lower lake levels during an extended dry period with warmer summer temperatures and treeline advance. The complex topography of the Rocky Mountains makes it challenging to identify regional patterns associated with short term climatic variability, but our results contribute to gaining a better understanding of past ecologic responses at high elevation sites.

Wednesday, May 30, 2012

In his book Chill, environmentalist and scientist Peter Taylor outlines why the claimed scientific consensus over the causes and effects of climate change is not what it appears. Taylor outlines in detail why computer models of climate are critically flawed and fail to account for natural variation due to solar activity, clouds, atmospheric cycles, and ocean oscillations. Taylor shows that the alleged contribution of increasing carbon dioxide levels to the warming of the last century is comparatively small relative to the calculated warming due to decreased cloud cover and increase in solar activity over the same period. In fact, solar effects modulated by cloud cover changes alone can account for all of the observed warming, without any alleged influence of greenhouse gases.

Synopsis from Amazon:

Although the world's climate has undergone many cyclical changes, the phrase 'climate change' has taken on a sinister meaning, implying catastrophe for humanity, ecology and the environment. We are told that we are responsible for this threat, and that we should act immediately to prevent it. But the apparent scientific consensus over the causes and effects of climate change is not what it appears. "Chill" is a critical survey of the subject by a committed environmentalist and scientist. Based on extensive research, it reveals a disturbing collusion of interests responsible for creating a distorted understanding of changes in global climate. Scientific institutions, basing their work on critically flawed computer simulations and models, have gained influence and funding. In return they have allowed themselves to be directed by the needs of politicians and lobbyists for simple answers, slogans and targets. The resulting policy - a 60 percent reduction of greenhouse-gas emissions by 2050 - would have a huge, almost unimaginable, impact upon landscape, community and biodiversity. On the basis of his studies of satellite data, cloud cover, ocean and solar cycles, Peter Taylor concludes that the main driver of recent global warming has been an unprecedented combination of natural events. He urges a shift away from mistaken policies that attempt to avert inevitable natural changes, to an adaptation to a climate that may turn significantly cooler.

The main elements of Taylor's alternative explanation of the causes of climate change are as follows:

§ Variations in the sun’s output of visible light and UV appear capable of producing significant trends, as witnessed by the Medieval Warm Period from 800 to 1300, the Little Ice Age from 1400 to 1700, the recent warming from the 1800s, and the leveling off and cooling from 2000. (Carbon dioxide levels, by comparison, fail to correlate with any temperature change except for the period from 1980 to 2000.)

§ Variations in the sun’s UV light output have recently been shown to vary significantly over the 11-year sunspot cycle and have implications for the polar vortex and the jet stream.

§ Variations of the solar spectrum appear to have the capacity to affect cloud cover.

§ The pulsed nature of solar energy can be seen to affect ocean surface temperatures over the 11-year cycle, and possibly also over the longer cycles.

§ The oceans are subject to well-documented oscillations, chief of which are the ENSO (El Niño-Southern Oscillation), the North Atlantic Oscillation, the Arctic Oscillation and the Pacific Decadal Oscillation, which are correlated to solar cycles.

§ The greater part of the sun’s energy input (as UV and visible light) is received and stored in the tropical oceans and then moved by the aforementioned currents and atmospheric processes to the poles.

§ Warming on land is strongly related to the transfer of heat by wind and rainfall from the ocean.

§ Since 1950, the Pacific Decadal Oscillation phases coincided with the 1945 to 1978 global cooling and the 1978 to 2000 warming.

§ The rapid Arctic ice loss between 2000 and 2007 can be explained by the return of the Arctic Oscillation's peak warm phase and is comparable with the Arctic warming of 1940, as part of the 60 to 70 year cycle. (There is evidence that the Arctic ice has melted several times before.) This is also associated with increased Arctic cloud cover, which, at the poles, acts as an insulator, further increasing ice melt. (Low-level cloud cover in the tropical and sub-tropical regions generally acts in the opposite manner, causing cooling.)

§ There is strong evidence that the major part of the 1980 to 2000 warming was caused by cloud thinning and increased UV and visible light reaching the ocean and land surface. The cloud patterns show evidence of phase changes associated with ocean oscillations as well as peaks and troughs of the solar cycle (possibly mediated by cosmic rays, which seed cloud formation).

For the e-car to be more than a plaything for the rich, it has to succeed in the mass market.

The Chevy Volt was crowned European Car of the Year for 2012 on March 5. But the celebration was muted by GM's decision—three days earlier—to halt the Volt production line due to a lack of consumer demand. Nissan's Leaf, which won the accolade in 2011, has also fared poorly in the market. Great cars, terrible sales—here is your e-car paradox.

The electric car has become a political litmus test. With billions of federal dollars spent to motivate further billions in private investment, the financial stakes are high, as are the environmental and energy-security implications. But in the tug of war between the caricatures of lefty liberal greens and righty conservative petroheads, love-it or hate-it are the only two options. Where does this leave the supportive, but objective, centrist?

Answer: dismayed with current efforts and desperate for broader solutions. For the electric car to be anything more than a plaything for rich environmentalists—and have any impact on energy security or the environment—it has to succeed in the mass market.

Unfortunately, manufacturers are approaching the electric car as another new product when what they really need is a new business model. The problem is not the cars themselves (which are technology marvels), or even the availability of charging infrastructure (which is improving thanks to government largess). Unless two neglected factors—battery depreciation and power-grid management—are addressed, costly efforts at improving e-cars and charge spots will fail in the mass market.

First, while much attention has been paid to the appeal of electric cars to new buyers, an equally important question will be their appeal to used-car buyers. Here the major determinant is battery technology. Electric car batteries are extremely expensive—they can account for one-third of the cost of the vehicle—and offer limited driving range. A steady stream of expected advances means they will improve every year. This is good news, but only for those who haven't yet purchased an electric vehicle.

With electric cars, the most expensive part of the car is also the one that depreciates fastest. The usable life span of batteries (their ability to hold a charge degrades with use) and the pace of technology improvement (lower cost and greater range of newer batteries) reduce the relative value of used batteries and depress the resale proposition for used electric cars.

While some car buyers may give little thought to resale value, or plan on refurbishing their cars for many years, for mainstream buyers resale value is crucial: Lousy resale value means goodbye to the mainstream market. (Note to Tesla owners: You are not the mass market.)

Second, while intense public attention and investment have been focused on rolling out plug-in charge spots, these efforts have been decoupled from investment in the smart-grid technology needed to assure that power generation and distribution can actually support mass charging. As long as only a handful of drivers plug in each morning, the current grid will hold. But if 5% of cars in Los Angeles County were to plug in simultaneously, they could place a 750-megawatt load on California's already strained grid, equivalent to the generating capacity of two midsize power plants. Unless the power infrastructure challenge is addressed in advance, the very success of the electric car will drive its failure.
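The 750-megawatt figure above is easy to sanity-check. As a rough sketch (the vehicle count for Los Angeles County and the average per-car charging draw are illustrative assumptions, not figures from the article):

```python
# Back-of-envelope check of the ~750 MW simultaneous-charging load.
# Assumptions (illustrative only): roughly 7.5 million registered
# vehicles in LA County, and ~2 kW average draw per plugged-in car.
vehicles = 7.5e6
share_plugged_in = 0.05      # 5% of cars charging at once
draw_per_car_kw = 2.0        # assumed average charging draw

load_mw = vehicles * share_plugged_in * draw_per_car_kw / 1000
print(f"Simultaneous charging load: {load_mw:.0f} MW")  # → 750 MW
```

With two midsize power plants at roughly 375 MW each, the arithmetic matches the article's comparison.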

Solving the e-car puzzle won't happen unless private firms and governments consider the bigger picture when strategizing their investments. An intriguing example is the approach of Better Place LLC in Israel and Denmark, which applies the familiar cellphone model to its electric cars, selling multiyear contracts based on miles of driving (analogous to minutes) to finance its infrastructure investments and battery depreciation.

The Better Place approach is but one alternative. One thing is clear: The success of electric cars hinges on the successful alignment of the entire electric-car ecosystem. Uncoordinated investment in the individual pieces is a recipe for failure.

Mr. Adner is a professor of strategy at Dartmouth's Tuck School of Business and the author of "The Wide Lens: A New Strategy for Innovation" (Portfolio/Penguin, 2012).

The green lobby picks its next fossil fuel target. WSJ.COM 5/29/12

The media are finally catching up to America's shale natural gas boom, with even Fortune magazine waddling in with a cover story. But the bigger recent news is that one of the most powerful environmental lobbies, the Sierra Club, is mounting a major campaign to kill the industry.

The battle plan is called "Beyond Natural Gas," and Sierra Club executive director Michael Brune announced the goal in an interview with the National Journal this month: "We're going to be preventing new gas plants from being built wherever we can." The big green lobbying machine has rolled out a new website that says "The natural gas industry is dirty, dangerous and running amok" and that "The closer we look at natural gas, the dirtier it appears; and the less of it we burn, the better off we will be." So the goal is to shut the industry down, not merely to impose higher safety standards.

This is no idle threat. The Sierra Club has deep pockets funded by liberal foundations and knows how to work the media and politicians. The lobby helped to block new nuclear plants for more than 30 years, it has kept much of the U.S. off-limits to oil drilling, and its "Beyond Coal" campaign has all but shut down new coal plants. One of its priorities now will be to make shale gas drilling anathema within the Democratic Party.

A Consol Energy Horizontal Gas Drilling Rig explores the Marcellus Shale outside the town of Waynesburg, Penn., on April 13.

The political irony is that not too long ago the Sierra Club and other greens portrayed natural gas as the good fossil fuel. The Sierra Club liked natural gas so much (and vice versa) that from 2007-2010 the group received $26 million in donations from Chesapeake Energy and others in the gas industry, according to an analysis by the Washington Post. Some of that money was for the Beyond Coal campaign.

One reason for this once-mutual affection is that natural gas produces much less carbon emissions than does coal—and the Sierra Club claims to want fewer such emissions.

The federal Energy Information Administration reports that in 2009 "the 4% drop in the carbon intensity of the electric power sector, the largest in recent times, reflects a large increase in the use of lower-carbon natural gas because of an almost 50% decline in its price." The Department of Energy reports that natural gas electric plants produce 45% less carbon than coal plants, though newer coal plants are much cleaner.

Researchers at Harvard's School of Engineering and Applied Sciences found that electric power plants reduced their greenhouse gases by 8.76% in 2009 alone. Most of the carbon reduction was driven not by mandates or regulation but by the economics of lower gas prices. The lead researcher, professor Michael McElroy, says: "Generating one kilowatt-hour of electricity from coal releases twice as much CO2 to the atmosphere as generating the same amount from natural gas, so a slight shift in the relative price of coal and natural gas can result in a sharp drop in carbon emissions."

Even the liberal Union of Concerned Scientists admits benefits from burning natural gas, finding that the resulting drop in emissions from sulfur, mercury and nitrogen oxides "translate into public health benefits, as these pollutants have been linked with problems such as asthma, bronchitis, lung cancer and heart disease for hundreds of thousands of Americans."

So why is the Sierra Club suddenly portraying natural gas as a villain? The answer surely is the industry's drilling success. The greens were happy to support natural gas as a "bridge fuel to the 21st century" when it cost $8 or more per million BTUs and seemed to be in limited domestic supply.

But now that the hydraulic fracturing and shale revolution has sent gas prices down to $2.50, the lobby fears natural gas will come to dominate U.S. energy production. At that price, the Sierra Club's Valhalla of wind, solar and biofuel power may never be competitive. So the green left has decided it must do everything it can to reduce the supply of gas and keep its price as high as possible.

The losers if this effort succeeds would be the millions of Americans who are benefitting from the shale boom. Some 600,000 jobs in the natural gas industry could be vulnerable, according to an analysis by the consulting firm IHS Global Insight. That's almost eight times more jobs than are employed by the wind industry.

But the losers would also include electricity consumers paying lower prices at home; the steel workers in Youngstown, Ohio who have been rehired to make pipe for gas drillers in the Marcellus Shale; and the thousands of high-paying jobs in chemicals, fertilizer and other manufacturing that are returning to the U.S. because natural gas prices are so much lower.

The Sierra Club campaign underscores that the modern green agenda is about far more than clean air and water and protecting wildlife. The real goal is to ban all fossil fuels—regardless of economic cost. It's hard to imagine a campaign that poses a greater threat to the U.S. economy, energy security and American health.

Tuesday, May 29, 2012

A paper from a paleoclimatology workshop finds that the southern dome of Greenland did not melt away during the extreme natural climate change of the "Eemian interglacial (125,000 years ago), when annual mean temperatures over Greenland were [about] 5°C warmer than now for some millennia [thousands of years]." The author asks, "will [the southern dome of Greenland] melt away for the first time in 400,000 years?" and concludes, "Probably not." The IPCC claims [non-existent] positive feedback from water vapor could lead to 3°C warming from doubled CO2 levels, but lessons from the geological past show that even if the globe warmed 2°C more to 5°C warmer than the present for thousands of years, neither the northern nor southern domes of Greenland would melt away.

Abstract: Recent years’ rapid melting of the Greenland ice sheet has shown that the southern dome is the ice sheet’s most vulnerable part. The southern ice sheet dome, the area south of c. 67°N, is a highland ice cap with its base c. 500 m a.s.l. It contains c. 15% of the Greenland ice sheet’s volume, equal to c. 1 m global sea level, and is characterised by very high accumulation and melting. Two of the most active outlets from the ice sheet, Jakobshavn Isbræ and Helheim Gletscher drain the saddle between the northern and southern ice sheet domes.

Can the southern dome’s response to past warming give us a clue to its fate in the future? ODP borings on the shelf have shown that the ice dome has existed, on and off, at least since the Miocene. Recent results from the DYE 3 ice core and other sources indicate that the dome melted away, and gave way to forested mountains for the last time during marine isotope stage 11, c. 400,000 years ago. The southern dome, and of course the northern also, persisted in a reduced form during the warm Eemian interglacial (c. 125,000 years ago), when annual mean temperatures over Greenland were c. 5°C warmer than now for some millennia. During the last ice age the southeast coast of Greenland was one of the areas of major ice sheet growth, reaching the shelf edge at the last glacial maximum, c. 20,000 years ago, as shown by bathymetric studies. During the Holocene thermal maximum, c. 8,000 years ago, when annual mean temperatures were c. 2°C warmer than now for some thousands of years, modelling and GPS altimetry show that the southern dome was the most sensitive part of the ice sheet, retreating as much as 80 km behind its present front in some areas. After this, during the neoglacial the ice margin readvanced. In spite of the large scale changes in ice cover in this area, the Holocene isostatic history is peculiarly muted and characterised by low uplift. This can be interpreted in several ways, but does show an abnormal ice load history, when compared to other sectors of the ice sheet.

In general, the variable behaviour of the southern dome through the geological record contrasts with that of the much more resilient northern dome. Judged from this, we can expect a more direct and vigorous response to warming in the southern dome than in the much larger northern dome, but will it melt away for the first time in 400,000 years? – Probably not.

Data centers now consume about 1.3% of all global electricity.

Before Facebook's recent initial public offering, the media obsessed over superlatives. It was the largest-ever IPO for a U.S. technology company. It was the third-largest in U.S. history. And now the obsession is over the company's lackluster revenue prospects and possible misconduct by investment bankers involved in the offering.

Missing here is any awareness of the enormous quantities of electricity Facebook and other data-intensive technology companies require. Those requirements expose a fundamental mismatch between the high-power-density world of Big Data and the low-density electricity production inherent in most renewable energy projects.

In documents filed with the Securities and Exchange Commission on Feb. 1, Facebook said that it stores more than 100 petabytes of information. (That's 100 million gigabytes.) Facebook spreads that gargantuan quantity of data among a handful of warehouse-size data centers filled with servers located in Virginia, California and Oregon. The company's new data center is a 300,000 square-foot facility in Prineville, Ore., that draws 28 megawatts, enough power for about 28,000 homes.

That's not unusual. The power needed by data centers has been a hot topic for more than a decade as local electricity grids have been forced to adapt to huge new loads. Google alone reports that it operates 11 data centers in six states and five foreign countries that require some 260 megawatts of power, enough for 260,000 homes.
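Both MW-to-homes equivalences above rest on the same rule of thumb: an average U.S. home draws about 1 kW of continuous power (roughly 8,760 kWh per year). A minimal sketch of the conversion, assuming that rule:

```python
# MW-to-homes conversion implied by the article's figures,
# assuming ~1 kW average continuous draw per U.S. home.
AVG_HOME_KW = 1.0  # rule-of-thumb average household draw

def homes_powered(load_mw: float) -> float:
    """Number of average homes a given load could supply."""
    return load_mw * 1000 / AVG_HOME_KW

print(homes_powered(28))   # Facebook's Prineville center → 28000.0
print(homes_powered(260))  # Google's reported total      → 260000.0
```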


As more computing moves into the "cloud"—the network of data centers that deliver information and software to our mobile devices and computers—electricity use is soaring. Data centers now consume about 1.3% of all global electricity. That amount of energy, about 277 terawatt-hours per year, exceeds the electricity use of dozens of countries, including Australia and Mexico.
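The two figures above are mutually consistent: if 277 TWh is 1.3% of the world total, the implied global electricity consumption is about 21,000 TWh per year, in line with published estimates for the early 2010s. A quick check:

```python
# Consistency check on the article's data-center energy figures.
datacenter_twh = 277   # stated annual data-center consumption
share = 0.013          # stated 1.3% of global electricity

implied_global_twh = datacenter_twh / share
print(f"Implied global electricity use: {implied_global_twh:,.0f} TWh/yr")
# → roughly 21,000 TWh/yr
```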

And that quantity of energy will continue to grow. Intel expects the number of devices connected to the Internet—ranging from smartphones to GPS-enabled locators on shipping containers—to grow to 15 billion by 2015 from 2.5 billion today.

Last month, Greenpeace issued a report called "How Clean is Your Cloud?" The environmental group graded a series of technology companies, including Facebook, Apple, Dell, Amazon and others, on the percentage of what it calls "dirty energy" used by their data centers. Greenpeace—which, of course, has a Facebook page—gave the social-media company a "D" for what it calls "energy transparency." It is also claiming to have convinced Facebook to "unfriend" coal-fired electricity.

Never mind that 40% of all global electricity production comes from coal. Let's consider what the "clean energy" footprint of one of these big data centers might look like.

Apple has touted its plan to use solar energy to help run its massive new data center in Maiden, N.C. But in a recent blog post (perspectives.mvdirona.com) titled "I Love Solar Power But," James Hamilton, a vice president and engineer on Amazon's Web services team, calculated that the 500,000 square-foot facility would need about 6.5 square miles of solar panels.

He noted that setting aside that kind of space in densely populated regions, where many data centers are built, is "ridiculous" and would be particularly difficult because the land couldn't have any trees or structures that could cast shadows on the panels.

Wind? An average wind-energy project has an electricity-generating capacity of about two watts per square meter. Even assuming that a wind project produces electricity 100% of the time (it won't), Facebook's data center in Prineville would need a wind project covering about 14 million square meters, nearly 5.5 square miles, or about four times the size of New York City's Central Park.
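The wind-farm footprint above follows directly from the stated 2 W/m² capacity density. A short reproduction of the arithmetic (the Central Park area is a standard figure, ~3.41 km², not given in the article):

```python
# Reproduce the wind-farm footprint estimate for the 28 MW
# Prineville data center at the article's ~2 W/m^2 density.
SQ_M_PER_SQ_MILE = 2.59e6    # square meters per square mile
CENTRAL_PARK_SQ_M = 3.41e6   # Central Park is ~3.41 km^2 (assumed figure)

load_w = 28e6                # 28 MW data-center load
density_w_per_m2 = 2.0       # average wind project, per the article

area_m2 = load_w / density_w_per_m2
print(f"{area_m2:,.0f} m^2")                               # 14,000,000 m^2
print(f"{area_m2 / SQ_M_PER_SQ_MILE:.1f} square miles")    # ~5.4
print(f"{area_m2 / CENTRAL_PARK_SQ_M:.1f}x Central Park")  # ~4.1
```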

The mismatch between the power demands of Big Data and the renewable-energy darlings of the moment are obvious. U.S. data centers are now consuming about 86 terawatt-hours of electricity per year, or about 43 times as much electricity as is produced by all the solar-energy projects in America.
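The 43x ratio implies total U.S. solar generation of only about 2 TWh per year at the time, which is what makes the mismatch so stark:

```python
# Implied U.S. solar output from the article's stated ratio.
us_datacenter_twh = 86   # annual U.S. data-center consumption
ratio = 43               # data-center demand vs. solar generation

solar_twh = us_datacenter_twh / ratio
print(f"Implied U.S. solar generation: {solar_twh:.0f} TWh/yr")  # → 2
```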

"Clean energy" is a great friend for Facebook, Apple and every other energy consumer in America—as long as those consumers don't use much energy at all.

Mr. Bryce is a senior fellow at the Manhattan Institute. His latest book is "Power Hungry: The Myths of 'Green' Energy and the Real Fuels of the Future" (Public Affairs, 2010).

Friday, May 25, 2012

The Obama green energy economic boom appears to be in full swing, with US solar panel manufacturers capturing a whopping 3% of world market share in 2011. To allegedly protect these vital interests and taxpayer beneficiaries, the administration has slapped a stiff 31% tariff on Chinese solar panels, starting a trade war that now places at risk a $3 billion market annually for US companies that supply and license Chinese manufacturers.

Beijing mounted a spirited response to U.S. efforts to impose tariffs on China's solar-panel industry, with officials accusing Washington of illegally helping its domestic industry and Chinese solar companies teaming up to fight the levies.

China's Commerce Ministry said in a brief statement on Thursday that its investigations of six clean-energy projects in five U.S. states had uncovered violations of international trade law. The announcement, which described the findings as preliminary, didn't offer details or any potential Chinese response.

The Chinese action followed the White House's decision last week to impose 31% tariffs on some solar panels produced in China, alleging they had been sold below cost, or "dumped." Beijing had already termed the tariffs "unjustified" and a reflection of "the U.S.'s tendency toward trade protectionism."

The chief executives of four major Chinese solar-power equipment producers said at a Shanghai news conference Thursday that they had allied to fight Washington's allegations, saying the Chinese industry is beneficial to the U.S. The alliance said U.S. companies are major suppliers to the Chinese industry and that U.S. consumers benefit from the lower prices that result from the industry's concentration and competitiveness.

U.S. trade challenges have prompted Chinese manufacturers to announce tens of millions of dollars in write-downs and sparked worries that the row could increase the chances of a trade war.

The U.S. and China are at loggerheads on other trade issues as well, including over Beijing's tight control of its market for rare-earth elements critical to high-tech manufacturing.

"There's a certain unhealthy development" in the solar-power industry, Miao Liansheng, chief executive of Yingli Green Energy Holding Co., said in an interview. He said $2.2 billion in supply deals Yingli signed earlier this year with U.S. companies could be at risk. He opposes retaliation by Beijing, he said.

At risk is a $3 billion market annually for U.S. companies that supply and license Chinese manufacturers, said Sun Guangbin, a top official at the government-backed China Chamber of Commerce for Import and Export of Machinery and Electronic Product, which is sponsoring the group.

Mr. Sun said low solar-equipment prices reflect the Chinese industry's size and efficiencies, as well as difficulties like oversupply and brand piracy, not unprofitable grabs for market share.

U.S.-listed shares of Chinese producers have slumped in recent days, and some of the companies have begun tallying costs associated with Washington policy. Suntech on Wednesday reported a first-quarter loss. The company's revenue dropped by more than half from a year earlier as prices and sales fell. Suntech said it booked $19.2 million in potential costs associated with the U.S. countervailing and antidumping tariffs.

Suntech CEO Shi Zhengrong said his company will sustain sales momentum in the U.S. by supplying the market with panels made in other countries. "We have a global supply-chain operation," he said in an interview Thursday. Suntech is "disappointed" by the U.S. action but "fully prepared," he said.

Chinese manufacturers shipped nearly half the world's solar panels last year, representing more than 10,900 megawatts, while U.S. suppliers shipped just 3%, or about 780 megawatts, according to research by Paula Mints, an analyst at Navigant Consulting in Palo Alto, Calif.

U.S. policy isn't the only challenge for China's solar industry. A study published last year by three scholars at George Washington University estimated that Chinese companies will be able to make 38% more product than they can sell this year and predicted that the question of whether Chinese supply and demand can come into balance will depend on the bite of U.S. import policy and installation of the equipment in China.

Shawn Qu, CEO of Ontario-based Canadian Solar, which manufactures in China, suggested that U.S. consumers will face higher prices if the U.S. tariffs hold. "Ultimately it's the off taker, the customer, who pays," he said. The tariffs "will have a significant impact to, not only China-based manufacturers, but to the U.S. solar industry as well."

Thursday, May 24, 2012

Famed hurricane forecaster Dr. William Gray has issued his hurricane season forecast for 2012 and predicts below-average probability for major hurricanes making landfall. Dr. Gray is Professor Emeritus, Dept of Atmospheric Science, Colorado State University and a skeptic of man-made global warming. Dr. Gray contends that the global ocean's natural Meridional Overturning Circulation (MOC) "is the likely cause of most of the global warming that has been observed since the start of the industrial revolution (~1850) and for the more recent global warming that has occurred since the mid-1970s." In his paper, "Climate Change: Driven by the Ocean not Human Activity," Dr. Gray notes that observations show tropospheric water vapor has decreased with increased CO2, the opposite of the assumptions programmed into climate models, thus, "The predicted global warming due to a doubling of CO2 has been erroneously exaggerated by the [climate models] due to this water vapor feedback."

Abstract: This paper discusses how the variation in the global ocean’s Meridional Overturning Circulation (MOC) resulting from changes in the Atlantic Thermohaline Circulation (THC) and deep water Surrounding Antarctica Subsidence (SAS) can be the primary cause of climate change. Variation in the MOC (= THC + SAS) is the likely cause of most of the global warming that has been observed since the start of the industrial revolution (~1850) and of the more recent global warming that has occurred since the mid-1970s. Changes of the MOC since 1995 are hypothesized to have led to the cessation of global warming since 1998 and to the beginning of a weak global cooling that has occurred since 2001. This weak cooling is projected to continue for the next couple of decades.

Recent GCM global warming scenarios assume that a slightly stronger hydrologic cycle (due to the increase in CO2) will cause additional upper-level tropospheric water vapor and cloudiness. Such vapor-cloudiness increases are assumed to allow the small initial warming due to increased CO2 to be unrealistically multiplied 2-4 or more times. This is where most of the global warming from the GCMs comes from – not the warming resulting from the CO2 increase by itself but the large extra warming due to the assumed increase of upper tropospheric water vapor and cloudiness. As CO2 increases, it does not follow that the net global upper-level water vapor and cloudiness will increase significantly. Observations of upper tropospheric water vapor over the last 3-4 decades from the National Centers for Environmental Prediction/National Center for Atmospheric Research (NCEP/NCAR) reanalysis data and the International Satellite Cloud Climatology Project (ISCCP) data show that upper tropospheric water vapor appears to undergo a small decrease while Outgoing Longwave Radiation (OLR) undergoes a small increase. This is opposite to what has been programmed into the GCMs. The predicted global warming due to a doubling of CO2 has been erroneously exaggerated by the GCMs due to this water vapor feedback.

CO2 increases without positive water vapor feedback could only have been responsible for about 0.1-0.2°C of the 0.6-0.7°C global mean surface temperature warming that has been observed since the early 20th century. Assuming a doubling of CO2 by the late 21st century (assuming no positive water vapor feedback), we should likely expect to see no more than about 0.3-0.5°C global surface warming and certainly not the 2-5°C warming that has been projected by the GCMs.

We anticipate that the 2012 Atlantic basin hurricane season will have reduced activity compared with the 1981-2010 climatology. The tropical Atlantic has anomalously cooled over the past several months, and it appears that the chances of an El Niño event this summer and fall are relatively high. We anticipate a below-average probability for major hurricanes making landfall along the United States coastline and in the Caribbean. However, coastal residents are reminded that it only takes one hurricane making landfall to make it an active season for them, and they need to prepare the same for every season, regardless of how much activity is predicted.

Birth rates are dropping all over the world, often below replacement rates. But not in the U.S.

Look around you. For most nations of the world, birth and fertility rates have never fallen so far, so fast, so long, so surprisingly, all across the globe. Except for America.

Seen globally, the population explosion—or what Stanford's Paul Ehrlich called "the population bomb" in the 1960s—is now stone-cold dead. The ramifications are enormous economically, geopolitically, culturally and personally. For one, the United States will become stronger than ever in the games nations play.

Every other major modern nation and every developing country has low or falling birth rates. Japan and Poland see 1.3 children per woman, Brazil and China 1.9, Pakistan 3.6 (down from 6.6 three decades ago). American fertility rates are relatively high, at nearly 2.1.

Having children is an affirmative act, so it's little surprise that surveys—Gallup, Harris and others—show Americans to be the most optimistic nation in the world. (Israel, too, is an optimistic nation with a sense of mission and high birth rates.)

Then there's the effect of immigration. According to the United Nations and the U.S. Census Bureau, the U.S. takes in more immigrants than the rest of the world combined. Think Albert Einstein, Madeleine Albright, Andy Grove, Albert Pujols, Sergey Brin, I.M. Pei or David Hockney.

Drive in the suburbs and exurbs of many major American cities and you will see McMansion homes with three, four or even five children—McPlenty—unheard of anywhere else. American couples can choose to have many children because the U.S. is one of the world's few suburban nations. In suburban settings, some affluent parents are deciding that for a decade or so raising a large family is more important than having two earners.

All this and more yields an America that is projected to have 400 million people in 2050, up from 310 million today and possibly on the way to 500 million by 2100. This may not quite play out—immigration from Mexico will likely fall as Mexican fertility drops off—but the trend lines are far stronger in the U.S. than elsewhere.


According to one U.N. projection, the world population will fall to three billion or four billion by 2300 from seven billion today. That steep drop in birth rates includes rapidly developing countries such as India, China and Brazil, plus many Arab and Muslim countries. In these countries, birth rates will soon be, or already are, below "replacement levels"—meaning fewer than 2.1 children per woman, the number sufficient to "replace" both parents and account for those children who don't make it to reproductive age. Of developing countries, 33 of 155 have below-replacement rates today, and more are on the way.

Why is this so important to America? A hefty and growing population can yield power and influence. It's been a long time since a nation with a small population influenced how the world works—think the 16th-century Dutch and Portuguese.

Size also yields vast economies of scale. As population grows, through fertility and immigration, a healthy housing market is inevitable. It's either that or tens of millions of Americans sleeping on the streets. Bet on the boom.

There's corporate growth too, across industries. Imagine an American corporation, XYZ, that wants to start doing business in Thailand. Only in a polyglot nation like America can XYZ search out and find the adult children of Thai immigrants who know America inside and out but also know Thai customs and language.

Few if any nations have all these advantages. The demography in play guarantees that the 21st century, like the 20th, will be an "American Century."

Mr. Wattenberg, a senior fellow at the American Enterprise Institute and the Hudson Institute, is working on his 15th book, his third about population, from which this article is drawn.

Wednesday, May 23, 2012

The IPCC manufactures climate alarm by assuming CO2 controls water vapor to produce a runaway positive feedback system. Physicist Clive Best has posted his new paper showing that water vapor feedback is instead strongly negative, based on both the Faint Sun Paradox and a comparison of 5600 weather stations in the global CRUTEM4 temperature and humidity database. Peer-reviewed publications by Paltridge and others also find water vapor feedback is strongly negative. Without positive water vapor feedback, the IPCC's case for catastrophic man-made climate change collapses.

Abstract: Positive linear climate feedback for combined water effects is shown to be incompatible with the Faint Sun Paradox. In particular, feedback values of ~2.0 W/m2/K favored by current GCM models lead to non-physical results at the solar radiation levels present one billion years ago. A simple model is described whereby Earth-like planets with large liquid water surfaces can self-regulate temperature for small changes in incident solar radiation. The model assumes that reflective cloud cover increases while normalized greenhouse effects decrease as the sun brightens. Net water feedback of the model is strongly negative. Direct evidence for negative water feedback is found in CRUTEM4 station data by comparing temperature anomalies for arid regions (deserts and polar regions) with those for humid regions (mainly the saturated tropics). All 5600 weather stations were classified according to the Köppen-Geiger climatology [9]. Two separate temperature anomaly series from 1900 to 2011 were calculated for each region. A clear difference in temperature response is observed. Assuming the difference is due to atmospheric water content, a water feedback value of -1.5 +/- 0.8 W/m2/K can be derived.

I. INTRODUCTION

The Faint Sun Paradox was first proposed by Carl Sagan [1], who pointed out that the geological evidence that liquid oceans existed on Earth 4 billion years ago appears incompatible with a solar output 30% dimmer than today. The sun is a main sequence star whose output is known to increase slowly with age. The total change in solar radiation over this long period turns out to be huge, ~87 W/m2. It has been argued that an enhanced greenhouse effect due to very high CO2 and/or CH4 concentrations could resolve this paradox [2]. However, recent geological evidence does not support CO2 as being responsible; instead the authors propose a greater ocean surface, leading to a lower albedo, as a likely solution [3]. Others have suggested that high cirrus clouds effectively warmed the Earth [4]. Although the atmosphere must have been very different before photosynthesis began, the presence of large liquid oceans still implies that clouds and water vapor played a similar role in the Earth's energy balance then as they do today.

II. MODELS

All current IPCC models adopt net positive feedbacks for water vapor and clouds [6]. A doubling of CO2 increases TOA radiative forcing by ~3.6 W/m2, causing a baseline surface temperature rise of about 1°C to restore global energy balance through increased outgoing infrared radiation [5]. GCM models predict larger temperature rises, ranging from 2-5°C, due to these positive feedbacks. What do positive feedbacks imply for the Faint Sun Paradox? For a change in forcing ΔS, a feedback strength F, and a baseline response G0, the temperature rise ΔT is given by

ΔT = (ΔS + F·ΔT)·G0

Black-body radiation from the Earth's surface is the primary negative feedback to any temperature rise ΔT.

ΔT = ΔS/(1/G0 - F), where 1/G0 = 4σT³ = 3.75 W/m2/K at the effective radiating temperature T = 255 K.

GCM model feedbacks F range from +1.6 to +2.5 W/m2/K, with an average positive feedback of ~2.0 W/m2/K [6].
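To make the amplification concrete, the linear relation above can be evaluated with the numbers quoted in the text. This is an illustrative sketch, not code from the paper: it plugs the Planck response 1/G0 ≈ 3.75 W/m2/K and the CO2-doubling forcing ΔS = 3.6 W/m2 into ΔT = ΔS/(1/G0 - F) for the quoted feedback range.

```python
# Illustrative arithmetic (not from the paper): linear feedback amplification
# dT = dS / (1/G0 - F), using the Planck response and forcing quoted in the text.
SIGMA = 5.67e-8                # Stefan-Boltzmann constant, W/m^2/K^4
G0_INV = 4 * SIGMA * 255.0**3  # 1/G0 = 4*sigma*T^3 ~ 3.75 W/m^2/K at T = 255 K
DS = 3.6                       # TOA forcing from a CO2 doubling, W/m^2

for F in (0.0, 1.6, 2.0, 2.5):  # no feedback vs. the quoted GCM range
    dT = DS / (G0_INV - F)
    print(f"F = {F:+.1f} W/m2/K -> dT = {dT:.2f} K")
```

With F = 0 this recovers the ~1°C baseline response; feedbacks in the quoted GCM range roughly double or triple it.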

The sun has brightened 30% over the last 4 billion years, and current average incident solar radiation is ~342 W/m2. Assuming a slow linear increase of solar radiation with time yields a net forcing increase of ~0.02 W/m2 every 1 million years. The temperature response to this forcing has been calculated for feedback values F = -2, 0, and +2 W/m2/K by integrating backwards 4 billion years from current temperatures. The results are shown in Figure 1.
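The backward integration just described can be sketched numerically. The following is a reconstruction under stated assumptions, not the author's code: it steps the linearized response ΔT = ΔS/(4σT³ - F) back in 1 Myr increments, starting from the effective radiating temperature (~255 K, consistent with 1/G0 = 3.75 W/m2/K), and stops if the Planck term 4σT³ falls to the feedback value F, where the linear response becomes singular.

```python
# Sketch (assumptions noted above): integrate temperature backwards 4 Gyr
# under a linear solar-forcing decline of ~87 W/m^2 total, for several
# linear feedback strengths F (W/m^2/K).
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W/m^2/K^4

def integrate_back(F, T0=255.0, total_gyr=4.0, total_forcing=87.0):
    """Step back in 1 Myr increments; return the temperature history (K)."""
    n_steps = int(total_gyr * 1000)
    dS = -total_forcing / n_steps  # forcing change per Myr (negative backwards)
    T, history = T0, [T0]
    for _ in range(n_steps):
        planck = 4 * SIGMA * T**3  # black-body restoring term, W/m^2/K
        if planck - F <= 0:        # positive feedback overwhelms Planck term
            break                  # -> singular linear response
        T += dS / (planck - F)     # linearized temperature step
        history.append(T)
    return history

for F in (-2.0, 0.0, 2.0):
    h = integrate_back(F)
    print(f"F = {F:+.0f}: stepped back {len(h) - 1} Myr, final T = {h[-1]:.1f} K")
```

With F = +2 the loop terminates early at a singularity, while F = 0 and F = -2 run the full 4 Gyr with only modest cooling, mirroring the qualitative behavior the text describes for Figure 1.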

It is apparent that a simple linear positive feedback of +2 W/m2/K leads to unphysical results. The basic problem is that if the temperature falls sufficiently that 4σT³ = F, a singularity occurs, in this case ~1.5 billion years ago. Instead, a negative feedback value of -2 W/m2/K is more compatible both with current temperatures and with the Faint Sun Paradox.

The evidence is that global surface temperatures have changed rather little over the Earth's history. It therefore seems likely that feedbacks were negative during the early lifetime of the Earth, avoiding runaway surface heating as the sun brightened. The continuous ~70% surface coverage of water on Earth has apparently stabilized global temperatures. A simple model of how this could work is described next, in analogy with the Daisy World model proposed by James Lovelock to justify Gaia theory [7].