After the alarm caused by Al Gore's film "An Inconvenient Truth" in 2006,
these are my findings about the drivers of Earth's global climate.
Along the way, I learned about the "Greenhouse Theory" and how water vapor,
methane and carbon dioxide delay outgoing heat radiation to space. I also learned about how atmospheric convection cools the Earth.
I learned about global climate as a statistical construct.
I learned about the Sun and its cycles, and about how they correlate to climate changes in the past.
I learned about continental drift and the ice core drillings,
and how climate has always been changing from the beginnings of our planet, some 4,500 million years ago,
and has continued to change since the origins of carbon-based life, some 3,800 million years ago.
I learned about the Milankovitch cycles and the celestial origin of the profound long-range climate oscillations.
I also learned about the "El Niño" and "La Niña" transferring heat across the Pacific Ocean,
and their multidecadal effects on the global climate.
I studied the global temperature anomaly records and what they seemed to show,
and about their contamination by the warm microclimates of cities, towns and airports, where most thermometers are located.
I studied the satellite-based global temperature records since 1979.
I have also learned about the United Nations Intergovernmental Panel on Climate Change,
and how this political-scientific organization went about preparing its alarming Assessment Reports.
I have read the theory on the effects of the Solar wind on the formation of clouds that
shade the Earth, reflecting sunlight back to space.
I have learned that carbon dioxide is the gas of life for carbon-based creatures on Earth.

I try to present this science on this long page.
I am thankful to the many scientists who present their work with clarity.
I am thankful to the many professional and amateur "climate auditors" who will not
let the scientific method be trampled by politics.

"Carbon restriction policies, to have any effect on climate,
would require that the most extreme projections of dangerous climate actually be correct,
and would require massive reductions in the use of energy to be universally adopted.
There is little question that such reductions would have negative impacts on income, development, the environment,
and food availability and cost - especially for the poor. This would clearly be immoral."

"By contrast, the reasonable and moral policy would be to foster economic growth,
poverty reduction and well being in order that societies be better able to deal with climate change regardless of its origin.
Mitigation policies appear to have the opposite effect without significantly reducing the hypothetical risk of any changes in climate.
While reducing vulnerability to climate change is a worthy goal, blind support for mitigation measures
- regardless of the invalidity of the claims - constitutes what might be called bankrupt morality."

"The climate system is particularly challenging since it is known that components in the system are inherently chaotic;
there are feedbacks that could potentially switch sign,
and there are central processes that affect the system in a complicated, non-linear manner.
These complex, chaotic, non-linear dynamics are an inherent aspect of the climate system."

"The climate system is a coupled non-linear chaotic system, and therefore the long-term prediction of future climate states is not possible.
Rather the focus must be upon the prediction of the probability distribution of the system's future possible states
by the generation of ensembles of model solutions.
Addressing adequately the statistical nature of climate is computationally intensive and requires the application of new methods of model diagnosis,
but such statistical information is essential."

The Earth's climate system is highly nonlinear: inputs and outputs are not proportional, change is often episodic and abrupt,
rather than slow and gradual, and multiple equilibria are the norm.
While this is widely accepted, there is a relatively poor understanding of the different types of nonlinearities,
how they manifest under various conditions, and whether they reflect a climate system driven by astronomical forcings,
by internal feedbacks, or by a combination of both.
In this paper, after a brief tutorial on the basics of climate nonlinearity,
we provide a number of illustrative examples and highlight key mechanisms that give rise to nonlinear behavior,
address scale and methodological issues,
suggest a robust alternative to prediction that is based on using integrated assessments within the framework of vulnerability studies and,
lastly, recommend a number of research priorities and the establishment of education programs in Earth Systems Science.
It is imperative that the Earth's climate system research community embraces this nonlinear paradigm
if we are to move forward in the assessment of the human influence on climate.

Climate is long-range weather: a description of the average or prevailing weather in each season over the years.
The climate varies widely among different regions on Earth. Also, in some regions it varies more or less widely with the seasons.

Predicting the weather for a particular region, even for a few days in advance, is one of the most complex problems in science.
One basis for these predictions is that daily weather changes slowly,
so tomorrow's weather will be similar to today's, and so on (though with less certainty the further ahead we go,
because the weather is mathematically chaotic).
Another is that weather tends to repeat seasonally in each region, changing over the years, sometimes abruptly.
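That mathematical chaos can be illustrated with the classic Lorenz-63 system, a standard toy model of atmospheric convection (not an actual weather model): two states that start almost identical end up completely different. A minimal sketch:

```python
import numpy as np

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz-63 system."""
    x, y, z = state
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return np.array([x + dt * dx, y + dt * dy, z + dt * dz])

# Two initial states differing by one part in a million.
a = np.array([1.0, 1.0, 1.0])
b = np.array([1.0, 1.0, 1.000001])

for _ in range(3000):  # integrate 30 model time units
    a = lorenz_step(a)
    b = lorenz_step(b)

# The microscopic initial difference has grown by many orders of magnitude.
print(np.linalg.norm(a - b))
```

The same effect is why weather forecasts lose skill after a week or so: any error in the measured initial state grows exponentially.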

The most obvious local cycle in the weather pattern is diurnal, basically controlled by the Sun.
The warmest time of the day is usually after noon, after receiving the highest energy influx.
The coldest is usually before dawn, after cooling all night.

The second most important cycle is yearly, also controlled by the amount of energy received from the Sun.
The warmest days occur in the summer and the coldest in the winter,
when one hemisphere is tilted towards the Sun while the other is tilted away.
The axis of the Earth is currently tilted 23.5° with respect to its orbital plane.
The northern hemisphere contains most of the land, the southern hemisphere contains most of the ocean.
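The seasonal cycle follows directly from that 23.5° tilt. A small sketch using Cooper's standard approximation for the solar declination, the latitude at which the noon Sun is directly overhead (the specific days of year chosen below are approximate):

```python
import math

TILT = 23.5  # Earth's current axial tilt, in degrees

def declination(day_of_year):
    """Approximate solar declination (degrees) via Cooper's formula."""
    return -TILT * math.cos(math.radians(360.0 / 365.0 * (day_of_year + 10)))

# Near the June solstice the Sun stands overhead at ~23.5°N;
# near the December solstice at ~23.5°S; near the equinoxes, at the equator.
for label, day in [("Mar equinox", 80), ("Jun solstice", 172),
                   ("Sep equinox", 266), ("Dec solstice", 355)]:
    print(f"{label}: {declination(day):+6.1f} deg")
```

With the overhead point swinging 47° in latitude over the year, each hemisphere alternately receives its energy influx at a steep or a shallow angle, which is the whole of the seasonal cycle.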

The Sun-Earth Connection:

Then there is the activity cycle of the Sun itself, about 11 years long, though not very constant in length or intensity.
Some of the effects of Solar activity on the Earth's atmosphere are only now beginning to be studied.
The reconstructions of ancient climates reveal a close correlation between Solar activity and temperatures on Earth.
The correlation between Solar activity plus oceanic heat transport and temperatures
is much closer than the correlation between the abundance of atmospheric carbon dioxide (CO2) and temperatures.

There is "a pronounced influence of solar activity on global climatic processes" related to "temperature,
precipitation and atmospheric and oceanic circulation".
"The temporal synchrony between the Maunder, Sporer, and Wolf minima and the expansion of Alpine glaciers
further points to a climate response to the deep solar minima".
From Solar Forcing of Multiple Climatic Parameters
(Sherwood, Keith and Craig Idso. CO2 Science. June 4, 2008)

"Scientists have only recently come to suspect that cosmic rays have an important influence on Earth's climate.
Cosmic rays are highly energetic charged particles that originate from various sources in outer space."
"Scientists have found a link between cosmic ray levels and thunderstorms.
There is also a positive correlation between cosmic ray flux (CRF) and low-altitude cloud formation."
"Ions created in the troposphere by cosmic rays could provide a mechanism for cloud formation."
"The influence of galactic cosmic ray modulation is strongest on low-level clouds."
"When the Sun is active, its magnetic field is stronger and as a result fewer global cosmic rays (GCR) arrive in the vicinity of Earth."
"The variations of the cosmic ray flux, as predicted from the galactic model and as observed from the iron meteorites,
are in sync with the occurrence of ice age epochs on Earth. The agreement is both in period and in phase."
"The inverse relationship between temperature and CRF is clear; when CRF rises, temperature falls, when CRF drops off, temperature climbs."
"The evidence of correlations between paleoclimate records and solar and cosmic ray activity indicators,
suggests that extraterrestrial phenomena are responsible for climatic variability on time scales ranging from days to millennia."
"The movement of the solar system in and out of the spiral arms of the Milky Way galaxy
is responsible for changes in the amount of cosmic rays impacting Earth's atmosphere."
"Cosmic Ray Flux variations explain more than two-thirds of the variance in the reconstructed temperature,
making CRF variability the dominant climate driver over geologic time scales."
"It has been known for some time that a 62±3 million-year cycle in fossil diversity has persisted over the past 542 million years."
"Recently, it has been proposed that the cycle is caused by modulation of CRF due to the solar system's vertical oscillation in the galaxy,
which has a period of around 64 million years."

For Solar Cycle 24, the first sunspot maximum occurred in November 2011 (Ri=96.7).
A second, higher, peak in sunspot number occurred in February 2014 (Ri=102.3).

Daily and monthly international sunspot number Ri: last 13 years and forecasts

Daily sunspot number (yellow), monthly mean sunspot number (blue),
smoothed monthly sunspot number (red) for the last 13 years and 12-month ahead predictions of the monthly smoothed sunspot number:
SC method (red dots):
prediction method based on an interpolation of Waldmeier's standard curves; it is based only on the sunspot number series.
CM method (red dashes):
(from K. Denkmayr and P. Cugnon) combining a regression technique applied to the sunspot number series
with the aa geomagnetic index used as a precursor (improved predictions during the minimum phase between solar cycles).
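The "smoothed monthly sunspot number" shown in red is traditionally computed with the classic 13-month smoothing: a centered running mean over 13 months with half weight on the first and last month, normalised by 12. A minimal sketch:

```python
def smooth_13_month(monthly):
    """Classic 13-month smoothed sunspot number: centered running mean
    over 13 monthly values, with half weight on the two end months,
    divided by 12 (the traditional Zurich smoothing)."""
    out = []
    for i in range(6, len(monthly) - 6):
        window = monthly[i - 6:i + 7]          # 13 monthly means
        s = 0.5 * window[0] + sum(window[1:12]) + 0.5 * window[12]
        out.append(s / 12.0)
    return out

# Sanity check: a constant series smooths to itself,
# and the first/last 6 months have no smoothed value.
print(smooth_13_month([50.0] * 20))
```

This smoothing is what defines the official dates and heights of solar-cycle maxima and minima, which is why a cycle's maximum can only be declared about half a year after the fact.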

The yearly averaged sunspot number for a period of 400 years (1610-2015)
The Maunder Minimum is shown during the second half of the 17th century
Solar Physics Group, NASA Marshall Space Flight Center [August 25, 2015]

"The Maunder Minimum:
Early records of sunspots indicate that the Sun went through a period of inactivity in the late 17th century.
Very few sunspots were seen on the Sun from about 1645 to 1715
(JPG image. August 25, 2015).
Although the observations were not as extensive as in later years,
the Sun was in fact well observed during this time and this lack of sunspots is well documented.
This period of solar inactivity also corresponds to a climatic period called the "Little Ice Age"
when rivers that are normally ice-free froze and snow fields remained year-round at lower altitudes.
There is evidence that the Sun has had similar periods of inactivity in the more distant past.
The connection between solar activity and terrestrial climate is an area of on-going research."

"The sunspot number (SSN) record (1610-present) is the primary time sequence of solar and solar-terrestrial physics,
with application to studies of the solar dynamo, space weather, and climate change.
Contrary to common perception, and despite its importance,
the international sunspot number (as well as the alternative widely-used group SSN) series is inhomogeneous and in need of calibration.
We trace the evolution of the sunspot record and show that significant discontinuities arose in ~1885
(resulting in a ~50% step in the group SSN) and again when Waldmeier took over from Brunner in 1945 (~20% step in Zürich SSN).
We follow Wolf and show how the daily range of geomagnetic activity can be used to maintain the sunspot calibration
and use this technique to obtain a revised, homogeneous, and single sunspot series from 1835-2011."

Svante Arrhenius (Physicist/Chemist, Sweden, 1859-1927) proposed in 1896 a theory to account for the Earth's ice ages;
he was the first scientist to speculate that changes in the levels of carbon dioxide in the atmosphere
could substantially alter the surface temperature of the Earth through a "greenhouse effect".

He suggested that the human emission of CO2 would be strong enough to prevent the world from entering a new ice age,
and that a warmer earth would be needed to feed the rapidly increasing population.
He was the first person to predict that emissions of carbon dioxide from the burning of fossil fuels and other combustion processes
would cause global warming.

In 1896 Arrhenius estimated that a halving of CO2 would decrease temperatures by 4-5°C
and a doubling of CO2 would cause a temperature rise of 5-6°C.
In his 1906 publication Arrhenius adjusted this value down to 1.6°C (including water vapor feedback: 2.1°C).

Recent estimates from IPCC (2007) say this value (the Climate Sensitivity) is likely to be between 2 and 4.5°C.
But Sherwood Idso in 1998 calculated the Climate Sensitivity to be 0.4°C, and more recently Richard Lindzen at 0.5°C.
Roy Spencer calculated 1.3°C in 2011.
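Because the radiative forcing of CO2 is roughly logarithmic in concentration, the equilibrium warming implied by any of these sensitivity estimates scales with log2 of the concentration ratio; a doubling, whatever the starting level, produces the same warming. A back-of-envelope sketch (the loop uses the estimates quoted above):

```python
import math

def warming(c_new_ppm, c_ref_ppm, sensitivity_per_doubling):
    """Equilibrium warming (deg C) for a CO2 rise, assuming the standard
    logarithmic forcing, so warming scales with log2(C/C0)."""
    return sensitivity_per_doubling * math.log2(c_new_ppm / c_ref_ppm)

# Warming for a pre-industrial doubling, 280 -> 560 ppm,
# under several of the sensitivity estimates quoted above:
for name, s in [("Idso 1998", 0.4), ("Lindzen", 0.5),
                ("Spencer 2011", 1.3), ("IPCC AR4 mid-range", 3.0)]:
    print(f"{name}: {warming(560, 280, s):.2f} deg C")
```

For a full doubling, log2(C/C0) = 1, so the warming equals the sensitivity itself; the entire disagreement between the estimates above is over that single multiplier.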

"The average value of the best estimate of the equilibrium climate sensitivity across all the new studies is about 2.0°C.
The average climate sensitivity of the climate models used by the IPCC to project future climate changes (and their impacts)
is about 3.4°C - some 70 percent higher than the recent studies indicate."

"Nic Lewis and Judith Curry just published a blockbuster paper that pegs the Earth's equilibrium climate sensitivity
- how much the Earth's average surface temperature is expected to rise in association with
a doubling of the atmosphere's carbon dioxide concentration - at 1.64°C (1.05°C to 4.05°C, 90% range),
a value that is nearly half of the number underpinning all of President Obama's executive actions under his Climate Action Plan."

According to the AR4 report, the "likely equilibrium range of sensitivity" was 2.0 to 4.5°C per CO2 doubling.
According to the newer AR5 report, it is 1.5 to 4.5°C, i.e., the likely equilibrium sensitivity is now known less accurately.
But they write: "This assessment reflects improved understanding".
How ridiculous can you be?

I think the real reason why there is no improvement in the understanding of climate sensitivity is the following.
If you have a theory which is correct, then as progressively more data comes in, the agreement becomes better.
Sure, occasionally some tweaks have to be made, but overall there is an improved agreement.
However, if the basic premises of a theory are wrong, then there is no improved agreement as more data is collected.
In fact, it is usually the opposite that takes place, the disagreement increases.
In other words, the above behavior reflects the fact that the IPCC and the like are captives of a wrong conception.

New Report: Climate Less Sensitive To CO2 Than Models Suggest [March 5, 2014]

Oversensitive: How The IPCC Hid The Good News On Global Warming

A new report published today by the Global Warming Policy Foundation
shows that the best observational evidence indicates our climate is considerably less sensitive to greenhouse gases than climate models are estimating.

The clues for this and the relevant scientific papers are all referred to in the recently published
Fifth Assessment report (AR5) of the Intergovernmental Panel on Climate Change (IPCC).
However, this important conclusion was not drawn in the full IPCC report - it is only mentioned as a possibility -
and is ignored in the IPCC's Summary for Policymakers (SPM).

For over thirty years climate scientists have presented a range for climate sensitivity (ECS) that has hardly changed.
It was 1.5-4.5°C in 1979 and this range is still the same today in AR5.
The new report suggests that the inclusion of recent evidence, reflected in AR5,
justifies a lower observationally-based temperature range of 1.25-3.0°C, with a best estimate of 1.75°C, for a doubling of CO2.
By contrast, the climate models used for projections in AR5 indicate a range of 2-4.5°C, with an average of 3.2°C.

This is one of the key findings of the new report Oversensitive: how the IPCC hid the good news on global warming,
written by independent UK climate scientist Nic Lewis and Dutch science writer Marcel Crok.
Lewis and Crok were both expert reviewers of the IPCC report, and Lewis was an author of two relevant papers cited in it.

In recent years it has become possible to make good empirical estimates of climate sensitivity
from observational data such as temperature and ocean heat records.
These estimates, published in leading scientific journals,
point to climate sensitivity per doubling of CO2 most likely being under 2°C for long-term warming,
with a best estimate of only 1.3-1.4°C for warming over a seventy year period.

"The observational evidence strongly suggest that climate models display too much sensitivity to carbon dioxide concentrations
and in almost all cases exaggerate the likely path of global warming", says Nic Lewis.

These lower, observationally-based estimates for both long-term climate sensitivity and the seventy-year response suggest that
considerably less global warming and sea level rise is to be expected in the 21st century than most climate model projections currently imply.

"We estimate that on the IPCC's second highest emissions scenario warming would still be around the international target of 2°C in 2081-2100", Lewis says.

"There appears to be a widespread belief that the comparatively high temperature produced within a closed space covered with glass,
and exposed to solar radiation, results from a transformation of wave-length, that is,
that the heat waves from the sun, which are able to penetrate the glass, fall upon the walls of the enclosure and raise its temperature:
the heat energy is re-emitted by the walls in the form of much longer waves,
which are unable to penetrate the glass, the greenhouse acting as a radiation trap."

"I have always felt some doubt as to whether this action played any very large part in the elevation of temperature.
It appeared much more probable that the part played by the glass
was the prevention of the escape of the warm air heated by the ground within the enclosure.
If we open the doors of a greenhouse on a cold and windy day,
the trapping of radiation appears to lose much of its efficacy."

"As a matter of fact I am of the opinion that a greenhouse made of a glass transparent to waves of every possible length
would show a temperature nearly, if not quite, as high as that observed in a glass house.
The transparent screen allows the solar radiation to warm the ground,
and the ground in turn warms the air, but only the limited amount within the enclosure.
In the 'open', the ground is continually brought into contact with cold air by convection currents."

"To test the matter I constructed two enclosures of dead black cardboard,
one covered with a glass plate, the other with a plate of rock-salt of equal thickness.
The bulb of a thermometer was inserted in each enclosure and the whole packed in cotton,
with the exception of the transparent plates which were exposed."

"When exposed to sunlight the temperature rose gradually to 65°C,
the enclosure covered with the salt plate keeping a little ahead of the other,
owing to the fact that it transmitted the longer waves from the sun,
which were stopped by the glass.
In order to eliminate this action the sunlight was first passed through a glass plate.
There was now scarcely a difference of one degree between the temperatures of the two enclosures.
The maximum temperature reached was about 55°C.
From what we know about the distribution of energy in the spectrum of the radiation emitted by a body at 55°C,
it is clear that the rock-salt plate is capable of transmitting practically all of it, while the glass plate stops it entirely.
This shows us that the loss of temperature of the ground by radiation is very small in comparison to the loss by convection,
in other words that we gain very little from the circumstance that the radiation is trapped."

"Is it therefore necessary to pay attention to trapped radiation in deducing the temperature of a planet as affected by its atmosphere?
The solar rays penetrate the atmosphere, warm the ground which in turn warms the atmosphere by contact and by convection currents.
The heat received is thus stored up in the atmosphere, remaining there on account of the very low radiating power of a gas.
It seems to me very doubtful if the atmosphere is warmed to any great extent by absorbing the radiation from the ground,
even under the most favourable conditions."

"I do not pretend to have gone very deeply into the matter,
and publish this note merely to draw attention to the fact that
trapped radiation appears to play but a very small part in the actual cases with which we are familiar."

Robert Williams Wood (1868-1955) was an American physicist and inventor.
See Robert W. Wood (Wikipedia)

Does a Greenhouse Operate through the Greenhouse Effect?

"One of the oft-cited objections to the term 'greenhouse effect' is that it is a misnomer,
that a real greenhouse (you know, the kind you grow plants in) doesn't work by inhibiting infrared energy loss.
It is usually claimed that a real greenhouse works by inhibiting convective heat loss by trapping the sun-heated air inside."

"The convective heat loss by the greenhouse roof (200 W/m2, inferred as a residual)
is only 8 W/m2 less than if the greenhouse was not there (208 W/m2).
In contrast, the extra IR energy "input" (actually, reduced IR "loss") is twelve times as large (100 W/m2)
as the reduction in the convective loss (8 W/m2)."

"Of course, changing any of the assumed numbers will change the result.
But, assuming I haven't made a fundamental mistake,
I think you would find that the 'greenhouse effect' will consistently be larger than the convective inhibition effect."

"So, maybe the greenhouse effect really does work like a real greenhouse."

"Over the course of the past 2 decades,
I have analyzed a number of natural phenomena that reveal how Earth's near-surface air temperature responds
to surface radiative perturbations.
These studies all suggest that a 300 to 600 ppm doubling of the atmosphere's CO2 concentration
could raise the planet's mean surface air temperature by only about 0.4°C.
Even this modicum of warming may never be realized, however, for it could be negated by a number of planetary cooling forces
that are intensified by warmer temperatures and by the strengthening of biological processes that are enhanced
by the same rise in atmospheric CO2 concentration that drives the warming.
Several of these cooling forces have individually been estimated to be of equivalent magnitude, but of opposite sign,
to the typically predicted greenhouse effect of a doubling of the air's CO2 content,
which suggests to me that little net temperature change will ultimately result from the ongoing buildup of CO2 in Earth's atmosphere.
Consequently, I am skeptical of the predictions of significant CO2-induced global warming
that are being made by state-of-the-art climate models
and believe that much more work on a wide variety of research fronts will be required to properly resolve the issue."

"The climate of the Earth is profoundly affected by two competing processes:
the greenhouse effect, which acts to warm the lower atmosphere and cool the upper atmosphere,
and atmospheric convection (thermals, clouds, precipitation) which does just the opposite:
cools the lower atmosphere and warms the upper atmosphere."

"There would be no weather on Earth without the greenhouse effect."
"Since it is the convective overturning of the atmosphere that causes most of what we recognize as 'weather',
most weather activity on Earth would stop, too.
Atmospheric convective overturning is what causes clouds and rainfall.
In the tropics, it occurs in relatively small and strongly overturning thunderstorm-type weather systems.
At higher latitudes,
that convection occurs in much larger but more weakly overturning cloud and precipitation systems associated with low pressure areas."

"Why would this occur? Infrared absorbers like water vapor and carbon dioxide provide an additional heating mechanism for the atmosphere.
But at least as important is the fact that, since infrared absorbers are also infrared emitters,
the presence of greenhouse gases allows the atmosphere - not just the surface - to cool to outer space."

"As Dick Lindzen alluded to back in 1990, while everyone seems to understand that the greenhouse effect warms the Earth's surface,
few people are aware of the fact that weather processes greatly limit that warming.
And one very real possibility is
that the 1 deg. C direct warming effect of doubling our atmospheric CO2 concentration by late in this century
will be mitigated by the cooling effects of weather to a value closer to 0.5 deg. C or so (about 1 deg. F).
This is much less than is being predicted by the UN's Intergovernmental Panel on Climate Change or by NASA's James Hansen,
who believe that weather changes will amplify, rather than reduce, that warming."

The early climate of Venus is thought to have been controlled by a "runaway" atmospheric greenhouse effect that evaporated its oceans.
CO2 is now near 96.5% in its atmosphere (3.5% is nitrogen) and the surface of Venus receives little direct visible sunlight.
The Venusian atmosphere is full of dense, high clouds,
30 to 40 km thick, with bases at 30 to 35 km of altitude.
Venusian climate is determined by its distance to the Sun (0.72 A.U.), its higher albedo and its atmospheric density.

Earth's atmosphere is not totally cloud-covered, as Venus's is (albedo of 0.76);
globally, about 40% of the sky is always clear on Earth (albedo of 0.37).
Venus has an extremely high atmospheric pressure, 90 times greater than Earth's,
and the mean surface temperature on Venus is 465°C, versus 15°C on Earth.
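These figures can be checked against the standard radiative-equilibrium (effective) temperature, T_eff = [S(1-A)/4σ]^(1/4), using the albedos quoted above; a back-of-envelope sketch, not a climate model:

```python
SIGMA = 5.67e-8    # Stefan-Boltzmann constant, W m^-2 K^-4
S_EARTH = 1361.0   # solar constant at 1 AU, W/m^2

def t_eff(solar_const, albedo):
    """Effective (radiative-equilibrium) temperature of a planet, in K."""
    return (solar_const * (1.0 - albedo) / (4.0 * SIGMA)) ** 0.25

t_earth = t_eff(S_EARTH, 0.37)            # Earth, albedo from the text
t_venus = t_eff(S_EARTH / 0.72**2, 0.76)  # Venus at 0.72 AU, albedo 0.76

print(round(t_earth), "K")  # ~248 K, well below the ~288 K (15 C) observed
print(round(t_venus), "K")  # ~230 K, vastly below the ~738 K (465 C) observed
```

The striking result is that Venus's high albedo makes its effective temperature lower than Earth's despite its closer orbit; the enormous gap between ~230 K and the observed 738 K surface is the work of its dense CO2 atmosphere.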

Extraterrestrial Climate Influences:

What are cosmic rays?

When we look at the sky, we see bright objects: the Sun of course, planets, stars, nebulae ... All this is light, electromagnetic waves.
With specialised telescopes, we can also detect electromagnetic waves that are invisible to the human eye,
such as infrared or ultraviolet emission, radio waves, X-rays.

Since the early 20th century we have known that the Earth is not only hit by such waves, but also bombarded by charged energetic particles:
protons, ions and electrons that come in at nearly the speed of light.
These particles are called cosmic rays, and they tell us a story about the Universe that we would not learn from light alone.

Cosmic rays provide a tool to explore the Universe, but they also directly affect the Earth.
We want to observe these particles to understand their origin, to use them as a tracer of solar disturbances,
and to monitor their effects on technology and human beings.

Where do they come from?

Cosmic rays come from places in the Universe where some kind of explosion occurs: the remnants of stellar explosions (supernovae),
active galaxies, and also from the Sun.

Galactic cosmic rays come in permanently, although their intensity is modulated by the Sun.
Particles accelerated at the Sun, solar cosmic rays, are more sporadic.
They come as individual events, on top of the usual particle flux from the remote Universe.

How can we observe them?

Cosmic rays do not directly hit the ground, but collide with the atoms of the high atmosphere.
That creates lots of secondary particles: protons, neutrons, muons and electrons.
Provided the primary particle has a minimum speed of about 200,000 km/s, two thirds of the speed of light,
a significant number of secondary nucleons,
muons and other particles can be detected by ground-based particle counters near the magnetic poles.

The Earth's magnetic field is another filter, although it plays no role at the magnetic poles.
But the closer one comes to the equator, the faster the primary charged particle must be to get across the magnetic field.
Particle counters at different places on the Earth therefore measure cosmic rays of different minimum speeds
- they reveal the energy spectrum of cosmic rays.
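This latitude dependence is usually expressed as a geomagnetic cutoff rigidity. A sketch using the Störmer dipole approximation for the vertical cutoff, Rc ≈ 14.9 cos⁴(magnetic latitude) GV, converted to the minimum proton kinetic energy (the 14.9 GV coefficient is the commonly quoted dipole value; real cutoffs vary with the actual field):

```python
import math

M_PROTON = 0.938  # proton rest energy, GeV

def vertical_cutoff_rigidity(mag_lat_deg):
    """Stormer vertical cutoff rigidity (GV), dipole approximation:
    Rc ~ 14.9 * cos^4(magnetic latitude). Zero at the poles,
    maximum at the magnetic equator."""
    return 14.9 * math.cos(math.radians(mag_lat_deg)) ** 4

def proton_kinetic_energy(rigidity_gv):
    """Minimum proton kinetic energy (GeV) at rigidity R (Z=1, so pc = R)."""
    return math.sqrt(rigidity_gv**2 + M_PROTON**2) - M_PROTON

for lat in (0, 30, 60, 90):
    rc = vertical_cutoff_rigidity(lat)
    ek = proton_kinetic_energy(rc)
    print(f"magnetic latitude {lat:2d} deg: cutoff {rc:5.2f} GV "
          f"-> protons above {ek:5.2f} GeV")
```

Near the equator only protons above roughly 14 GeV get through vertically, while at the magnetic poles the field imposes no cutoff at all, which is why neutron monitors at different latitudes sample different parts of the cosmic-ray energy spectrum.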

What is a neutron monitor?

In order to increase the number of particles that are eventually detected, the counters of neutron monitors are surrounded by lead.
There the secondary nucleons, and a few muons, produce further neutrons.
The neutron monitor counts these neutrons; their count rate ultimately reveals the cosmic ray flux at the top of the atmosphere.

Neutron monitors have been used since the 1950s.
They are still the state-of-the-art instrumentation for measuring cosmic rays from the Sun
and the low-energy component of cosmic rays from elsewhere in the Universe.

Why do cosmic rays matter?

Cosmic rays are a formidable source of information about the violent Universe.
We want to know under which circumstances and how charged particles are accelerated to such high energies or speeds.

Cosmic rays can be used to monitor perturbations of the interplanetary medium that might hit the Earth.
Many years of observation have shown that the galactic cosmic ray intensity is modulated by the magnetic field of the Heliosphere:
when the Sun has many spots, the magnetic field is strong in the Heliosphere,
and the intensity of galactic cosmic rays is reduced at the Earth.
When there are no spots, the shielding is weak, and many cosmic rays reach the Earth.
Faster intensity variations can be generated by solar eruptions, where magnetic fields are expelled into the Heliosphere.

Furthermore, cosmic rays have some impact on the Earth.
They affect the Earth's atmosphere:
by the secondary particles they produce when colliding with atmospheric atoms, and by the ionisation of atmospheric atoms.
Fast charged particles are a source of irradiation, as are X-rays.
While there seems to be little effect on the ground, civil aircraft crew are less protected by the atmosphere and have to be monitored.
Neutron monitor measurements provide the basic data.

"Scientists have only recently come to suspect that cosmic rays have an important influence on Earth's climate.
Cosmic rays are highly energetic charged particles that originate from various sources in outer space."
"Scientists have found a link between cosmic ray levels and thunderstorms.
There is also a positive correlation between cosmic ray flux (CRF) and low-altitude cloud formation."
"Ions created in the troposphere by cosmic rays could provide a mechanism for cloud formation."
"The influence of galactic cosmic ray modulation is strongest on low-level clouds."
"When the Sun is active, its magnetic field is stronger and as a result fewer global cosmic rays (GCR) arrive in the vicinity of Earth."
"The variations of the cosmic ray flux, as predicted from the galactic model and as observed from the iron meteorites,
are in sync with the occurrence of ice age epochs on Earth. The agreement is both in period and in phase."
"The inverse relationship between temperature and CRF is clear; when CRF rises, temperature falls, when CRF drops off, temperature climbs."
"The evidence of correlations between paleoclimate records and solar and cosmic ray activity indicators,
suggests that extraterrestrial phenomena are responsible for climatic variability on time scales ranging from days to millennia."
"The movement of the solar system in and out of the spiral arms of the Milky Way galaxy
is responsible for changes in the amount of cosmic rays impacting Earth's atmosphere."
"Cosmic Ray Flux variations explain more than two-thirds of the variance in the reconstructed temperature,
making CRF variability the dominant climate driver over geologic time scales."
"It has been known for some time that a 62±3 million-year cycle in fossil diversity has persisted over the past 542 million years."
"Recently, it has been proposed that the cycle is caused by modulation of CRF due to the solar system's vertical oscillation in the galaxy,
which has a period of around 64 million years."

"Decadal - Cosmic ray muons regulated by the Solar cycle. This accounts for temperature variability in sync with the 11 year sunspot cycle."
"Hundreds to thousands of years - Solar regulation of cosmic rays plus changes in Solar irradiance.
This variability includes historical climate change as witnessed in the Little Ice Age and Medieval Warm Period."
"Tens to hundreds of thousands of years - The Croll-Milankovitch cycles that combine Earth's axial and orbital variations.
This variability drives the glacial-interglacial cycles during ice ages."
"Millions to hundreds of millions of years - The solar system's transit of the galactic spiral arms,
causing variation in overall cosmic ray intensity.
This variability regulates the cycles of ice ages and hot-house periods."

Do clouds disappear when cosmic rays get weaker?
Calder's Updates, May 3, 2010

"The Sun makes fantastic natural experiments" Henrik Svensmark says,
"that allow us to test our ideas about its effects on the Earth's climate".
Most dramatic are the events called Forbush decreases.
Ejections of gas from the Sun, carrying magnetic fields,
can suddenly cut the influx of cosmic rays coming to the Earth from exploded stars.

According to the Svensmark hypothesis, cosmic rays seed the formation of low clouds,
so there should be a reduction in the Earth's low cloud cover in the aftermath of a Forbush decrease.

With the right tracking skills, the Copenhagen team confirmed all their expectations about the Forbush decreases.

Combined data for the five strongest Forbush decreases since 1998 show a loss of fine aerosols from the atmosphere,
especially about 5 days after the cosmic ray minimum (red curve).
Within a few days after that, three different sets of data from satellites revealed the loss of low, wet clouds,
with clouds over the oceans holding about 7% less liquid water than they did before the events.
Dates of the five Forbush minima, ranked in order of the downturn in ionization of the lower air,
compared with the overall variation in the course of a solar cycle,
were 31/10/2003 (119%), 19/1/2005 (83%), 13/9/2005 (75%), 16/7/2000 (70%) and 12/4/2001 (64%).

The first of the graphics shows a temporary shortage of fine aerosols,
chemical specks in the air that normally grow until water vapour can condense on them,
so seeding the liquid water droplets of low-level clouds.
The remaining three graphs display the observable loss of the clouds that would have been seeded
if the aerosols had survived to do their job. Three different kinds of satellite observations tell the same story.

Cosmic rays continuously promote the formation of micro-clusters of sulphuric acid and water molecules,
but initially these are far too small to be detectable by remote observation.
After growing routinely over a number of days
the invisible specks floating in the air influence the normal colour of sunlight as seen from the ground,
by scattering away its violet light.
Conversely, a shortage of fine aerosols after a shortage of cosmic rays
should make the Sun appear abnormally bright at the violet end of the spectrum.

As the graphs above show,
all of these observational data sets showed much the same pattern of events after the strongest Forbush decreases since 1998,
namely a decrease in liquid water clouds that reached its lowest point six to nine days after the minimum count of cosmic rays.

As for the magnitude of the impact on cloud cover, it was huge.
A 7% decrease in cloud water seen by SSM/I translates into 3 billion tonnes of liquid water vanishing from the sky.
The water remains there in vapour form, but unlike cloud droplets it does not block sunlight trying to warm the ocean.
After the same five Forbush decreases, the extent of liquid-water clouds measured by MODIS fell on average by 4%,
while ISCCP showed 5% less cloud below 3200 metres over the ocean.

From solar activity to cosmic ray ionization to aerosols and liquid-water clouds, a causal chain appears to operate on a global scale.

Although they are too short-lived to have a lasting effect on the climate,
the Forbush decreases dramatize the cosmic climate mechanism that works more patiently during the 11-year solar cycle.
When the Sun becomes more active, the decline in low-altitude cosmic radiation is greater than that seen in most Forbush events,
and the loss of low cloud cover persists for long enough to warm the world.
That explains the alternations of warming and cooling seen in the lower atmosphere and in the oceans during solar cycles.
And the overall increase in solar activity during the 20th Century
implies a loss of low clouds sufficient to explain most of the "global warming".

It has been proposed that galactic cosmic rays may influence the Earth's climate by affecting cloud formation.
If changes in cloudiness play a part in climate change, their effect changes sign in Antarctica.
Satellite data from the Earth Radiation Budget Experiment (ERBE)
are here used to calculate the changes in surface temperatures at all latitudes,
due to small percentage changes in cloudiness.
The results match the observed contrasts in temperature changes, globally and in Antarctica.
Evidently clouds do not just respond passively to climate changes but take an active part in the forcing,
in accordance with changes in the solar magnetic field that vary the cosmic-ray flux.

Cloud tops have a high albedo and exert their cooling effect by scattering back into the cosmos much of the sunlight
that could otherwise warm the surface.
But the snows on the Antarctic ice sheets are dazzlingly white, with a higher albedo than the cloud tops.
There, extra cloud cover warms the surface, and less cloudiness cools it.
Satellite measurements show the warming effect of clouds on Antarctica,
and meteorologists at far southern latitudes confirm it by observation.
Greenland too has an ice sheet, but it is smaller and not so white.
And while conditions in Greenland are coupled to the general climate of the northern hemisphere,
Antarctica is largely isolated by vortices in the ocean and the air.
The cosmic-ray and cloud-forcing hypothesis therefore predicts
that temperature changes in Antarctica should be opposite in sign to changes in temperature in the rest of the world.
This is exactly what is observed, in a well-known phenomenon that some geophysicists have called the polar see-saw,
but for which "the Antarctic climate anomaly" seems a better name.

Today the Royal Astronomical Society in London publishes (online) Henrik Svensmark's latest paper entitled
"Evidence of nearby supernovae affecting life on Earth".
After years of effort Svensmark shows how the variable frequency of stellar explosions not far from our planet
has ruled over the changing fortunes of living things throughout the past half billion years.
Appearing in Monthly Notices of the Royal Astronomical Society,
it is a giant of a paper, with 22 figures, 30 equations and about 15,000 words.
See the RAS press release at
Did exploding stars help life on Earth to thrive?
(Professor Henrik Svensmark, Royal Astronomical Society, April 24 2012)

For Svensmark, the changes driven by the stars govern the amount of carbon dioxide in the air.
Climate and life control CO2, not the other way around.

Observations of open star clusters in the Solar neighbourhood are used to calculate local supernova (SN) rates for the past 510 Myr.
Peaks in the SN rates match passages of the Sun through periods of locally increased cluster formation which could be caused
by spiral arms of the Galaxy.
A statistical analysis indicates that the Solar system has experienced many large short-term increases in the flux
of Galactic cosmic rays (GCR) from nearby SNe.
The hypothesis that a high GCR flux should coincide with cold conditions on the Earth
is borne out by comparing the general geological record of climate over the past 510 Myr with the fluctuating local SN rates.
Surprisingly, a simple combination of tectonics (long-term changes in sea level) and astrophysical activity (SN rates)
largely accounts for the observed variations in marine biodiversity over the past 510 Myr.
An inverse correspondence between SN rates and carbon dioxide (CO2) levels is discussed in terms of
a possible drawdown of CO2 by enhanced bio-productivity in oceans that are better fertilized in cold conditions
- a hypothesis that is not contradicted by data on the relative abundance of the heavy isotope of carbon, 13C.

Researchers at the Technical University of Denmark (DTU)
are hard on the trail of a previously unknown molecular process that helps commonplace clouds to form.
Tests in a large and highly instrumented reaction chamber in Lyngby, called SKY2,
demonstrate that an existing chemical theory is misleading.

Back in 1996 Danish physicists suggested that cosmic rays, energetic particles from space, are important in the formation of clouds.
Since then, experiments in Copenhagen and elsewhere have demonstrated that cosmic rays actually help small clusters of molecules to form.
But the cosmic-ray/cloud hypothesis seemed to run into a problem
when numerical simulations of the prevailing chemical theory pointed to a failure of growth.

Fortunately the chemical theory could also be tested experimentally, as was done with SKY2,
the chamber of which holds 8 cubic metres of air and traces of other gases.
One series of experiments confirmed the unfavourable prediction that the new clusters would fail to grow sufficiently
to be influential for clouds.
But another series of experiments, using ionizing rays, gave a very different result, as can be seen in the accompanying figure.

The reactions going on in the air over our heads mostly involve commonplace molecules.
During daylight hours, ultraviolet rays from the Sun encourage sulphur dioxide to react with ozone and water vapour to make sulphuric acid.
The clusters of interest for cloud formation consist mainly of sulphuric acid and water molecules
clumped together in very large numbers and they grow with the aid of other molecules.

Atmospheric chemists have assumed that when the clusters have gathered up the day's yield, they stop growing,
and only a small fraction can become large enough to be meteorologically relevant.
Yet in the SKY2 experiment, with natural cosmic rays and gamma-rays keeping the air in the chamber ionized, no such interruption occurs.
This result suggests that another chemical process is supplying the extra molecules needed to keep the clusters growing.

"The result boosts our theory that cosmic rays coming from the Galaxy are directly involved in the Earth's weather and climate",
says Henrik Svensmark, lead author of the new report.
"In experiments over many years, we have shown that ionizing rays help to form small molecular clusters.
Critics have argued that the clusters cannot grow large enough to affect cloud formation significantly.
But our current research, of which the reported SKY2 experiment forms just one part, contradicts their conventional view.
Now we want to close in on the details of the unexpected chemistry occurring in the air,
at the end of the long journey that brought the cosmic rays here from exploded stars."

Simulating what could happen in the atmosphere,
the DTU's SKY2 experiment shows molecular clusters (red dots)
failing to grow enough to provide significant numbers of "cloud condensation nuclei" (CCN) of more than 50 nanometres in diameter.
This is what existing theories predict.
But when the air in the chamber is exposed to ionizing rays that simulate the effect of cosmic rays,
the clusters (blue dots) grow much more vigorously to the sizes suitable for helping water droplets to form and make clouds.
(A nanometre is a millionth of a millimetre)

Prof. Svensmark and his team are in the Center for Sun-Climate Research at the Danish National Space Institute, DTU Space.
His co-authors are Martin B. Enghoff and Jens Olaf Pepke Pedersen.
In their paper they acknowledge important theoretical contributions to this line of research,
notably from Nicolai Bork of the University of Helsinki.

"It's not even 11 years", says Guhathakurta.
"The cycle ranges in length from 9 to 12 years.
Some cycles are intense, with many sunspots and solar flares; others are mild, with relatively little solar activity.
In the 17th century, during a period called the 'Maunder Minimum',
the cycle appeared to stop altogether for about 70 years and no one knows why."

There is no need to go so far back in time, however, to find an example of the cycle's unpredictability.
Right now the sun is climbing out of a century-class solar minimum that almost no one anticipated.

"The depth of the solar minimum in 2008-2009 really took us by surprise",
says sunspot expert David Hathaway of the Marshall Space Flight Center in Huntsville, Alabama.
"It highlights how far we still have to go to successfully forecast solar activity."

Astronomers were once so convinced of the Sun's constancy that they called the irradiance of the sun "the solar constant",
and they set out to measure it as they would any constant of Nature.
By definition, the solar constant is
the amount of solar energy deposited at the top of Earth's atmosphere in units of watts per meter-squared.
All wavelengths of radiation are included - radio, infrared, visible light, ultraviolet, x-rays and so on.
The approximate value of the solar constant is 1,361 W/m2.

"The 'Solar constant' is an oxymoron", says Judith Lean of the Naval Research Lab.
"Satellite data show that the Sun's total irradiance rises and falls with the sunspot cycle by a significant amount."

At solar maximum, the sun is about 0.1% brighter than it is at solar minimum.
That may not sound like much, but consider the following: A 0.1% change in 1,361 W/m2 equals 1.4 Watts/m2.
Averaging this number over the spherical Earth and correcting for Earth's reflectivity
yields 0.24 Watts for every square meter of our planet.
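The back-of-envelope conversion above can be checked directly. A minimal sketch; the 0.30 albedo is an assumed typical value, since the text only mentions "correcting for Earth's reflectivity":

```python
# Solar-cycle forcing arithmetic, as described in the passage above.
TSI = 1361.0          # solar constant, W/m^2
delta = 0.001 * TSI   # 0.1% solar-cycle swing -> ~1.4 W/m^2

# A sphere intercepts sunlight over pi*R^2 but radiates over 4*pi*R^2,
# so divide by 4; then remove the ~30% reflected straight back to space
# (albedo 0.30 is an assumed representative value, not from the text).
albedo = 0.30
forcing = delta / 4.0 * (1.0 - albedo)

print(round(delta, 2))    # ~1.36 W/m^2 at the top of the atmosphere
print(round(forcing, 2))  # ~0.24 W/m^2 averaged over the planet
```

With these inputs the result matches the quoted 0.24 watts per square meter.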

"Add it all up and you get a lot of energy", says Lean.
"How this might affect weather and climate is a matter of - at times passionate - debate."

"The Earth's weather and climate regime is determined by the total solar irradiance (TSI) and its interactions with the Earth's atmosphere,
oceans and landmasses.
Evidence from both 33 years of direct satellite monitoring and historical proxy data leaves no doubt that solar luminosity in general,
and TSI in particular, are intrinsically variable phenomena.
Subtle variations of TSI resulting from periodic changes in the Earth's orbit (Milankovitch cycles: ~20, 40 and 100 Kyrs)
cause climate change ranging from major ice ages to the present inter-glacial,
clearly demonstrating the dominance of TSI in climate change on long timescales.
TSI monitoring, cosmogenic isotope analyses and correlative climate data indicate that
variations of the TSI have been a significant climate forcing during the current inter-glacial period (the last ~10 Kyrs).
Phenomenological analyses of satellite TSI monitoring results,
TSI proxies during the past 400 years and the records of surface temperature
show that TSI variation has been the dominant forcing for climate change during the industrial era.
The periodic character of the TSI record indicates that
solar forcing of climate change will likely be the dominant variable contributor to climate change in the future."

"Monitoring TSI variability is clearly an important component of climate change research,
particularly in the context of understanding the relative forcings of natural and anthropogenic processes.
The requirements for a long-term, climate TSI database
can be inferred from a recent National Research Council study
which concluded that gradual variations in solar luminosity of as little as 0.25%
were the likely forcing for the 'little ice age' that persisted in varying degree from the late 14th to the mid 19th centuries.
A centuries-long TSI database will have to be calibrated by either precision or accuracy to a small fraction of this value
to be of any use in assessing the magnitude of solar forcing."

Habibullo Abdussamatov, Dr. Sc.
Head of Space Research Laboratory of the Pulkovo Observatory,
Head of the Russian/Ukrainian Joint Project Astrometria

In their regular reports, United Nations experts publish data said to show that the Earth is approaching a catastrophic global warming,
caused by increasing emissions of carbon dioxide to the atmosphere.
However, observations of the Sun show that carbon dioxide is "not guilty" of the increase in temperature,
and that what lies ahead in the upcoming decades is not catastrophic warming,
but a global, and very prolonged, temperature drop.

Life on earth completely depends on Solar radiation, the ultimate source of energy for natural processes.
For a long time it was thought that the luminosity of the Sun never changes,
and for this reason the quantity of Solar energy received per second over one square meter above the atmosphere
at the distance of the Earth from the Sun (149,597,892 km), was named the Solar Constant.

Until 1978, precise measurements of the value of the Total Solar Irradiance (TSI) were not available.
But according to indirect data, namely the established major climate variations of the Earth in recent millennia,
one must doubt the invariance of its value.

In the middle of the nineteenth century,
German and Swiss astronomers Heinrich Schwabe and Rudolf Wolf
established that the number of spots on the surface of the Sun periodically changes,
diminishing from a maximum to a minimum, and then growing again, over a time frame on the order of 11 years.
Wolf introduced an index ("W") of the relative number of sunspots,
computed as ten times the number of sunspot groups plus the total number of spots in all groups.
This number has been regularly measured since 1849.
Drawing on the work of professional astronomers and the observations of amateurs (which are of uncertain reliability)
Wolf worked out a reconstruction of monthly values from 1749 as well as annual values from 1700.
Today, the reconstruction of this time series stretches back to 1611.
It has an eleven-year cycle of recurrence as well as other cycles related to the onset and development of individual sunspot groups:
changes in the fraction of the solar surface occupied by faculae,
the frequency of prominences, and other phenomena in the solar chromosphere and corona.
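Wolf's index, as described above, is simple enough to state in code. A minimal sketch; the observer scaling factor k is part of the standard modern definition but is not mentioned in the text, so it is included here as an assumption and set to 1:

```python
def wolf_number(groups, spots, k=1.0):
    """Relative sunspot number W = k * (10*g + s).

    g: number of sunspot groups, s: total spots in all groups.
    k: observer/instrument scaling factor (assumed 1.0 here;
    not part of the definition quoted in the text).
    """
    return k * (10 * groups + spots)

# Example: 3 sunspot groups containing 11 individual spots in total.
print(wolf_number(3, 11))  # 41.0
```

The weighting by 10 makes the index respond to the emergence of new groups, not just the raw spot count.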

Analyzing data on solar activity,
the American astrophysicist John Eddy in 1976 noted a correlation
between periods of significant change in the number of spots in the past millennium and large changes in the climate of the Earth,
changes that have profoundly influenced the life of peoples and states, initiating economic and demographic crises.
Later, St. Petersburg geophysicist Eugene Borisenkov showed (1988)
that in each of 18 deep minima of solar activity of the Maunder Minimum type,
minima which have occurred about every 200 years for the last 7,500 years,
there have been periods of deep temperature decline,
while in the periods of high sunspot maxima, there have been periods of global warming.
Such changes in the climate of the Earth could be caused only by lasting and significant changes in the Sun,
because there was absolutely no industrial effect on nature in those times.
This supports the idea that in the bicentennial periods of maximum levels of solar activity,
the TSI has always substantially increased, and it has noticeably decreased in periods of minima.

Thus, not 11-year, but bicentennial cycles of solar variation are the dominant factor in climate variations that last for decades:
temperatures in the ocean-atmosphere system, the physical parameters of the Earth's surface and its albedo,
concentrations of greenhouse gases (primarily water vapor and carbon dioxide) in the atmosphere.
Also, a quite important influence on climate is exerted by the world ocean,
which possesses large thermal inertia and serves as the principal receiver and storage of solar energy.

A global increase in temperature has also occurred on Mars.
NASA researchers, after tracing changes on its surface from 1999 until 2005,
discovered melting ice at Mars' South Pole and warming of the Martian climate,
a natural event that occurred without any contribution by Martians or greenhouse effect driven by Martians.
Analogous processes have also been observed on Jupiter, Neptune, Triton, Pluto and other planets of the solar system.
These can only be the direct consequences of the action of one and the same factor
- the prolonged and extraordinarily high level of the energy radiated by the Sun.

Warming on Mars did not occur as a result of change in the shape of its orbit and inclination of its axis of rotation,
as is frequently asserted:
these processes occur on time frames of tens of thousands of years,
and therefore in this negligible time frame (six years!) in no way could they affect the climate.

Published in the Russian journal "Nauka i Zhizn" ("Science and Life"), 2009, N1, pp. 34-42.

Bicentennial Decrease of the Total Solar Irradiance Leads to Unbalanced Thermal Budget of the Earth and the Little Ice Age:

Habibullo Abdussamatov, Dr. Sc.
Head of Space Research Laboratory of the Pulkovo Observatory,
Head of the Russian/Ukrainian Joint Project Astrometria

Temporal changes in the power of the longwave radiation emitted to space by the Earth-atmosphere system
always lag behind changes in the power of absorbed solar radiation, because of the slow change of the system's enthalpy.
That is why the debit and credit parts of the average annual energy budget of the terrestrial globe, with its air and water envelope,
are practically always in an unbalanced state.
The average annual balance of the thermal budget of the Earth-atmosphere system over a long time period
reliably determines the course and value of either the energy excess accumulated by the Earth
or the energy deficit in its thermal budget, which, taking the TSI forecast into account,
can define and predict well in advance the direction and amplitude of the forthcoming climate changes.
Since the early 1990s we have observed a bicentennial decrease in both the TSI and the portion of its energy absorbed by the Earth.
The Earth as a planet will henceforward have a negative balance in its energy budget,
which will result in a temperature drop beginning in approximately 2014.
Owing to the increase of albedo and the decrease of the atmospheric concentration of greenhouse gases,
the absorbed portion of solar energy and the influence of the greenhouse effect will decline further.
The influence of the consecutive chain of feedback effects, which can lead to an additional drop of temperature,
will surpass the influence of the TSI decrease itself.
The onset of the deep bicentennial minimum of TSI is expected in 2042±11,
that of the 19th Little Ice Age in the past 7,500 years - in 2055±11.

Since the early 1990s the values of both the eleven-year and the bicentennial components of TSI variation
have been decreasing at a rate that is currently accelerating (Fig. 2),
and hence the fraction of TSI absorbed by the Earth is declining at practically the same rate.

Also see the International Space Station's Russian-Ukrainian
"ASTROMETRIA" project
(Measurement of temporary variations of the shape and diameter of the Sun and the total solar irradiance. Pulkovo Observatory)

Grand Minimum of the Total Solar Irradiance Leads to the Little Ice Age
Habibullo Abdussamatov. November 25, 2013

Significant climate variations during the past 7.5 millennia indicate that
bicentennial quasi-periodic TSI variations define a corresponding cyclic mechanism of climatic changes from global warmings to Little Ice Ages
and set the timescales of practically all physical processes taking place in the Sun-Earth system.
Quasi-bicentennial cyclic variations of the TSI entering the Earth's upper atmosphere
are the main fundamental cause of corresponding alternations of climate variations.
At the same time, more long-term variations of the annual average of the TSI due to changes in the shape of the Earth's orbit,
inclination of the Earth's axis relative to its orbital plane, and precession, known as the astronomical Milankovitch cycles,
together with the subsequent feedback effects, lead to the Big Glacial Periods (with the period of about 100,000 years).

Thus the quasi-bicentennial variation of the TSI always leads to an imbalance of the annual average energy budget of the Earth-atmosphere system,
while the upcoming Grand Minimum of the TSI leads to a deficit of the annual average energy budget of the Earth and to the Little Ice Age.

Since the early 1990s one has observed a decrease in both the TSI and hence the portion of energy absorbed by the Earth (Figure 1).
Since the Sun is in the phase of decline of the quasi-bicentennial variation,
the average annual decrease of the smoothed absolute value of TSI has been growing from the 22nd cycle to the 23rd and 24th cycles.

Figure 1. Variations of both the TSI and solar activity in 1978-2013 and prognoses of these variations to cycles 24-27 until 2045.
The arrow indicates the beginning of the new Little Ice Age epoch after the maximum of cycle 24.

The observed trend of the increasing rate of an average annual decline in the absolute value of TSI
allows us to suggest that this decline as a whole will correspond to the analogous TSI decline in the period of Maunder minimum
according to its most reliable reconstruction. (Shapiro A.I. et al., 2011)
Let us note that the level of the maximum of the 11-year component of TSI has decreased within five years of the 24th cycle by ~0.7 W/m2
with respect to the maximum level of the 23rd cycle.
The Earth as a planet will also have a negative balance in the energy budget in the future,
because the Sun has entered the decline phase of the quasibicentennial cycle of the TSI variations.
This will lead to a drop in temperature and to the beginning of the epoch of the Little Ice Age
after the maximum of solar cycle 24, starting approximately in 2014.

Thus, the quasi-bicentennial variations of the TSI
(allowing for their direct and secondary impacts, with the latter being due to the secondary feedback effects)
are the major and essential cause of climate changes.
The Sun is the main factor controlling the climatic system
and even seemingly insignificant long-term TSI variations may have serious consequences for the climate of the Earth and other planets of the Solar system.
Quasi-bicentennial solar cycles are the key to understanding cyclic changes in both nature and society.
The sign and value of the energy imbalance in the Earth-atmosphere system over a long time span
(excess of incoming TSI accumulated by the Ocean, or its deficiency)
determine a corresponding change of the energy state of the system and, hence, a forthcoming climate variation and its amplitude.
That is why the Earth's climate changes every 200±70 years: it is the result of the bicentennial cyclic TSI variation.


Note:
For Solar Cycle 24, a first maximum number of sunspots occurred in November 2011 (Ri=96.7).
A second, higher, maximum occurred in February 2014 (Ri=102.3).
No statistically significant global warming, or cooling, has taken place since 1998 (see the UAH Satellite-Based Global Temperature Record).
[March 29, 2015]

Dr. David Evans - The Notch-Delay Solar Theory, 2014

There are three big drops in solar radiation in the 400 years of records.
The first, in the 1600s, led to the Maunder Minimum, the coldest time in the last 400 years.
The second in Napoleon's time, led to the Dalton Minimum, the second coldest time in the last 400 years.
The third started in 2004, but hasn't led to cooling...yet.
The notch-delay theory says that the fall in TSI signals a fall in force X which acts after a delay, which seems to be 11 years.
So the fall will occur in 2004 + 11 = 2015. But the delay is tied to the solar cycle length, currently 13 years,
so the cooling is more likely to start in 2004 + 13 = 2017.
The cooling will be at least 0.2°C, maybe 0.5°C, enough to undo global warming back to the 1950s.
The carbon dioxide and Notch-Delay solar theories have been in agreement over the last century due to generally rising carbon dioxide and solar radiation,
but now they sharply diverge. Only one of them can be correct, and soon we'll know which one.
Here's the criterion: A fall of at least 0.1°C (on a 1-year smoothed basis) in global average surface air temperature over the next decade.
If the criterion does not occur then the Notch-Delay solar theory is rubbish and should be thrown away.
If it does occur then the carbon dioxide theory is rubbish, and should be thrown away.

Figure 2.1. This instance of the notch-delay solar model used a constant delay of 10.7 years and shows cooling beginning in 2014.

BIG NEWS VIII: New solar theory predicts imminent global cooling

To recap - using an optimal Fourier Transform,
David Evans discovered a form of notch filter operating between changes in sunlight and temperatures on Earth.
This means there must be a delay - probably around 11 years.
This not only fitted with the length of the solar dynamo cycle,
but also with previous independent work suggesting a lag of ten years or a correlation with the solar activity of the previous cycle.
The synopsis then is that solar irradiance (TSI) is a leading indicator of some other effect coming from the Sun after a delay of 11 years or so.

The discovery of this delay is a major clue about the direction of our future climate.
The flickers in sunlight run a whole sunspot cycle ahead of some other force from the Sun.
Knowing that solar irradiance dropped suddenly from 2003 onwards tells us the rough timing of the fall in temperature that's coming
(just add a solar cycle length).
What it doesn't tell us is the amplitude - the size of the fall.
That's where the model may (or may not) tell us what we want to know. That test is coming, and very soon.
This is an unusual time in the last 100 years where the forecasts from the CO2 driven models and the solar model diverge sharply.

Published by Science Speak (Dr. David Evans, Joanne Nova. Perth, Western Australia)

The Croll-Milankovitch Cycles:

"The Earth's orbit around the Sun is not quite circular, which means that the Earth is slightly closer to the Sun at some times of the year than others.
The closest approach of the Earth to the Sun is called perihelion, and it now occurs in January, making northern hemisphere winters slightly milder.
This change in timing of perihelion is known as the precession of the equinoxes, and occurs on a period of 22,000 years.
11,000 years ago, perihelion occurred in July, making the seasons more severe than today.
The "roundness", or eccentricity, of the Earth's orbit varies on cycles of 100,000 and 400,000 years,
and this affects how important the timing of perihelion is to the strength of the seasons.
The combination of the 41,000 year tilt cycle and the 22,000 year precession cycles, plus the smaller eccentricity signal,
affects the relative severity of summer and winter, and is thought to control the growth and retreat of ice sheets.
Cool summers in the northern hemisphere, where most of the earth's land mass is located,
appear to allow snow and ice to persist to the next winter, allowing the development of large ice sheets over hundreds to thousands of years.
Conversely, warmer summers shrink ice sheets by melting more ice than the amount accumulating during the winter."

"Variations in the intensity and timing of heat from the Sun are the most likely cause of glacial/interglacial cycles.
This variability is partially driven by changes in the Sun's output,
but is affected more strongly by variations in Earth's orbit."

"There are three major components of Earth's orbit about the Sun that contribute to changes in our climate.
These are, the Precession of the Equinoxes, and changes in Axial Obliquity and Orbital Eccentricity.
The full cycle of equinox precession takes 25,800 years to complete.
Presently, Earth is closest to the Sun [perihelion] in January and farther away in July [aphelion].
Presently Earth's tilt is 23.5°, but the 41,000 year cycle varies from 22.1° to 24.5°.
Earth's orbit goes from measurably elliptical to nearly circular in a cycle that takes around 100,000 years."

"Individually, each of the three cycles affects insolation patterns.
When taken together, they can partially cancel or reinforce each other in complicated ways."

"The Serbian astrophysicist Milutin Milankovitch (1879-1958)
is best known for developing one of the most significant theories relating Earth motions and long-term climate change."

"Changes in orbital eccentricity affect the Earth-Sun distance.
Currently, a difference of only 3 percent (5 million kilometers) exists between closest approach (perihelion), which occurs on or about January 3,
and furthest departure (aphelion), which occurs on or about July 4.
This difference in distance amounts to about a 6 percent increase in incoming solar radiation (insolation) from July to January.
The shape of the Earth's orbit changes from being elliptical (high eccentricity) to being nearly circular (low eccentricity)
in a cycle that takes between 90,000 and 100,000 years.
When the orbit is highly elliptical,
the amount of insolation received at perihelion would be on the order of 20 to 30 percent greater than at aphelion,
resulting in a substantially different climate from what we experience today."
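The percentages quoted above follow from the inverse-square law for radiation. A short sketch reproduces them; the perihelion and aphelion distances and the high-eccentricity value used below are the commonly quoted modern figures, assumed here rather than taken from the text.

```python
# Sketch: inverse-square insolation arithmetic for perihelion vs. aphelion.
# Distances are commonly quoted modern values (an assumption, not from the text).
r_perihelion_km = 147.1e6  # closest approach, about January 3
r_aphelion_km = 152.1e6    # farthest departure, about July 4

# Insolation scales as 1/r^2, so perihelion receives (r_a/r_p)^2 times aphelion's.
insolation_ratio = (r_aphelion_km / r_perihelion_km) ** 2
distance_diff_pct = 100 * (r_aphelion_km - r_perihelion_km) / r_aphelion_km
print(f"Distance difference: {distance_diff_pct:.1f}%")
print(f"Insolation increase at perihelion: {100 * (insolation_ratio - 1):.1f}%")

# At high eccentricity e, r_p = a(1 - e) and r_a = a(1 + e),
# so the ratio becomes ((1 + e)/(1 - e))^2.  With e ~ 0.058:
e_high = 0.058
ratio_high = ((1 + e_high) / (1 - e_high)) ** 2
print(f"High-eccentricity insolation increase: {100 * (ratio_high - 1):.0f}%")
```

The inverse-square ratio comes out near 7 percent for today's orbit, consistent with the rounded figure quoted, and in the 20 to 30 percent range for a highly elliptical orbit.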

Atmospheric Circulation:

ITCZ, Pressure and Wind at Sea Level:

The Inter-Tropical Convergence Zone (ITCZ) is identified on the figures by a red line.
The formation of this band of low pressure is the result of solar heating and the convergence of the trade winds.
In January, the intertropical convergence zone is found south of the equator.
During this time period, the Southern Hemisphere is tilted towards the Sun and receives higher inputs of shortwave radiation.
Note that the line representing the intertropical convergence zone is not straight and parallel to the lines of latitude.
Bends in the line occur because of the different heating characteristics of land and water.
Over the continents of Africa, South America, and Australia, these bends are toward the South Pole.
This phenomenon occurs because land heats up faster than ocean.

The graphics show the center of the ITCZ (red line), the atmospheric pressure (colors),
and the wind speed and direction at sea level (black arrows), in January and July (1959-1997 average).

During July, the intertropical convergence zone (ITCZ) is generally found north of the equator.
This shift in position occurs because the altitude of the Sun is now higher in the Northern Hemisphere.
The greatest spatial shift in the ITCZ, from January to July, occurs in the eastern half of the image.
This shift is about 40° of latitude in some places.
The more intense July Sun causes the land areas of Northern Africa and Asia to warm rapidly, creating the Asiatic Low,
which becomes part of the ITCZ.
In the winter months,
the intertropical convergence zone is pushed south by the development of an intense high pressure system over central Asia.
The extreme movement of the ITCZ in this part of the world also helps to intensify the development
of a regional wind system called the Asian Monsoon.

Beginning June 1, 2011, the Tropical Analysis and Forecast Branch (TAFB) will officially include,
as part of its portion of the unified surface analyses (USA),
a distinction between the trade wind Intertropical Convergence Zone (hereafter ITCZ)
and the monsoon trough ITCZ (hereafter monsoon trough).
A second addition to the TAFB portion of the USA will be the depiction of shear lines.

Depiction of the Monsoon Trough on the Tropical Analysis & Forecast Branch (TAFB) portion of the Unified Surface Analysis

The decision to differentiate between the ITCZ and the monsoon trough arises from the differences in wind direction
between the two features and their implications for tropical cyclogenesis.
TAFB's definition of each feature follows:

Monsoon Trough - the portion of the ITCZ which extends into or through a monsoon circulation,
as depicted by a line on a weather map showing the location of minimum sea level pressure.
This line coincides with the maximum cyclonic curvature vorticity,
with southwesterly (SW) monsoonal flow prevailing south of the trough axis.

Implication for users of the TAFB surface analysis: users may anticipate SW winds to the south of the monsoon trough,
and SE winds to the south of the ITCZ.

Implication for tropical cyclogenesis: the convergence of SW winds south of the monsoon trough and NE winds north of the monsoon trough
creates a background flow that produces cyclonic vorticity, which is important for tropical cyclogenesis.
The ITCZ creates a confluence zone of NE trade wind flow and SE trade wind flow, which does not readily create cyclonic vorticity.
Thus, tropical cyclogenesis is more likely in a background flow associated with a monsoon trough than the ITCZ.

The Troposphere is the lowest layer of the atmosphere, from the ground to the tropopause.
It extends from some 20 km over the equatorial regions down to about 7 km over the poles in winter.
In the temperate zones of the Earth it averages some 11 km.
The troposphere contains 80% of the mass and 99% of the water vapor and particles of the atmosphere.
Tropos means change in Greek; the troposphere is where most of the weather occurs.
The composition of the atmosphere is very uniform except for the distribution of water vapor, droplets and ice.
All this water is evaporated at the surface of continents and oceans.
The tropopause is the boundary region between the troposphere and the stratosphere.
There is little mixing between these two layers.
In the troposphere, temperature decreases with altitude (positive lapse rate, usually about 6.5°C/km),
from an average of 15°C at sea level to about -55°C at the tropopause.
In the stratosphere layer the temperature at first remains near constant, then increases with altitude (negative lapse rate),
this defines the height of the tropopause.

The standard atmospheric pressure at sea level (1,013.25 hPa = 760 mmHg) results from the weight of the air above.
Local pressure decreases with temperature, elevation and humidity in the air above (moist air is less dense than dry air).
Low local pressures are normally associated with faster winds, clouds, precipitation and storms.
High local pressures are normally associated with dry weather and mostly clear skies,
with larger diurnal temperature changes and light winds.
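The lapse-rate and pressure figures above can be cross-checked with a short standard-atmosphere sketch. The constants are the usual International Standard Atmosphere values (15°C and 1,013.25 hPa at sea level, a 6.5°C/km lapse rate), assumed here rather than taken from the text.

```python
# Sketch: standard-atmosphere cross-checks for the figures quoted above.
# Constants are the International Standard Atmosphere values (an assumption).
T0_C = 15.0           # sea-level temperature, C
LAPSE_C_PER_KM = 6.5  # tropospheric lapse rate
P0_HPA = 1013.25      # sea-level pressure

def temperature_at(height_km):
    """Temperature (C) at a given height within the troposphere."""
    return T0_C - LAPSE_C_PER_KM * height_km

# Height at which temperature reaches about -55 C:
h_tropopause = (T0_C - (-55.0)) / LAPSE_C_PER_KM
print(f"Implied tropopause height: {h_tropopause:.1f} km")
print(f"Temperature at 10.8 km: {temperature_at(10.8):.0f} C")  # about -55 C

# Barometric formula for a linear-lapse-rate atmosphere: P = P0 * (T/T0)^(g*M/(R*L))
def pressure_at(height_km):
    g, M, R = 9.80665, 0.0289644, 8.31446  # gravity, molar mass of air, gas constant
    L = LAPSE_C_PER_KM / 1000.0            # lapse rate in K/m
    T0_K = T0_C + 273.15
    T_K = T0_K - L * height_km * 1000.0
    return P0_HPA * (T_K / T0_K) ** (g * M / (R * L))

print(f"Pressure at 5.5 km: {pressure_at(5.5):.0f} hPa")  # roughly half of sea level
```

The lapse rate and the sea-level and tropopause temperatures are mutually consistent: they imply a tropopause near 11 km, the mid-latitude average.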

The seasonal/annual cycle, forced by the annual cycle of the Sun due to the revolution of the Earth around the Sun.

Interannual or year-to-year variability, such as El Niño.

Accurate forecasting of this variability will benefit people living in the tropical regions,
and also over the rest of the Earth due to remote 'teleconnections'
between the weather in the tropics and the weather elsewhere around the globe.
Here, we focus on variability on the intraseasonal time scale, which is dominated by the Madden-Julian oscillation (MJO).
This was discovered by Madden and Julian (1971, 1972) who called it the '40-50-day oscillation' because of its preferred time scale.
Since then it has been called the '30-60-day oscillation' and the 'intraseasonal oscillation',
but the term 'MJO' has now emerged as a favorite.

The MJO is characterized by an eastward propagation of rainfall over the 'warm pool' region from the Indian Ocean to the western Pacific.

In addition to strongly modulating the rainfall in the tropics, the MJO has a signal in other meteorological variables.
For example, a clear MJO cycle in sea level pressure can also be seen.

The negative pressure anomalies appear to emanate out of the region of enhanced rainfall.
One signal propagates eastward along the equator. This is an equatorial Kelvin wave.
When it reaches the Andes mountain range along the eastern coast of the Pacific it is momentarily blocked,
before continuing on eastward across the Atlantic, completing a circuit of the equator in one MJO cycle, about 48 days.
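A circuit of the equator in one MJO cycle implies a definite phase speed, which a one-line check recovers; the equatorial circumference of about 40,075 km is assumed here, not stated in the text.

```python
# Sketch: phase speed implied by one equatorial circuit per MJO cycle.
# Equatorial circumference of ~40,075 km is assumed, not stated in the text.
circumference_km = 40_075
circuit_days = 48  # one MJO cycle, per the text

speed_m_per_s = circumference_km * 1000 / (circuit_days * 86_400)
print(f"Implied Kelvin wave phase speed: {speed_m_per_s:.1f} m/s")
```

This is around 10 m/s, in the range typical of convectively coupled equatorial Kelvin waves.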

An equatorial Rossby wave signal is also forced by the MJO rainfall anomalies.
This can be seen as a pair of negative sea level pressure anomalies, one either side of the equator,
that lie slightly to the west of the enhanced rainfall.

In the 'other half' of the MJO cycle,
the reduced rainfall triggers equatorial Kelvin and Rossby waves of the opposite sign (positive sea level pressure anomalies).

The MJO also affects other meteorological systems in the tropics:

Monsoons. The MJO modulates the active/break cycles that occur within the Asian and West African monsoons.

The Madden-Julian Oscillation (MJO) is a tropical weather system that lasts about 1 to 2 months.
It is one of the few aspects of the weather that can be skilfully predicted beyond about 2 weeks into the future.

The Atlantic Multi-decadal Oscillation (AMO) is an ongoing series of long-duration changes
in the sea surface temperature of the North Atlantic Ocean,
with cool and warm phases that may last for 20-40 years at a time and a difference of about 1°F [0.56°C] between extremes.
These changes are natural and have been occurring for at least the last 1,000 years.

Atlantic Multi-decadal Oscillation (AMO) Index 1856 to January 23, 2016. Averaged January to December

Is the AMO a natural phenomenon, or is it related to global warming?
Instruments have observed AMO cycles only for the last 150 years, not long enough to conclusively answer this question.
However, studies of paleoclimate proxies, such as tree rings and ice cores,
have shown that oscillations similar to those observed instrumentally have been occurring for at least the last millennium.
This is clearly longer than modern man has been affecting climate, so the AMO is probably a natural climate oscillation.
In the 20th century, the climate swings of the AMO have alternately camouflaged and exaggerated the effects of global warming,
and made attribution of global warming more difficult to ascertain.
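A common way to construct an AMO-style index is to area-average North Atlantic sea surface temperature anomalies and then remove the linear trend, so that the multidecadal swing stands out from any long-term warming. The sketch below illustrates just the detrending step on a synthetic series; the numbers are invented for illustration, not real SST data.

```python
# Sketch: detrending step behind an AMO-style index.  The "North Atlantic mean
# SST anomaly" series below is synthetic, for illustration only.
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1856, 2016)
# Synthetic series: linear trend + multidecadal (~65-year) swing + noise
sst = (0.005 * (years - years[0])
       + 0.2 * np.sin(2 * np.pi * (years - 1856) / 65)
       + 0.05 * rng.standard_normal(years.size))

# Detrend: subtract the least-squares linear fit, leaving the oscillation
coeffs = np.polyfit(years, sst, 1)
amo_index = sst - np.polyval(coeffs, years)

print(f"AMO-like swing (max - min): {amo_index.max() - amo_index.min():.2f} C")
```

The detrended residual is what the cool and warm phases of the index describe; the real index is built the same way from gridded North Atlantic SST.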

The Atlantic Multidecadal Oscillation is a recently discovered mode of Sea Surface Temperature variability
for a significant portion of the global oceans.
Climate studies provide different causes for the additional strength of the changes in North Atlantic SST anomalies:
some blame the Atlantic branch of Thermohaline Circulation,
while another discusses the multiple interactions between Saharan dust, Sahel precipitation, solar radiation,
and Atlantic Sea Surface Temperature.
While cause may be debatable, its impact on Northern Hemisphere sea surface and land surface temperature is clear.

Foltz and McPhaden (2008) write in their Abstract,
"Trends in tropical Atlantic sea surface temperature (SST), Sahel rainfall, and Saharan dust are investigated during 1980-2006.
This period is characterized by a significant increase in tropical North Atlantic SST
and the transition from a negative to a positive phase of the Atlantic Multidecadal Oscillation (AMO).
It is found that dust concentrations over western Africa and the tropical North Atlantic Ocean
decreased significantly between 1980 and 2006 in association with an increase in Sahel rainfall.
The decrease in dust in the tropical North Atlantic tended to increase the surface radiative heat flux by 0.7 W/m^2 which, if unbalanced,
would lead to an increase in SST of 3 deg C.
Coupled models significantly underestimate the amplitude of the AMO in the tropical North Atlantic
possibly because they do not account for changes in Saharan dust concentration."

El Niño and La Niña are natural oscillations of the ocean-atmosphere system in the tropical Pacific
that have important consequences for weather around the globe.
Current science can detect them, but not predict them in the long term.
They are part of a phenomenon known as El Niño-Southern Oscillation (ENSO),
a continual but irregular cycle (of about 3 to 7 years) of shifts in ocean and atmospheric conditions that affect the global climate.
El Niño is characterized by unusually warm ocean temperatures in the Equatorial Pacific,
as opposed to La Niña, which is characterized by unusually cold ocean temperatures in the Equatorial Pacific.
Among these consequences is increased rainfall across the southern tier of the US and in Peru, which has caused destructive flooding,
and drought in the West Pacific, sometimes associated with devastating brush fires in Australia.
El Niño events tend to suppress Atlantic hurricane activity, while La Niña events tend to enhance it.

The recent change from stronger El Niño to stronger La Niña conditions is revealed in monthly Multivariate ENSO Index (MEI) data since 1950
... which is also related to the Pacific Decadal Oscillation (PDO);
some researchers consider the PDO to be a low-frequency modulation of El Niño and La Niña activity.

Of significance to the current 'global warming hiatus' issue is the observation that we might have now entered
into a new La Niña-dominant phase.
... such a scenario could well lead to a 25- or 30-year period of no warming - or even cooling -
just as was experienced up until the 1970s.

"The Southern Oscillation was discovered decades before it was found to be related to El Niño and La Niña events,
which are not repetitive in time, so they are not parts of a true oscillation.
While there are portions of El Niño and La Niña processes that behave as cycles, those cycles break down,
and an El Niño or a La Niña can evolve as an independent event.
Further, El Niño and La Niña are not opposites.
That's also very obvious in the sea surface temperature records.
La Niña is an exaggeration of the normal state of the tropical Pacific, while an El Niño is the anomalous phase.
That's why many researchers believe there are only two states of the tropical Pacific: El Niño and 'other'.
Also, over the last 30 years it's rare when a La Niña has been as strong as the El Niño that preceded it.
How then could a La Niña counteract an El Niño?
Of course, the temperature records also show a multidecadal period when La Niña were as strong as El Niño,
and it's no coincidence that global surface temperature did not warm during it."

"A very strong El Niño like the one in 1997/98
is capable of temporarily raising global surface temperatures more than 0.4 deg C (about 0.7 deg F) over a 12-month period,
and for some reason, many climate scientists claim such an event has no long-term aftereffects.
This means those scientists have failed to account for the warm water that is redistributed after a strong El Niño
and for the effects those leftover warm waters have on global climate."

"An El Niño and his sibling La Niña can cause flooding in some parts of the world,
droughts in others - blizzards in some areas, record low snowfalls elsewhere.
The strong storms they produce erode coastlines.
They can suppress the development of tropical cyclones (hurricanes) in some parts of the globe
and enhance the conditions for their development in others.
It should go without saying that they cause heat waves and cold spells depending on the season and location.
These causes and effects have been known for decades.
Recently, however, a few headline-seizing climate scientists, with the help of mainstream media and blogs,
have now redirected the blame for those weather events to carbon dioxide and other greenhouse gases."

"The IPCC uses climate model simulations of global surface temperatures
with and without radiative forcings from manmade greenhouse gases
to show that the warming of global surface temperatures for the past three decades
could only be simulated by the models that included anthropogenic greenhouse gases.
For the IPCC, this provided irrefutable proof that greenhouse gases were responsible for the warming.
To the general public, however, it suggested another possibility.
If climate models without radiative forcings from greenhouse gas couldn't simulate the warming,
then those assumption-based climate models might be seriously flawed.
This book, using the outputs of the climate models used by the IPCC, confirms that they are in fact flawed.
Climate models show no skill whatsoever at being able to simulate the ocean processes
that produced the warming of global sea surface temperatures for the past 3 decades."

"Maybe the IPCC should examine the sea surface temperature records for the past 30 years.
Why? They do not agree with the IPCC's conclusions.
Satellite-based sea surface temperature records show
El Niño and La Niña are responsible for most of the warming of global sea surface temperatures over the past 3 decades.
That fact shows up plain as day in sea surface temperature records.
It's tough to miss. It really is. Maybe the IPCC has overlooked it intentionally."

"Who Turned on the Heat? uses observations-based data, not climate models,
to illustrate where and how ENSO is capable of raising global sea surface temperatures over periods of 10, 20, 30 years and more.
Because land surface air temperatures are basically along for the ride, mimicking the variations in sea surface temperatures,
ENSO can be said to be responsible for most of the warming of global land plus sea surface temperatures for the past three decades as well."

"El Niño and La Niña events are often described as the 'unusual' warming (El Niño) and cooling (La Niña)
of the surface of the eastern tropical Pacific Ocean.
They happen every couple of years, so there's really nothing unusual about them.
In fact, based on the NOAA's Oceanic NINO Index (ONI),
official El Niño and La Niña months occurred about 55% of the time since 1950.
Also, scientists who study historical changes in climate (paleoclimatologists)
have presented evidence that El Niño and La Niña events were occurring 3 to 5 million years ago.
In other words, not only do El Niño and La Niña events occur often, they've been around a long, long time."
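The ONI statistic mentioned above is a 3-month running mean of Niño 3.4 SST anomalies, with months at or beyond ±0.5°C counted toward El Niño or La Niña conditions. The sketch below applies that classification to a synthetic anomaly series; the values are invented for illustration.

```python
# Sketch: ONI-style classification.  A 3-month running mean of Nino-3.4 SST
# anomalies, with El Nino (>= +0.5 C) and La Nina (<= -0.5 C) months flagged.
# The anomaly series is synthetic, for illustration only.
import numpy as np

rng = np.random.default_rng(1)
months = np.arange(360)  # 30 years of monthly values
# Synthetic anomalies with ENSO-like multi-year swings plus noise
anoms = 0.8 * np.sin(2 * np.pi * months / 48) + 0.3 * rng.standard_normal(360)

# 3-month centered running mean (the smoothing step in the ONI definition)
oni = np.convolve(anoms, np.ones(3) / 3, mode="valid")

el_nino = oni >= 0.5
la_nina = oni <= -0.5
frac = (el_nino | la_nina).mean()
print(f"Fraction of months in an El Nino or La Nina state: {frac:.0%}")
```

NOAA's official episode definition additionally requires five consecutive overlapping three-month seasons beyond the threshold, a refinement this sketch omits.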

"El Niño and La Niña are siblings, Mother Nature's mischievous but mighty children.
Contrary to popular beliefs, they do not counteract one another.
This is also plainly evident in sea surface temperature data.
Further, El Niño is usually more powerful than his sister.
On the other hand, La Niña can endure for as long as three years, while the stronger El Niño normally lasts for less than one year.
Look out, though, when they both decide to test themselves as strong events in sequence,
wrestling with global surface temperatures as a tag team.
Together they can cause global surface temperatures to shift upwards for a decade,
until they act together again as a team and cause another persistent change in surface temperatures around the globe.
This happens because of some not-so-subtle differences between La Niña and El Niño phases,
a fact that is very apparent once you understand those phases."

"The IPCC's climate models are allegedly used to determine the causes of the past warming and cooling of global surface temperatures,
and they are employed to project global surface temperatures into the future based on a number of assumptions.
Here's a simple but realistic way to look at the climate models:
Climate models show how surface temperatures would warm IF they were warmed by manmade greenhouse gases.
The truth is, the Earth's oceans do not respond to manmade greenhouse gases as the modelers have assumed.
The sea surface temperature records show
the global oceans could care less about a little back radiation from anthropogenic greenhouse gases.
While global sea surface temperatures have definitely warmed over the past 3 decades,
there is no indication that additional infrared radiation from increased concentrations of carbon dioxide caused the warming."

"Examples of climate model problems: Most of the climate models used by the IPCC in their 2007 4th Assessment Report (AR4),
in addition to the failings already discussed,
have multiple flaws with how they simulate the natural processes taking place in the tropical Pacific.
They have difficulties simulating precipitation, cloud cover, downward shortwave radiation, trade wind speeds and location, etc.,
which are all interrelated and associated with El Niño-Southern Oscillation.
Climate models tend to make La Niña events as strong as El Niño events,
while in the real world, starting in the late 1970s, El Niño events have tended to be stronger than La Niña events.
Recently, though, they've been working their way back to a regime when El Niño and La Niña are more equally weighted.
It is well known that El Niño and La Niña events are tied to the seasonal cycle with both phases peaking around December,
but this is not the case in all climate models."

"The sea surface temperature and ocean heat content data for the past 30 years show the global oceans have warmed.
There is no evidence, however, that the warming was caused by anthropogenic greenhouse gases in part or in whole;
that is, the warming can be explained by natural ocean-atmosphere processes, primarily ENSO."

The Free Preview of Climate Models Fail [.pdf]
includes the Introduction, Table of Contents, and the Closing.
As you'll note from the Table of Contents, the book includes many of the model-data comparisons I published as blog posts over the past year.
The text accompanying them has been rewritten, expanded and edited for readability in this book.
And you'll note there are brand new presentations.

Climate Models Fail exposes the disturbing fact that climate models being used by the IPCC for their 5th Assessment Report
have very little practical value because they cannot simulate critical variables of interest to the public and policymakers.
Using easy-to-read graphs, this book compares data (surface temperature, precipitation, and sea ice area) with the computer model simulations.
It is very easy to see that the model outputs bear little relationship to the data.
In other words, climate models create imaginary climates in virtual worlds that exhibit no similarities to the climate of the world in which we live.

This book was prepared for readers without scientific backgrounds.
The terms used by scientists are explained and non-technical "translations" are provided.
Introductory sections present basics.
There are also numerous hyperlinks to additional background information.
The book is well illustrated, with more than 250 color-coded graphs and maps.
It is an excellent introduction to global warming and climate change for people who are not well-versed yet want to learn more.

Bob Tisdale - New Book: "On Global Warming and the Illusion of Control - Part 1":

On Global Warming and the Illusion of Control - Part 1 includes introductory discussions of 3 primary topics:

The science behind the groupthink of human-induced global warming and climate change,
climate models, and
even more importantly, many of the numerous known modes of natural variability.

Those fundamental presentations are in layperson terms, with links to more-detailed discussions and peer-reviewed papers.

When you first download the ebook, you'll note it's over 700 pages long.
Some of you are going to say to yourselves, I'll never read a 700-page book about global warming and climate change.
I'm not expecting that everyone will.
The next thing you might note is that the interactive Table of Contents lists more than 60 chapters.

Those of you who are new to global warming and climate change might want to start with:

Simply click on the titles of those chapters in the Table of Contents, and Adobe Acrobat Reader will fast-forward you there.

Since they're in the news, others of you might be interested in El Niño events, and are wondering about the processes behind them.
Simply click on Chapter 3.7 - Ocean Mode: El Niño and La Niña.

The Introduction covers a multitude of topics, from the slowdown in global warming to examples of very basic climate model failings;
from the political, not scientific, nature of the Intergovernmental Panel on Climate Change
to global warming ranking low on people's priorities around the globe.

Why am I giving away a 700+ page book that took me almost 2 years to write?
The primary reason: free, it should have a much greater circulation.
Another reason: This is my way of saying thanks to everyone who has offered constructive comments on the threads of my blog posts
at WattsUpWithThat?
and at my blog ClimateObservations.
This book could not have been written without your insights.
Of course, I'm also hoping that many readers will find the two links to my tip jar that are found in the text.

To counter the nonsensical "Just what AGW predicts" rantings of alarmists about the "record-high" global sea surface temperatures in 2014,
I've added a model-data comparison of satellite-era global sea surface temperatures to these monthly updates. See the example below.
The models are represented by the multi-model ensemble-member mean of the climate models stored in the
CMIP5 archive, which was used by the IPCC for their 5th Assessment Report.
For further information on the use of the model mean, see the post
On the Use of the Multi-Model Mean.
For most models, historic forcings run through 2005 (2012 for others), and the middle-of-the-road RCP6.0 forcings
are used thereafter in this comparison. The data are represented by NOAA's
Optimum Interpolation Sea Surface Temperature data, version 2
- a.k.a. Reynolds OI.v2 - which is NOAA's best.
The model outputs and data have been shifted so that their trend lines begin at "zero" anomaly
for the November 1981 start month of this dataset.
That "zeroing" helps to highlight how poorly the models simulate the warming of the ocean surfaces...
almost doubling the observed warming rate.
Both the Reynolds OI.v2 data and the model outputs of their simulations of sea surface temperature (TOS)
are available to the public at the KNMI Climate Explorer.
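The "zeroing" described above can be sketched as: fit a least-squares trend line to each series and subtract that line's value at the start month, so both trend lines pass through zero there. The series below are synthetic stand-ins for the observations and the model mean, with the model warming at roughly double the observed rate, as the text describes.

```python
# Sketch: align model and data series so their trend lines start at zero.
# Both series are synthetic stand-ins, for illustration only.
import numpy as np

def zero_at_start(t, series):
    """Offset a series so its least-squares trend line is zero at t[0]."""
    slope, intercept = np.polyfit(t, series, 1)
    return series - (slope * t[0] + intercept)

t = np.arange(0, 400)  # months since the start month (illustrative length)
rng = np.random.default_rng(2)
data = 0.0008 * t + 0.1 * rng.standard_normal(t.size)   # slower observed warming
model = 0.0016 * t + 0.1 * rng.standard_normal(t.size)  # roughly double the rate

data_z, model_z = zero_at_start(t, data), zero_at_start(t, model)
model_slope = np.polyfit(t, model_z, 1)[0]
data_slope = np.polyfit(t, data_z, 1)[0]
print(f"Model/data trend ratio: {model_slope / data_slope:.1f}")
```

Because both trend lines are pinned to zero at the start, any divergence later in the record reflects the difference in warming rates, not an arbitrary choice of baseline.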

The Pacific Decadal Oscillation (PDO) is a natural long-term temperature fluctuation of the Pacific Ocean.
The PDO waxes and wanes approximately every 20 to 30 years,
has a dominant impact on hurricane variability in the Pacific and is probably influenced by the ENSO.

The Pacific Decadal Oscillation (PDO) is a long-lived El Niño-like pattern of Pacific climate variability.
While the two climate oscillations have similar spatial climate fingerprints, they have very different behavior in time.

Two main characteristics distinguish PDO from El Niño/Southern Oscillation (ENSO):
first, 20th century PDO "events" persisted for 20-to-30 years, while typical ENSO events persisted for 6 to 18 months;
second, the climatic fingerprints of the PDO are most visible in the North Pacific/North American sector,
while secondary signatures exist in the tropics - the opposite is true for ENSO.

Several independent studies find evidence for just two full PDO cycles in the past century:
"cool" PDO regimes prevailed from 1890-1924 and again from 1947-1976,
while "warm" PDO regimes dominated from 1925-1946 and from 1977 through (at least) the mid-1990's.
A "cool" PDO regime has prevailed after 1998.

Causes for the PDO are not currently known. Likewise, the potential predictability for this climate oscillation is not known.

"A simple climate model forced by satellite-observed changes in the Earth's radiative budget associated with the Pacific Decadal Oscillation
is shown to mimic the major features of global average temperature change during the 20th Century
- including three-quarters of the warming trend.
A mostly-natural source of global warming is also consistent with mounting observational evidence
that the climate system is much less sensitive to carbon dioxide emissions than the IPCC's climate models simulate."

"The PDO index represents the spatial pattern of the sea surface temperature anomalies in the extratropical North Pacific (20° N - 65° N)
... not the sea surface temperature anomalies themselves.
A strong positive PDO index value indicates the sea surface temperature anomalies of the eastern extratropical North Pacific
are warmer than the western and central portions, which is a spatial pattern created by El Niño events.
On the other hand,
a strong negative PDO index value indicates the sea surface temperature anomalies of the western and central portions
of the extratropical North Pacific are warmer than the eastern portion, and that's a spatial pattern created by La Niña events."
[See Figure 1]

"A cooling of the sea surface temperature anomalies of the western-central portion of the North Pacific can cause the PDO index to increase,
and a warming of the sea surface temperatures of the eastern North Pacific can also cause the PDO index to increase."

"A La Niña event in the tropical Pacific typically creates a spatial pattern in the extratropical North Pacific
where it's cooler in the eastern portion than it is in the western and central portions."
"An El Niño event creates the opposite spatial pattern,
where it's warmer in the eastern extratropical North Pacific and cooler in the western and central portions,
and that also relates to a "warm" PDO spatial pattern."
[See Figure 2]

"It is often said that the PDO pattern is the dominant spatial pattern in the extratropical North Pacific,
and that makes sense because the PDO pattern represents the El Niño- and La Niña-like pattern in the extratropical North Pacific
... and ... El Niños and La Niñas are the dominant mode of natural variability for the global oceans."

"The PDO data are not sea surface temperature data of the North Pacific.
The PDO data, on the other hand, are determined from the sea surface temperature data there,
using a statistical analysis called Principal Component Analysis. Note the distinction."
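The Principal Component Analysis step mentioned above can be sketched with a singular value decomposition of a (time x space) anomaly matrix: the leading principal component plays the role of the PDO index, and its spatial pattern the role of the PDO pattern. The grid and anomaly field below are synthetic, built around one imposed dipole pattern so the recovery is easy to verify.

```python
# Sketch: PCA of a (time x space) SST anomaly matrix via SVD.  The field is
# synthetic: one dominant east-west dipole pattern plus noise, for illustration.
import numpy as np

rng = np.random.default_rng(3)
n_months, n_gridpoints = 240, 50

pattern = np.linspace(-1, 1, n_gridpoints)                 # west-cool/east-warm dipole
amplitude = np.sin(2 * np.pi * np.arange(n_months) / 120)  # slow modulation in time
field = np.outer(amplitude, pattern) + 0.2 * rng.standard_normal((n_months, n_gridpoints))

# PCA: remove the time mean at each point, then take the SVD
field = field - field.mean(axis=0)
u, s, vt = np.linalg.svd(field, full_matrices=False)
pc1 = u[:, 0] * s[0]   # leading principal component (the "index" time series)
eof1 = vt[0]           # its spatial pattern (the "PDO pattern" analogue)

# The leading PC should track the imposed amplitude (sign is arbitrary in PCA)
corr = np.corrcoef(pc1, amplitude)[0, 1]
print(f"|correlation| of PC1 with imposed amplitude: {abs(corr):.2f}")
```

This is the sense in which the index measures how closely each month's spatial pattern matches the dominant pattern, rather than the temperatures themselves.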

"It may be easiest to think of the PDO data in another way -
as representing how closely the spatial pattern in the North Pacific at any point in time
matches the spatial pattern created by La Niña and El Niño events.
If the spatial pattern closely matches the La Niña pattern in Figure 2, then the PDO index value would be negative.
The closer the match in the spatial pattern to one created by La Niña events, the greater the negative value.
And the opposite holds true for the El Niño-related spatial pattern.
The closer the resemblance to the El Niño pattern, the greater the positive PDO index value."
"The map on the right in Figure 2
presents a classic cool PDO pattern, which would be represented by a negative PDO index value."

The North Atlantic Oscillation (NAO) Index is based on the surface sea-level pressure difference between the Subtropical (Azores) High
and the Subpolar Low.
The positive phase of the NAO reflects below-normal heights and pressure across the high latitudes of the North Atlantic
and above-normal heights and pressure over the central North Atlantic, the eastern United States and western Europe.
The negative phase reflects an opposite pattern of height and pressure anomalies over these regions.
Both phases of the NAO are associated with basin-wide changes in the intensity and location of the North Atlantic jet stream and storm track,
and in large-scale modulations of the normal patterns of zonal and meridional heat and moisture transport,
which in turn results in changes in temperature and precipitation patterns often extending from eastern North America to western and central Europe.

Strong positive phases of the NAO tend to be associated with above-normal temperatures in the eastern United States and across northern Europe
and below-normal temperatures in Greenland and oftentimes across southern Europe and the Middle East.
They are also associated with above-normal precipitation over northern Europe and Scandinavia
and below-normal precipitation over southern and central Europe.
Opposite patterns of temperature and precipitation anomalies are typically observed during strong negative phases of the NAO.
During particularly prolonged periods dominated by one particular phase of the NAO,
abnormal height and temperature patterns are also often seen extending well into central Russia and north-central Siberia.
The NAO exhibits considerable interseasonal and interannual variability,
and prolonged periods (several months) of both positive and negative phases of the pattern are common.

"Near the end of each calendar year ocean surface temperatures warm along the coasts of Ecuador and northern Peru.
Local residents referred to this seasonal warming as "El Niño",
meaning The Christ Child, due to its appearance around the Christmas season.
Every two to seven years a much stronger warming appears,
which is often accompanied by beneficial rainfall in the arid coastal regions of these two countries.
Over time the term "El Niño" began to be used in reference to these major
warm episodes."

"Wetter than normal conditions during warm episodes are observed along the west coast of tropical South America,
and at subtropical latitudes of North America (Gulf Coast) and South America (southern Brazil to central Argentina)."

"At times ocean surface temperatures in the equatorial Pacific are colder than normal.
These cold episodes,
sometimes referred to as "La Niña" episodes,
are characterized by lower than normal pressure over Indonesia and northern Australia
and higher than normal pressure over the eastern tropical Pacific.
This pressure pattern is associated with enhanced near-surface equatorial easterly winds over the central and eastern equatorial Pacific."

"Drier than normal conditions during cold episodes are observed along the west coast of tropical South America,
and at subtropical latitudes of North America (Gulf Coast) and South America (southern Brazil to central Argentina)
during their respective winter seasons."

"During La Niña, rainfall and thunderstorm activity diminishes over the central equatorial Pacific,
and becomes confined to Indonesia and the western Pacific.
The area experiencing a reduction in rainfall generally coincides quite well with the area of abnormally cold ocean surface temperatures.
This overall pattern of rainfall departures spans nearly one-half the way around the globe,
and is responsible for many of the global weather impacts caused by La Niña."

"In the left-hand panel you can see the seasonal rainfall totals over the Pacific Ocean, the United States,
and South America during January-March 1989 when strong La Niña conditions were present.
The heaviest rainfall is shown by the darker green and blue colors, and lowest rainfall is shown by the lighter green colors.
The rainfall totals are shown in units of millimeters (mm).
Since 25.4 mm is equal to 1 inch of rain,
we see that the rainfall totals are more than 800 mm over the western tropical Pacific and Indonesia,
which is more than 31½ inches of rain."

"In the right-hand panel you can see the January-March 1989 seasonal rainfall departures from average for strong La Niña conditions.
The areas where the rainfall is well above average are shown by darker green colors,
and the areas where the rainfall is most below average are shown by the darker brown and yellow colors.
These rainfall departures are shown in units of 100 millimeters.
We see that rainfall totals were more than 200-400 mm above normal over the western tropical Pacific and Indonesia during the season,
which is roughly 8-16 inches above normal!
We also see well below-average rainfall across the central tropical Pacific,
where totals in some areas were more than 400 mm (15¾ inches) below normal."
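The mm-to-inch arithmetic in these panel descriptions can be checked directly:

```python
# Checking the unit conversions quoted above: 25.4 mm equals exactly 1 inch.
MM_PER_INCH = 25.4

def mm_to_inches(mm):
    return mm / MM_PER_INCH

print(round(mm_to_inches(800), 1))   # → 31.5 (the "more than 31½ inches")
print(round(mm_to_inches(400), 2))   # → 15.75 (the "15¾ inches")
```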

La Niña:

El Niño:

"In the left-hand panel the seasonal rainfall totals during the strong El Niño conditions of January-March 1998
are shown for over the Pacific Ocean, the United States, and South America.
The heaviest rainfall [in units of millimeters (mm)] is shown by the darker green and blue colors,
and lowest rainfall is shown by the lighter green colors.
Since 25.4 mm is equal to one inch of rain, we see that the rainfall totals are more than 800 mm just south of the equator
along the International Date Line (indicated by the 180 label), which is more than 31½ inches of rain.
And nearly double the normal amount."

"In the right-hand panel the January-March 1998 seasonal rainfall departures from average are shown.
The areas with well above average rainfall are shown by darker green colors,
and the areas with well below-average rainfall are shown by the darker brown and yellow colors.
The rainfall departures are shown in units of 100 millimeters.
We see that the seasonal rainfall totals were more than 400 mm above normal just south of the equator
along the International Date Line (indicated by the 180 label), which is more than 15¾ inches above normal.
Considerable rainfall also occurred farther north (near 40°N) over the central and eastern North Pacific,
and across the western and southeastern United States.
These areas lie along the main wintertime storm track, which brings above-average rainfall to the western and southeastern United States."

"During El Niño, rainfall and thunderstorm activity diminishes over the western equatorial Pacific,
and increases over the eastern half of the tropical Pacific.
This area of increased rainfall occurs where the exceptionally warm ocean waters have reached about 28°C or 82°F.
This overall pattern of rainfall departures spans nearly one-half the distance around the globe,
and is responsible for many of the global weather impacts caused by El Niño."

"The climate models used by the IPCC for attribution studies and projections of future climate
cannot simulate the basic coupled ocean-atmosphere feedback in the tropical Pacific that underlies ENSO. It's called Bjerknes feedback.
It's the positive feedback relationship between the strength of the trade winds and the surface temperature gradient
(cooler in the east, warmer in the west) of the tropical Pacific.
Stronger trade winds yield a larger temperature gradient. And a larger temperature gradient yields stronger trade winds.
The two are interdependent, providing positive feedback to one another."

"Bjerknes feedback", very basically, means that the tropical Pacific and the atmosphere above it are coupled;
i.e., they are interdependent: a change in one causes a change in the other, and they provide positive feedback to one another.
The existence of this positive "Bjerknes feedback" suggests that El Niño and La Niña events will remain locked in one mode
until something interrupts the positive feedback.
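The locked-in behavior of such a positive feedback loop can be caricatured in a few lines. The coupling and damping coefficients below are invented; the point is only that a tiny nudge to the temperature gradient grows, and the gradient and trade winds settle into the same sign until something disturbs them:

```python
# Invented coefficients; the loop amplifies until damping or an external
# disturbance changes the balance.

def step(gradient, winds, couple=0.3, damp=0.2):
    """One iteration of the mutual reinforcement: each variable is nudged
    by the other (couple) and relaxed toward zero (damp)."""
    new_gradient = gradient + couple * winds - damp * gradient
    new_winds = winds + couple * gradient - damp * winds
    return new_gradient, new_winds

g, w = 0.01, 0.0             # a tiny initial push on the temperature gradient
for _ in range(50):
    g, w = step(g, w)

print(g > 0.01 and w > 0.0)  # → True: the anomaly grew and both share one sign
```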

"El Niño/Southern Oscillation (ENSO) is the most important coupled ocean-atmosphere phenomenon
to cause global climate variability on interannual time scales.
Here we attempt to monitor ENSO by basing the Multivariate ENSO Index (MEI) on the six main observed variables over the tropical Pacific.
These six variables are: sea-level pressure, zonal and meridional components of the surface wind, sea surface temperature,
surface air temperature, and total cloudiness fraction of the sky."

For MEI values before 1950 see
ESRL-PSD: Extended Multivariate ENSO Index,
a simplified MEI.ext index that extends the MEI record back to 1871,
based on Hadley Centre sea-level pressure and sea surface temperatures, but combined in a similar fashion as the current MEI.
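One common way to combine several standardized variables into a single index, as the MEI description suggests, is to take the leading principal component of their joint covariance. The sketch below uses synthetic stand-ins for the six variables and is not NOAA's actual MEI code:

```python
import numpy as np

# Synthetic stand-ins for the six MEI inputs (pressure, two wind components,
# SST, air temperature, cloudiness), all driven by one hidden common signal.

rng = np.random.default_rng(1)
enso_state = rng.standard_normal(120)                  # hidden signal, 120 months
loadings = np.array([-0.8, 0.7, 0.5, 0.9, 0.85, 0.6])  # each variable tracks it
data = np.outer(enso_state, loadings) + 0.3 * rng.standard_normal((120, 6))

z = (data - data.mean(axis=0)) / data.std(axis=0)      # standardize each variable
cov = z.T @ z / len(z)
eigvals, eigvecs = np.linalg.eigh(cov)                 # ascending eigenvalues
pc1 = z @ eigvecs[:, -1]                               # leading PC: the index
pc1 /= pc1.std()

# The leading PC should track the common signal (up to an arbitrary sign).
r = np.corrcoef(pc1, enso_state)[0, 1]
print(round(abs(r), 2))
```

The sign of a principal component is arbitrary, which is why the correlation is checked in absolute value; a real index would be sign-fixed so that warm events are positive.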

"A transition to ENSO-neutral is likely during late Northern Hemisphere spring or early summer 2016,
with a possible transition to La Niña conditions during the fall."
"Indicative of a strong El Niño, sea surface temperature (SST) anomalies
were in excess of 2°C across the east-central equatorial Pacific Ocean during January
(Fig. 1).
The Niño indices in the eastern Pacific declined, while Niño-3.4 and Niño-4 were nearly unchanged
(Fig. 2)".
See
ENSO Diagnostic Discussion - 11 February 2016.

"A strong El Niño is expected to gradually weaken through spring 2016,
and to transition to ENSO-neutral during late spring or early summer."
"A strong El Niño continued during December,
with well above-average sea surface temperatures (SSTs) across the central and eastern equatorial Pacific Ocean
(Fig. 1).
All weekly Niño indices decreased slightly from the previous month
(Fig. 2)".
See
ENSO Diagnostic Discussion - 14 January 2016.

"El Niño is expected to remain strong through the Northern Hemisphere winter 2015-16,
with a transition to ENSO-neutral anticipated during late spring or early summer 2016."
"A strong El Niño continued during November as indicated by well above-average sea surface temperatures (SSTs)
across the central and eastern equatorial Pacific Ocean
(Fig. 1).
The Niño-4, Niño-3.4 and Niño-3 indices rose to their highest levels so far during this event,
while the Niño-1+2 index remained approximately steady
(Fig. 2)".
See
ENSO Diagnostic Discussion - 10 December 2015.

"El Niño will likely peak during the Northern Hemisphere winter 2015-16,
with a transition to ENSO-neutral anticipated during the late spring or early summer 2016."
"A strong El Niño continued during October as indicated by well above-average sea surface temperatures (SSTs)
across the central and eastern equatorial Pacific Ocean
(Fig. 1).
Most Niño indices increased during the month, although the far eastern Niño-1+2 index decreased, accentuating the maximum in anomalous SST farther west
(Fig. 2)".
See
ENSO Diagnostic Discussion - 12 November 2015.

"There is an approximately 95% chance that El Niño will continue through Northern Hemisphere winter 2015-16,
gradually weakening through spring 2016."
"During September, sea surface temperature (SST) anomalies were well above average across the central and eastern Pacific Ocean
(Fig. 1).
The Niño indices generally increased, although the far western Niño-4 index was nearly unchanged
(Fig. 2)".
See
ENSO Diagnostic Discussion - 8 October 2015.

"There is an approximately 95% chance that El Niño will continue through Northern Hemisphere winter 2015-16,
gradually weakening through spring 2016."
"During August, sea surface temperature (SST) anomalies were near or greater than +2.0°C across the eastern half of the tropical Pacific
(Fig. 1).
SST anomalies increased in the Niño-3.4 and Niño-3 regions, were approximately unchanged in the Niño-4 region,
and decreased in the Niño-1+2 region
(Fig. 2)".
See
ENSO Diagnostic Discussion - 10 September 2015.

"There is a greater than 90% chance that El Niño will continue through Northern Hemisphere winter 2015-16,
and around an 85% chance it will last into early spring 2016."
"During July, sea surface temperature (SST) anomalies were near +1.0°C in the central equatorial Pacific Ocean,
and in excess of +2.0°C across the eastern Pacific
(Fig. 1).
SST anomalies increased in the Niño-3 and Niño-3.4 regions,
while the Niño-4 and Niño-1+2 indices decreased slightly during the month
(Fig. 2)".
See
ENSO Diagnostic Discussion - 13 August 2015.

"There is a greater than 90% chance that El Niño will continue through Northern Hemisphere winter 2015-16,
and around an 80% chance it will last into early spring 2016."
"During June, sea surface temperature (SST) anomalies exceeded +1.0°C across the central and eastern equatorial Pacific Ocean
(Fig. 1).
The largest SST anomaly increases occurred in the Niño-3 and Niño-3.4 regions,
while the Niño-4 and Niño-1+2 indices remained more constant through the month
(Fig. 2)".
See
ENSO Diagnostic Discussion - 9 July 2015.

"There is a greater than 90% chance that El Niño will continue through Northern Hemisphere fall 2015,
and around an 85% chance it will last through the 2015-16 winter."
"During May, sea surface temperature (SST) anomalies increased across the central and eastern equatorial Pacific Ocean
(Fig. 1 & Fig. 2).
All of the Niño indices were in excess of +1.0°C, with the largest anomalies in the eastern Pacific,
indicated by recent weekly values of +1.4°C in Niño-3 and +1.9°C in Niño-1+2
(Fig. 2)."
See
ENSO Diagnostic Discussion - 11 June 2015.

"There is an approximately 90% chance that El Niño will continue through Northern Hemisphere summer 2015,
and a greater than 80% chance it will last through 2015."
"By early May 2015,
weak to moderate El Niño conditions were reflected by above-average sea surface temperatures (SST) across the equatorial Pacific
(Fig. 1),
and by the corroborating tropical atmospheric response.
The latest weekly Niño indices were +1.2°C in the Niño-4 region, +1.0°C in the Niño-3.4 region,
and +1.2°C and +2.3°C in the Niño-3 and Niño-1+2 regions, respectively
(Fig. 2)."
See
ENSO Diagnostic Discussion - 14 May 2015.

"There is an approximately 70% chance that El Niño will continue through Northern Hemisphere summer 2015,
and a greater than 60% chance it will last through autumn."
"By the end of March 2015,
weak El Niño conditions were reflected by above-average sea surface temperatures (SST) across the equatorial Pacific
(Fig. 1),
and by the expected tropical atmospheric response.
The latest weekly Niño indices were +1.1°C in the Niño-4 region, +0.7°C in the Niño-3.4 region,
and +0.6°C and +1.4°C in the Niño-3 and Niño-1+2 regions, respectively
(Fig. 2)."
See
ENSO Diagnostic Discussion - 9 April 2015.

"There is an approximately 50-60% chance that El Niño conditions will continue through Northern Hemisphere summer 2015."
"During February 2015,
El Niño conditions were observed as the above-average sea surface temperatures (SST) across the western and central equatorial Pacific
(Fig. 1)
became weakly coupled to the tropical atmosphere.
The latest weekly Niño indices were +0.6°C in the Niño-3.4 region and +1.2°C in the Niño-4 region,
and near zero in the Niño-3 and Niño-1+2 regions
(Fig. 2)."
See
ENSO Diagnostic Discussion - 5 March 2015.

"There is an approximately 50-60% chance of El Niño within the late Northern Hemisphere winter and early spring,
with ENSO-neutral slightly favored thereafter."
"Equatorial sea surface temperatures (SST) remained above average in the western and central Pacific during January 2015
and cooled across the eastern Pacific
(Fig. 1).
Accordingly, the latest weekly Niño indices were +0.5°C in the Niño-3.4 region and +0.9°C in the Niño-4 region,
and closer to zero in the Niño-3 and Niño-1+2 regions
(Fig. 2)."
See
ENSO Diagnostic Discussion - 5 February 2015.

"La Niña events are a vital portion of the El Niño-Southern Oscillation (ENSO) coupled ocean-atmosphere process.
La Niña events recharge the heat that the tropical Pacific released during the preceding El Niño."

"Note that most La Niña events do not fully recharge the heat released by the El Niño events."

"Contrary to the beliefs of anthropogenic warming proponents
the 1997/98 El Niño was NOT fueled by a long-term accumulation of heat from manmade greenhouse gases.
The 1997/98 El Niño was strong enough to temporarily raise Global Lower Troposphere Temperature anomalies ~0.7°C."

"The La Niña event of 1973/74/75/76 provided the tropical Pacific Ocean Heat Content
necessary for the increase in strength and frequency of El Niño events from 1976 to 1995.
The 1995/96 La Niña furnished the Ocean Heat Content that served as fuel for the 1997/98 El Niño.
And the 1998/99/00/01 La Niña recharged the tropical Pacific Ocean Heat Content after the 1997/98 El Niño,
returning it to the new higher level established by the La Niña of 1995/96."

"Global SST anomalies rose and fell over the past 100 years in response to the dominant ENSO phase;
that is, Global SST anomalies rose over multidecadal periods when and because El Niño events prevailed
and they fell over multidecadal periods when and because La Niña events dominated."

"The oceans outside of the central and eastern tropical Pacific integrate the impacts of ENSO,
and it would only require the oceans to accumulate 6% of the annual ENSO signal
in order to explain most of the rise in global SST anomalies since 1910."
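The 6% bookkeeping quoted above is just a running sum, sketched here with invented annual index values:

```python
import numpy as np

# A toy version of the running-total idea: each year the ocean retains a small
# fraction (here 6%) of that year's ENSO index value, so the running sum drifts
# up when El Niño years dominate and down when La Niña years dominate.

def accumulate(index, fraction=0.06):
    """Running sum of `fraction` of each annual ENSO index value."""
    return np.cumsum(fraction * np.asarray(index))

enso_index = [1.5, -0.5, 2.0, 0.5, -1.0, 1.0, -0.3, 1.8]  # made-up annual values
sst_like = accumulate(enso_index)

print(round(float(sst_like[-1]), 2))   # → 0.3 (6% of the summed index, +5.0)
```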

According to Dr. Roy Spencer, "Global warming" refers
to the global-average temperature increase that has been observed over the last one hundred years or more.
But to many politicians and the public, the term carries the implication that mankind is responsible for that warming.
This website
describes evidence from my group's government-funded research that suggests global warming is mostly natural,
and that the climate system is quite insensitive to humanity's greenhouse gas emissions and aerosol pollution.

How atmospheric processes like clouds and precipitation systems respond to warming is critical,
as they are either amplifying the warming, or reducing it.
This website
currently concentrates on the response of clouds to warming,
an issue which I am now convinced the scientific community has totally misinterpreted when they have measured natural,
year-to-year fluctuations in the climate system.
As a result of that confusion, they have the mistaken belief that climate sensitivity is high,
when in fact the satellite evidence suggests climate sensitivity is low.

Global Warming as a Natural Response to Cloud Changes Associated with the Pacific Decadal Oscillation (PDO):

According to Dr. Roy Spencer, "most climate change might well be the result of .... the climate system itself!"
Small, chaotic fluctuations in atmospheric and oceanic circulation systems can cause small changes in global average cloudiness,
and that alone is enough to cause climate change.

The less you know about how the climate system works, the more fragile the climate system looks to you.
If you simply assert that there are no natural causes of climate change,
you will conclude that our climate system is precariously balanced on a knife edge.

A mostly-natural source of global warming is also consistent with mounting observational evidence
that the climate system is much less sensitive to carbon dioxide emissions than the IPCC's climate models simulate.

"I've usually accepted the premise that increasing atmospheric carbon dioxide concentrations
are due to the burning of fossil fuels by humans.
After all, human emissions average around twice that which is needed to explain the observed rate of increase in the atmosphere.
In other words, mankind emits more than enough CO2 to explain the observed increase in the atmosphere."

"Furthermore, the ratio of the C13 isotope of carbon to the normal C12 form in atmospheric CO2
has been observed to be decreasing at the same time CO2 has been increasing.
Since CO2 produced by fossil fuel burning is depleted in C13 (so the argument goes)
this also suggests a manmade source."

"But when we start examining the details, an anthropogenic explanation for increasing atmospheric CO2 becomes less obvious."

"For example, a decrease in the relative amount of C13 in the atmosphere is also consistent with other biological sources.
And since most of the cycling of CO2 between the ocean, land, and atmosphere is due to biological processes,
this alone does not make a decreasing C13/C12 ratio a unique marker of an anthropogenic source."
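The isotope argument turns on a simple mass balance: adding CO2 that is depleted in C13 lowers the atmospheric C13/C12 ratio. A toy calculation with illustrative delta-13C values (in per mil; the numbers are assumptions, and as the text notes, any sufficiently depleted source would shift the ratio the same way):

```python
# Illustrative delta-13C values in per mil; not a carbon-cycle model.

def mix_delta13c(atm_ppm, atm_delta, added_ppm, added_delta):
    """delta-13C of a mixture, weighted by CO2 amount (a good approximation
    for small isotope ratios)."""
    return (atm_ppm * atm_delta + added_ppm * added_delta) / (atm_ppm + added_ppm)

before = -7.0                                     # assumed atmospheric delta-13C
after = mix_delta13c(280.0, before, 40.0, -28.0)  # add 40 ppm of depleted CO2

print(after < before)   # → True: the added depleted CO2 lowers the ratio
```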

"Are Clouds Capable of Causing Temperature Changes?
At the heart of this debate is whether cloud changes,
through their ability to alter how much sunlight is allowed in to warm the Earth, can cause temperature change."

"The IPCC claim is that clouds will change in response to warming in ways which magnify that warming (positive cloud feedback),
but by an unknown amount. All of the 20+ climate models tracked by the IPCC exhibit weakly to strongly positive cloud feedbacks."

"But we claim (and have demonstrated) that causation in the opposite direction [cloud change => temperature change]
gives the illusion of positive cloud feedback, even if negative cloud feedback really exists.
Thus, any attempt to estimate feedback in the real climate system must also address this source of "contamination" of the feedback signal."

"It would be difficult for me to overstate the importance of this issue to global warming theory.
Sufficiently positive cloud feedback could cause a global warming Armageddon.
Sufficiently negative cloud feedback could more than cancel out any other positive feedbacks in the climate system,
and relegate manmade global warming to the realm of just an academic curiosity."

"Cloud feedback happens rapidly, in a matter of days to a few weeks at the very most,
due to the rapidity with which the atmosphere adjusts to a surface temperature change.
In this paper, we even showed evidence that the peak net radiative feedback
(from clouds + temperature + water vapor) occurs within a couple of days of peak temperature."

"I have more extensive evidence now that the lag is closer to zero days."

"In contrast, causation in the opposite direction (clouds forcing temperature change) involves a time lag of many months,
due to the time it takes for the immense thermal inertia of the ocean to allow a temperature response to a change in absorbed sunlight."
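The lag argument above suggests a simple diagnostic: cross-correlate a radiative-flux series with temperature at a range of lags. A sketch with synthetic data, in which the flux is constructed to lead temperature by six steps (forcing-like), so the correlation peaks at a nonzero lag rather than at lag zero (feedback-like):

```python
import numpy as np

# Synthetic series: the "flux" is constructed to LEAD the "temperature" by six
# steps, mimicking forcing; a feedback-like relationship would peak at lag 0.

rng = np.random.default_rng(3)
flux = rng.standard_normal(500)
lead = 6
temp = np.roll(flux, lead)   # temperature responds `lead` steps after the flux
temp[:lead] = 0.0            # discard the wrapped-around values

def lag_correlation(x, y, max_lag=12):
    """Correlation of x(t) with y(t + lag) for lag = 0 .. max_lag."""
    return [np.corrcoef(x[:-lag or None], y[lag:])[0, 1] for lag in range(max_lag + 1)]

corrs = lag_correlation(flux, temp)
print(int(np.argmax(corrs)))   # → 6: the flux change precedes the response
```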

"At the end of the day,
the dirty little secret is that there is still no way to test the IPCC climate models for their feedback behavior,
which means there is no way to know which (if any of them) is even close to being correct in its predictions for the future."

"The disconcerting conclusion is
that global warming-related policy decisions are being guided by models
which still have no way to be tested in their long-term predictions."

"Over the last quarter century, mainstream climate science has changed dramatically,
from a paradigm where climate changes naturally to one where climate forever remains the same unless humans meddle with it."

"The reasons for this paradigm shift are clearly not based on science.
Sure, you can always analyze some dataset in such a way that it gives the appearance of climate stasis (e.g. the hockey stick),
but there is plenty of published research over the last 50 years supporting the view that climate changes naturally,
and on all time scales... decadal, centennial, millennial, etc."

"The claim that the Medieval Warm Period or Little Ice Age were only regional in extent
is countered with considerable published evidence to the contrary.
Besides... why is it that the pundits who claim these historic events were only regional in extent
are the same people who place global significance on a U.S. drought or a heat wave in France? Hmmm?"

"No, the reasons for this paradigm shift are mostly political.
Scientists play along for a variety of reasons which would take a series of blog posts to cover."

"Chaos theory was originally developed by Ed Lorenz during early experiments with computerized weather prediction models,
the forerunners of today's climate models.
Lorenz found that, for example,
even tiny changes in the initial state of the atmosphere can completely change how weather patterns evolve in the coming weeks.
Chaos is what limits the predictability of weather to 10 days or so."
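Lorenz's sensitivity result is easy to reproduce in miniature: integrate the Lorenz-63 equations twice from initial states differing by one part in 10^8 and watch the separation grow to order one. A crude forward-Euler sketch, with step size and run length chosen for illustration rather than accuracy:

```python
import numpy as np

# Crude forward-Euler integration of the Lorenz-63 system with the classic
# parameters; fine for a qualitative demonstration of divergence.

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = state
    return state + dt * np.array([sigma * (y - x),
                                  x * (rho - z) - y,
                                  x * y - beta * z])

a = np.array([1.0, 1.0, 1.0])
b = a + np.array([1e-8, 0.0, 0.0])   # a "butterfly-sized" perturbation

max_sep = 0.0
for _ in range(3000):                # 30 model time units
    a = lorenz_step(a)
    b = lorenz_step(b)
    max_sep = max(max_sep, float(np.linalg.norm(a - b)))

print(max_sep > 1.0)                 # → True: the 1e-8 difference has exploded
```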

"Chaotic behavior is a characteristic of most nonlinear dynamical systems,
that is, systems which evolve over time and are governed by rather complex physical processes.
We usually think of chaos in the atmosphere operating on time scales of days to weeks."

"But the ocean is also a nonlinear dynamical system.
And it has time scales ranging from years up to hundreds or even thousands of years... time scales we associate with climate change."

"El Niño and La Niña can, for example, be thought of as a chaotic fluctuation in the climate system.
Like the famous butterfly-shaped Lorenz Attractor, El Niño and La Niña are the two wings of the butterfly,
and the climate system during Northern Hemisphere winter tends to alternate between El Niño and La Niña,
sometimes getting "stuck" in a multi-year pattern of more frequent El Niños or La Niñas."

"Now, while El Niño and La Niña are the best known (and most frequently occurring) ocean-based climate phenomena,
what other longer-term modes of climate variability might there be which are "unforced"?
By unforced, I mean they are not caused by some external forcing mechanism (like the sun),
but are just the natural results of how the system varies all by itself.
Well, we really don't know, partly because so little research is funded to study the problem."

"It is my belief that most climate variability and even climate change could simply be the result of chaos in the climate system.
But how would changing ocean and atmospheric circulation patterns cause "global warming"?"

"One potential mechanism is through the impact of those circulation changes on cloud formation."

"Clouds are the Earth's natural sunshade, and very small (but persistent) changes in cloud cover can cause either warming or cooling trends.
I know that scientists like Trenberth and Dessler like to claim that "clouds don't cause climate change"... well,
chaotic changes in ocean and atmospheric circulation patterns can change clouds, and so in that sense clouds act as an intermediary.
Of course clouds don't change all by themselves, which is how some people disingenuously characterize my position on this."
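The sunshade point can be put in back-of-envelope numbers: a small persistent albedo change maps onto an equilibrium temperature change through an assumed net feedback parameter. Both the 342 W/m² global-mean insolation and lambda = 2 W/m² per kelvin below are round illustrative values, not measurements:

```python
# Back-of-envelope numbers, not measurements.
S_OVER_4 = 342.0   # global-mean incoming solar at the top of the atmosphere, W/m^2

def delta_t_from_albedo(albedo_change, lam=2.0):
    """Equilibrium warming for a persistent drop in planetary albedo,
    given an assumed net feedback parameter lam (W/m^2 per kelvin)."""
    forcing = S_OVER_4 * albedo_change   # extra absorbed sunlight, W/m^2
    return forcing / lam

# A persistent 1% relative reduction of a 0.30 albedo (0.300 -> 0.297):
print(round(delta_t_from_albedo(0.003), 2))   # → 0.51 (kelvin of warming)
```

So, under these assumptions, a 1% persistent cloud-cover change is on the same order as the warming observed over the past century, which is why the question of what drives cloudiness matters.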

"Unfortunately, our long-term measurements of global cloud cover
are not yet good enough to determine with a high level of confidence just how much recent warming was caused by climate chaos.
Our experiments with a simple 1D energy budget model suggest
that more frequent El Niños since the late 1970s caused some of the warming we have seen (a position also taken by Bob Tisdale),
but just how much of the warming remains uncertain."

"Part of the El Niño warming seems to be through reduced cloud cover, which precedes peak warming by 7 to 9 months.
But it is also through a decrease in the rate at which the ocean mixes heat vertically.
Chaotic changes in ocean mixing alone can cause global warming or cooling,
even without any cloud changes, because most of the depth of the ocean is very cold
and only the near-surface is relatively warm.
If the ocean was vertically uniform in temperature, changes in ocean mixing would have little effect on climate."
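The mixing argument can be reduced to a two-box sketch: a surface layer pulled toward a warm "forced" temperature by sunlight and the air above, and toward a cold deep-ocean temperature by vertical mixing. All coefficients and temperatures below are invented; only the sign of the response matters:

```python
# Two-box sketch: all numbers are invented; only the sign of the response matters.

def equilibrium_surface_temp(t_deep=4.0, t_forced=16.0, restore=1.0, mixing=0.5):
    """Steady state of restore*(t_forced - T) + mixing*(t_deep - T) = 0:
    the surface layer is pulled toward t_forced by sunlight and the air above,
    and toward the cold t_deep by vertical mixing."""
    return (restore * t_forced + mixing * t_deep) / (restore + mixing)

normal  = equilibrium_surface_temp(mixing=0.5)
reduced = equilibrium_surface_temp(mixing=0.2)   # weaker mixing: warmer surface

print(round(normal, 2), round(reduced, 2))       # → 12.0 14.0
```

If the deep box had the same temperature as the forced temperature (a vertically uniform ocean), changing the mixing coefficient would change nothing, which is exactly the closing point of the quote above.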

"This issue of natural mechanisms of climate change is so important
it boggles my mind that the U.S. Government has had almost zero interest in funding it.
But I don't see how we will ever confidently determine just how much of recent warming is human-induced
without determining how much was natural."

"If, say, 50% of the warming in the last 50 to 100 years has been natural,
then this profoundly impacts our projections of human-caused warming in the future, slashing them by about 50%."

Climate Science: Is It Currently Designed To Answer Questions?:
Richard S. Lindzen, 29 Nov. 2008

"We have the new paradigm where simulation and programs have replaced theory and observation." - Richard Lindzen

When an issue becomes a vital part of a political agenda, as is the case with climate,
then the politically desired position becomes a goal rather than a consequence of scientific research.

Science is primarily a successful mode of inquiry rather than a source of authority.

It is my impression that by the end of the 60's scientists, themselves,
came to feel that the real basis for support was not gratitude
(and the associated trust that support would bring further benefit) but fear: fear of the Soviet Union, fear of cancer, etc.

However, between the perceptions of gratitude and fear as the basis for support lies a world of difference in incentive structure.
If one thinks the basis is gratitude, then one obviously will respond by contributions that will elicit more gratitude.
The perpetuation of fear, on the other hand, militates against solving problems.

However, the end of the cold war, by eliminating a large part of the fear-base, forced a reassessment of the situation.
Most thinking has been devoted to the emphasis of other sources of fear: competitiveness, health, resource depletion and the environment.

The 60's saw the first major postwar funding cuts for science in the US.
The budgetary pressures of the Vietnam War may have demanded savings someplace,
but the fact that science was regarded as, to some extent, dispensable, came as a shock to many scientists.
So did the massive increase in management structures and bureaucracy which took control of science out of the hands of working scientists.

Fear has several advantages over gratitude.
Gratitude is intrinsically limited, if only by the finite creative capacity of the scientific community.
Moreover, as pointed out by a colleague at MIT, appealing to people's gratitude and trust is usually less effective than pulling a gun.
In other words, fear can motivate greater generosity.

Science since the sixties has been characterized by the large programs that this generosity encourages.
Moreover, the fact that fear provides little incentive for scientists to do anything more than perpetuate problems,
significantly reduces the dependence of the scientific enterprise on unique skills and talents.

One result of the above appears to have been the deemphasis of theory because of its intrinsic difficulty and small scale,
the encouragement of simulation instead (with its call for large capital investment in computation),
and the encouragement of large programs unconstrained by specific goals.

In brief, we have the new paradigm where simulation and programs have replaced theory and observation,
where government largely determines the nature of scientific activity,
and where the primary role of professional societies is the lobbying of the government for special advantage.

This new paradigm for science and its dependence on fear-based support may not constitute corruption per se,
but it does serve to make the system particularly vulnerable to corruption.
Much of the remainder of this paper will illustrate the exploitation of this vulnerability in the area of climate research.
The situation is particularly acute for a small weak field like climatology.
As a field, it has traditionally been a subfield within such disciplines as meteorology, oceanography, geography, geochemistry, etc.
These fields, themselves, are small and immature.
At the same time, these fields can be trivially associated with natural disasters.
Finally, climate science has been targeted by a major political movement, environmentalism, as the focus of their efforts,
wherein the natural disasters of the earth system have come to be identified with man's activities
- engendering fear as well as an agenda for societal reform and control.

The temptation to politicize science is overwhelming and longstanding.
Public trust in science has always been high, and political organizations have long sought to improve their own credibility
by associating their goals with 'science' - even if this involves misrepresenting the science.

Given the above, it would not be surprising if working scientists would make special efforts to support the global warming hypothesis.
There is ample evidence that this is happening on a large scale.

Although the situation suggests overt dishonesty, it is entirely possible, in today's scientific environment,
that many scientists feel that it is the role of science
to vindicate the greenhouse paradigm for climate change as well as the credibility of models.

In the history of the global-warming movement,
no scientist is more revered than Roger Revelle of Scripps Institution of Oceanography,
Harvard University and University of California San Diego.
He was the co-author of the seminal 1957 paper that demonstrated that fossil fuels had increased carbon-dioxide levels in the air.
Under his leadership, the President's Science Advisory Committee Panel on Environmental Pollution in 1965
published the first authoritative U.S. government report in which carbon dioxide from fossil fuels was officially recognized
as a potential global problem.
He was the author of the influential 1982 Scientific American article that elevated global warming on to the public agenda.
For being "the grandfather of the greenhouse effect", as he put it,
he was awarded the National Medal of Science by the first President Bush.
Roger Revelle's most consequential act, however, may have come in his role as a teacher, during the 1960s at Harvard.
Dr. Revelle inspired a young student named Al Gore.
While Gore in the late 1980s was becoming a prominent politician, loudly warning of global-warming dangers,
Dr. Revelle was quietly warning against taking any drastic action.

In a July 14, 1988, letter to Congressman Jim Bates, he wrote that:
"Most scientists familiar with the subject are not yet willing to bet that the climate this year is the result of 'greenhouse warming'.
As you very well know, climate is highly variable from year to year,
and the causes of these variations are not at all well understood.
My own personal belief is that we should wait another 10 or 20 years to really be convinced
that the greenhouse is going to be important for human beings, in both positive and negative ways".
A few days later, he sent a similar letter to Senator Tim Wirth,
cautioning "... we should be careful not to arouse too much alarm until the rate and amount of warming becomes clearer".

Then in 1991, Dr. Revelle wrote an article for Cosmos, a scientific journal, with two illustrious colleagues,
Chauncey Starr, founding director of the Electric Power Research Institute and Fred Singer,
the first director of the U.S. Weather Satellite Service.
Entitled "What to do about greenhouse warming: Look before you leap",
the article argued that decades of research could be required for the consequences of increased carbon dioxide to be understood,
and laid out the harm that could come of acting recklessly:
"Drastic, precipitous and, especially, unilateral steps to delay the putative greenhouse impacts
can cost jobs and prosperity and increase the human costs of global poverty, without being effective.
Stringent controls enacted now would be economically devastating,
particularly for developing countries for whom reduced energy consumption
would mean slower rates of economic growth without being able to delay greatly the growth of greenhouse gases in the atmosphere.
Yale economist William Nordhaus,
one of the few who have been trying to deal quantitatively with the economics of the greenhouse effect,
has pointed out that '... those who argue for strong measures to slow greenhouse warming
have reached their conclusion without any discernible analysis of the costs and benefits ...'.
It would be prudent to complete the ongoing and recently expanded research so that we will know what we are doing before we act.
'Look before you leap' may still be good advice".

Three months after the Cosmos article appeared, Dr. Revelle died of a heart attack.

Dr. S. Fred Singer is Professor Emeritus at the University of Virginia and chairman of the Science & Environmental Policy Project (SEPP).
His specialty is atmospheric and space physics.
An expert in remote sensing and satellites,
he served as the founding director of the US Weather Satellite Service and, more recently,
as vice chair of the US National Advisory Committee on Oceans & Atmosphere.
In 2007, he founded the Nongovernmental International Panel on Climate Change (NIPCC),
providing an alternative scientific voice to the UN's IPCC (Intergovernmental Panel on Climate Change).
He edited the first NIPCC report:
Nature, Not Human Activity, Rules the Climate
(April 2008) and co-authored the 2009 NIPCC report:
Climate Change Reconsidered (2 June 2009).

The Nongovernmental International Panel on Climate Change (NIPCC) is what its name suggests:
an international panel of nongovernment scientists and scholars
who have come together to understand the causes and consequences of climate change.
Because we are not predisposed to believe climate change is caused by human greenhouse gas emissions,
we are able to look at evidence the Intergovernmental Panel on Climate Change (IPCC) ignores.
Because we do not work for any governments, we are not biased toward the assumption that greater government activity is necessary.

Whereas the reports of the United Nations' Intergovernmental Panel on Climate Change (IPCC) warn of a dangerous human effect on climate,
NIPCC concludes the human effect is likely to be small relative to natural variability,
and whatever small warming is likely to occur will produce benefits as well as costs.

The Summary for Policymakers released in September by the United Nations' Intergovernmental Panel on Climate Change
is filled with concessions that its past predictions were too extreme and that they were couched in misleading and unscientific language,
according to a team of scientists from the U.S. and Australia.

The NIPCC CCR-II Summary for Policymakers (September 25, 2013, .pdf)
is 24 pages long and was written in collaboration with the lead authors and approved by them.
Because it is aimed at a larger popular audience than the book,
it adds a discussion of the scientific method and the precautionary principle,
a brief summary and critical analysis of each of the IPCC's main lines of argument, and a brief set of recommendations for policymakers.
We also recommend you review the separate NIPCC CCR-II Executive Summary (October 17, 2013, .pdf, 5 pages).

Part two of NIPCC Climate Change Reconsidered II: Impacts, Adaptation, and Vulnerabilities - Biological Impacts
was released on March 31, 2014.

"Climate Change Reconsidered II: Biological Impacts describes thousands of peer-reviewed scientific journal articles that do not support,
and often flatly contradict, IPCC's pessimistic narrative of "death, injury, and disrupted livelihoods".
The impact of rising temperatures and higher atmospheric CO2 levels in the twentieth and early twenty-first centuries
has not been anything like what IPCC would have us believe, and its forecasts differ wildly from those sound science would suggest."

"How CO2 enrichment has affected global food production and biospheric productivity is a matter of fact, not opinion.
The evidence is overwhelming that it has and will continue to help plants thrive,
leading to greater biodiversity, shrinking deserts, expanded habitat for wildlife, and more food for a growing human population."

Increased levels of carbon dioxide (CO2) have helped boost green foliage across the world's arid regions over the past 30 years
through a process called CO2 fertilisation, according to CSIRO research.

Satellite data show the percentage change in foliage cover around the world from 1982 to 2010.

In findings based on satellite observations, CSIRO, in collaboration with the Australian National University (ANU),
found that this CO2 fertilisation correlated with an 11 per cent increase in foliage cover from 1982-2010
across parts of the arid areas studied in Australia, North America, the Middle East and Africa,
according to CSIRO research scientist, Dr. Randall Donohue.

"In Australia, our native vegetation is superbly adapted to surviving in arid environments and it consequently uses water very efficiently,"
Dr. Donohue said. "Australian vegetation seems quite sensitive to CO2 fertilisation."

The fertilisation effect occurs where elevated CO2 enables a leaf during photosynthesis,
the process by which green plants convert sunlight into sugar,
to extract more carbon from the air or lose less water to the air, or both.

While a CO2 effect on foliage response has long been speculated, until now it has been difficult to demonstrate, according to Dr. Donohue.

Our work was able to tease-out the CO2 fertilisation effect by using mathematical modelling
together with satellite data adjusted to take out the observed effects of other influences such as precipitation, air temperature,
the amount of light, and land-use changes.

If elevated CO2 causes the water use of individual leaves to drop,
plants in arid environments will respond by increasing their total numbers of leaves.
These changes in leaf cover can be detected by satellite,
particularly in deserts and savannas where the cover is less complete than in wet locations, according to Dr. Donohue.

From "Deserts 'greening' from rising CO2".
CSIRO, the Commonwealth Scientific and Industrial Research Organisation. Australia's national science agency. July 3, 2013.
At http://www.csiro.au/en/Portals/Media/Deserts-greening-from-rising-CO2.aspx. This page has since been removed from the CSIRO website.

It's amazing that minuscule bacteria can cause life-threatening diseases and infections
- and miraculous that tiny doses of vaccines and antibiotics can safeguard us against these deadly scourges.
It is equally incredible that, at the planetary level, carbon dioxide is a miracle molecule for plants
- and the "gas of life" for most living creatures on Earth.

In units of volume, CO2's concentration is typically presented as 400 parts per million (400 ppm).
Translated, that's just 0.04% of Earth's atmosphere.
Even atmospheric argon is 23 times more abundant: 9,300 ppm.
Moreover, the 400 ppm in 2013 is 120 ppm more than the 280 ppm CO2 level of 1800.
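The abundance figures above are simple unit conversions; as a sanity check, they can be reproduced with a few lines of Python (a minimal sketch using only the numbers quoted in the text):

```python
# Sanity check of the concentration figures quoted above.
# All inputs are the values stated in the text, not measured data.

co2_ppm = 400        # CO2 concentration around 2013, parts per million by volume
argon_ppm = 9_300    # atmospheric argon concentration
co2_1800_ppm = 280   # pre-industrial CO2 level cited for 1800

co2_percent = co2_ppm / 10_000     # 1% of the atmosphere = 10,000 ppm
argon_ratio = argon_ppm / co2_ppm  # how many times more abundant argon is
co2_rise = co2_ppm - co2_1800_ppm  # increase since 1800

print(f"CO2 share of the atmosphere: {co2_percent}%")          # 0.04%
print(f"Argon/CO2 abundance ratio: about {argon_ratio:.0f}x")  # about 23x
print(f"CO2 rise since 1800: {co2_rise} ppm")                  # 120 ppm
```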

Over the past two centuries, our planet finally began to emerge from the Little Ice Age
that had cooled the Earth and driven Viking settlers out of Greenland.
Warming oceans slowly released some of the CO2 stored in their waters.
Industrial Revolution factories and growing human populations burned more wood and fossil fuels, baked more bread, and brewed more beer,
adding still more CO2 to the atmosphere.
Much more of the miracle molecule came from volcanoes and sub-sea vents, forest fires, bio-fuels use, decaying plants and animals,
and "exhaust" from living, breathing animals and humans.

What a difference that extra 120 ppm has made for plants, and for animals and humans that depend on them.
The more CO2 there is in the atmosphere, the more it is absorbed by plants of every description - and the faster and better they grow,
even under adverse conditions like limited water, extremely hot air temperatures, or infestations of insects, weeds and other pests.
As trees, grasses, algae and crops grow more rapidly and become healthier and more robust,
animals and humans enjoy better nutrition on a planet that is greener and greener.

One of the worst things that could happen to our planet and its people, animals, and plants,
would be for CO2 levels to plunge back to levels last seen before the Industrial Revolution.
Decreasing CO2 levels would be especially problematical if Earth cools, in response to the Sun entering another "quiet phase",
as happened during the Little Ice Age.
If Earth cools again, growing seasons would shorten and arable cropland would decrease in the northern temperate zones.
We would then need every possible molecule of CO2 - just to keep agricultural production high enough to stave off mass human starvation ...
and save wildlife habitats from being plowed under to replace that lost cropland.

(1) The Recovery from the Little Ice Age (A Possible Cause of Global Warming)

(2) The Multi-decadal Oscillation (The Recent Halting of the Warming)

Two natural components of the currently progressing climate change are identified.
The first one is an almost linear global temperature increase of about 0.5°C/100 years,
which seems to have started in 1800-1850,
at least one hundred years before 1946 when manmade CO2 in the atmosphere began to increase rapidly.
This 150~200-year-long linear warming trend is likely to be a natural change.
One possible cause of this linear increase may be the earth's continuing recovery from the Little Ice Age (1400~1800);
the recovery began in 1800~1850.
This trend (0.5°C/100 years) should be subtracted from the temperature data during the last 100 years
when estimating the manmade contribution to the present global warming trend.
As a result,
there is a possibility that only a small fraction of the present warming trend
is attributable to the greenhouse effect resulting from human activities.

It is also shown that various cryosphere phenomena,
including glaciers in many places in the world and sea ice in the Arctic Ocean that had developed during the Little Ice Age,
began to recede after 1800 and are still receding; their recession is thus not a recent phenomenon.

The second component is an oscillatory (positive/negative) change, which is superposed on the linear change.
One of them is the multi-decadal oscillation [PDO], which is a natural change.
This particular natural change had a positive rate of change of about 0.15°C/10 years from about 1975
(positive from 1910 to 1940, negative from 1940 to 1975),
and is thought by the IPCC to be a sure sign of the greenhouse effect of CO2.
However, the positive trend from 1975 has stopped after 2000.
One possible cause of the halting is that, after reaching a peak in 2000,
the multi-decadal oscillation has begun to overwhelm the linear increase,
causing the IPCC prediction to fail as early as the first decade of the 21st century.

There is an urgent need to correctly identify natural changes and remove them from the present global warming/cooling trend,
in order to accurately and correctly identify the contribution of the manmade greenhouse effect.
Only then can the effects of CO2 be studied quantitatively.
Arctic research should be able to contribute greatly to this endeavor.

A number of published papers and openly available data on sea level changes, glacier retreat, freezing/break-up dates of rivers,
sea ice retreat, tree-ring observations, ice cores and changes of the cosmic-ray intensity, from the year 1000 to the present,
are studied to examine how the Earth has recovered from the Little Ice Age (LIA).
We learn that the recovery from the LIA has proceeded continuously, roughly in a linear manner, from 1800-1850 to the present.
The rate of the recovery in terms of temperature is about 0.5°C/100 years
and thus it has important implications for understanding the present global warming.
It is suggested, on the basis of this much longer period of coverage, that the Earth is still in the process of recovery from the LIA;
there is no sign to indicate the end of the recovery before 1900.
Cosmic-ray intensity data show that solar activity was related to both the LIA and its recovery.
The multi-decadal oscillation of a period of 50 to 60 years was superposed on the linear change;
it peaked in 1940 and 2000, causing the halting of warming temporarily after 2000.
These changes are natural changes, and in order to determine the contribution of the manmade greenhouse effect,
there is an urgent need to identify them correctly and accurately and remove them.

The rise in global average temperature over the last century has halted since roughly the year 2000,
despite the fact that the release of CO2 into the atmosphere is still increasing.
It is suggested here that this interruption has been caused by the suspension of the near linear (+0.5°C/100 years or 0.05°C/10 years)
temperature increase over the last two centuries, due to recovery from the Little Ice Age,
and by a superposed multi-decadal oscillation of a 0.2°C amplitude and a 50~60 year period,
which reached its positive peak in about the year 2000 - a halting similar to those that occurred around 1880 and 1940.
Because both the near linear change and the multi-decadal oscillation are likely to be natural changes
(the recovery from the Little Ice Age (LIA) and an oscillation related to the Pacific Decadal Oscillation (PDO), respectively),
they must be carefully subtracted from temperature data before estimating the effects of CO2.
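Akasofu's two-component decomposition can be sketched numerically. The toy model below combines the ~0.5°C/century linear recovery with a 0.2°C-amplitude, ~60-year oscillation peaking in 1940 and 2000, as described above; the 1800 baseline year and the cosine form of the oscillation are illustrative assumptions, and nothing here is fitted to any real temperature series.

```python
import math

# Toy sketch of the two-component natural change described in the text:
# a linear Little Ice Age recovery plus a multi-decadal oscillation.
# Parameters are the ones quoted above; this is illustrative only.

LINEAR_RATE = 0.5 / 100  # deg C per year (0.5 C per century)
AMPLITUDE = 0.2          # deg C, amplitude of the multi-decadal oscillation
PERIOD = 60.0            # years (the text says 50-60)
PEAK_YEAR = 2000         # a positive peak; 1940 is also a peak with a 60-yr period

def natural_component(year, reference_year=1800):
    """Linear LIA-recovery trend plus the superposed oscillation,
    in deg C relative to the (arbitrary) 1800 baseline."""
    linear = LINEAR_RATE * (year - reference_year)
    oscillation = AMPLITUDE * math.cos(2 * math.pi * (year - PEAK_YEAR) / PERIOD)
    return linear + oscillation

# Around the 2000 peak the oscillation starts subtracting from the linear
# trend, producing the "halting" the text describes:
for year in (1970, 2000, 2015, 2030):
    print(year, round(natural_component(year), 3))
```

With these parameters the modeled natural change flattens after 2000, mimicking the halting that the text attributes to the oscillation overwhelming the linear recovery.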

Figure 5 shows the above findings in graphic form and represents an improved version of Figure 9 of Akasofu (2010).
The large rectangular box shaded in yellow shows temperature changes from 1860 to 2010 (standard data),
together with a straight black line showing the 0.5°C/100 year rate of increase and the multi-decadal oscillation shown in red and blue,
above and below the line, respectively.
The dotted line before 1860 indicates that the linear line may be extended back to about 1800,
assuming that the LIA indeed began to recover from about 1800.
An insert above the yellow box is a detailed version of data shown in yellow.
The HadCRUT4 data are discussed by Morice et al (2012).

Figure 5. An interpretation of changes in global average temperature from 1800 to 2012.
The temperature in the vertical axis is for reference scale; for detail, see the text.
Figure 5 also shows a detailed version of data shown in the yellow box.

In the yellow box, the change from 2000 to June of 2012 is emphasized by the thick blue line
to indicate the halting trend as an effect of the multi-decadal oscillation.
Above the yellow box is shown a detailed version of this data.
Based on the above synthesis, it may be suggested that the present halt to global warming
is due to the fact that multi-decadal oscillation has overwhelmed the prior, near linear (LIA recovery) increase.
Indeed, such a trend is similar to those after 1880 and 1940, when temperature actually decreased toward 1910 and 1970,
respectively (particularly in light of the fact that CO2 had begun to increase rapidly after 1946).
It may be noted, however, that Levitus et al. (2012) showed a continuous increase of the world ocean heat content after 2000,
although the rate of increase seems to decrease after 2004;
on the other hand, the result by Pielke (2008) does not seem to show such an increase after 2000.

It is clear from the above data set that the warming trend is halted and that there is an indication of even a slight cooling after 2000.

"Professor Murry Salby is Chair of Climate Science at Macquarie University.
He's been on visiting professorships at Paris, Stockholm, Jerusalem, and Kyoto,
and he's spent time at the Bureau of Meteorology in Australia."

"Over the last two years he has been looking at C12 and C13 ratios and CO2 levels around the world,
and has come to the conclusion that man-made emissions have only a small effect on global CO2 levels.
It's not just that man-made emissions don't control the climate, they don't even control global CO2 levels."

"Carbon dioxide is emitted by human activities as well as a host of natural processes.
The satellite record, in concert with instrumental observations,
is now long enough to have collected a population of climate perturbations,
wherein the Earth-atmosphere system was disturbed from equilibrium.
Introduced naturally, those perturbations reveal that net global emission of CO2 (combined from all sources, human and natural)
is controlled by properties of the general circulation
- properties internal to the climate system that regulate emission from natural sources.
The strong dependence on internal properties indicates that emission of CO2 from natural sources,
which accounts for 96 per cent of its overall emission, plays a major role in observed changes of CO2.
Independent of human emission, this contribution to atmospheric carbon dioxide is only marginally predictable and not controllable."

Salby's talk
was given on August 2, 2011, at the International Union of Geodesy and Geophysics meeting in Melbourne, Australia.
He indicates that a journal paper is in press, with an expectation of publication a few months out.

Murry Salby was sacked from Macquarie University, and Macquarie struggled to explain why, among other things,
it was necessary to abandon and strand him in Paris and to hold a "misconduct" meeting in his absence.

"Professor Robert (Bob) M. Carter
is an adjunct Research Fellow at James Cook University (Queensland).
He is a palaeontologist, stratigrapher, marine geologist and environmental scientist with more than 40 years professional experience,
and holds degrees from the University of Otago (New Zealand) and the University of Cambridge (England).
He has held tenured academic staff positions at the University of Otago (Dunedin) and James Cook University (Townsville),
where he was Professor and Head of School of Earth Sciences between 1981 and 1999."

"Climate change knows three realities.
Science reality, which is what working scientists deal with on a daily basis.
Virtual reality, which is the wholly imaginary world inside computer climate models.
And public reality, which is the socio-political system within which politicians, business people and the general citizenry work.
The science reality is that climate is a complex, dynamic, natural system that no one wholly comprehends,
though many scientists understand different small parts.
So far, and despite the very strong public concern,
science provides no unambiguous evidence that dangerous or even measurable human-caused global warming is occurring.
Second, the virtual reality is that computer models predict future climate according to the assumptions that are programmed into them.
There is no established Theory of Climate,
and therefore the potential output of all realistic computer general circulation models (GCMs)
encompasses a range of both future warmings and coolings,
the outcome depending upon the way in which a particular model run is constructed.
Different results can be produced at will simply by adjusting such poorly known parameters as the effects of cloud cover.
Third, public reality is that, driven by strong environmental lobby groups and evangelistic scientists and journalists,
to whom politicians in turn respond,
there was a widespread but erroneous belief in our society in 2009
that dangerous global warming is occurring and that it has human causation."

"The current public 'debate' on climate is not so much a debate as it is an incessant and shrill campaign
to scare the global citizenry into accepting dramatic changes in their way of life
in pursuit of the false god of preventing dangerous global warming."

"Global, cyclic, decadal, climate patterns can be traced over the past millennium in glacier fluctuations,
oxygen isotope ratios in ice cores, sea surface temperatures, and historic observations.
The recurring climate cycles clearly show that natural climatic warming and cooling have occurred many times,
long before increases in anthropogenic atmospheric CO2 levels.
The Medieval Warm Period and Little Ice Age are well known examples of such climate changes, but in addition,
at least 23 periods of climatic warming and cooling have occurred in the past 500 years.
Each period of warming or cooling lasted about 25-30 years (average 27 years).
Two cycles of global warming and two of global cooling have occurred during the past century,
and the global cooling that has occurred since 1998 is exactly in phase with the long term pattern.
Global cooling occurred from 1880 to ~1915; global warming occurred from ~1915 to ~1945; global cooling occurred from ~1945-1977;
global warming occurred from 1977 to 1998; and global cooling has occurred since 1998.
All of these global climate changes show exceptionally good correlation with solar variation since the Little Ice Age 400 years ago."

"The IPCC predicted global warming of 0.6°C (1°F) by 2011 and 1.2°C (2°F) by 2038,
whereas Easterbrook (2001) predicted the beginning of global cooling by 2007 (±3-5 yrs) and cooling of about 0.3-0.5°C until ~2035.
The predicted cooling seems to have already begun.
Recent measurements of global temperatures suggest a gradual cooling trend since 1998, and 2007-2008 brought sharp global cooling.
The cooling trend will likely continue as the sun enters a cycle of lower irradiance
and the Pacific Ocean changed from its warm mode to its cool mode."

"The real question now is not trying to reduce atmospheric CO2 as a means of stopping global warming,
but rather (1) how can we best prepare to cope with the 30 years of global cooling that is coming,
(2) how cold will it get, and (3) how can we cope with the cooling during a time of exponential population increase?"

The Sun and oceans undergo changes on regular and predictable time frames.
Temperatures likewise have exhibited changes that are cyclical.
Sir Gilbert Walker was generally recognized as the first to find large-scale oscillations in atmospheric variables.
As early as 1908, while on a mission to explain why the Indian monsoon sometimes failed,
he assembled global surface data and did a thorough correlation analysis.

On careful interpretation of statistical data, Walker and Bliss (1932) were able to identify three pressure oscillations:
1. A flip-flop on a big scale between the Pacific Ocean and the Indian Ocean, which he called the Southern Oscillation (SO).
2. A second oscillation, on a much smaller scale, between the Azores and Iceland, which he named the North Atlantic Oscillation.
3. A third oscillation between areas of high and low pressure in the North Pacific, which Walker called the North Pacific Oscillation.

Walker further asserted that the SO is the predominant oscillation, which had a tendency to persist for at least 1-2 seasons.
He went so far in 1924 as to suggest the SO index had global weather impacts and might be useful in predicting the world's weather.
He was ridiculed by the scientific community at the time for these statements.
Not until four decades later was the Southern Oscillation recognized as a coupled atmospheric-pressure and ocean-temperature phenomenon
(Bjerknes, 1969), and it took more than two decades longer before it was shown to have statistically significant global impacts and could be used
to predict global weather/climate, at times many seasons in advance. Walker was clearly a man ahead of his time.

Global temperatures, ocean-based teleconnections, and solar variances interrelate with each other.
A team of mathematicians (Tsonis et al., 2003, 2007), led by Dr. Anastasios Tsonis,
developed a model suggesting that known cycles of the Earth's oceans - the Pacific Decadal Oscillation, the North Atlantic Oscillation,
El Nino (Southern Oscillation), and the North Pacific Oscillation - all tend to synchronize with each other.
The theory is based on a branch of mathematics known as Synchronized Chaos.
The model predicts the degree of coupling to increase over time, causing the solution to "bifurcate", or split.
Then, the synchronization vanishes. The result is a climate shift.
Eventually the cycles begin to synchronize again, causing a repeating pattern of warming and cooling,
along with sudden changes in the frequency and strength of El Nino events.
They show how this explains the major climate shifts that have occurred, including those of 1913, 1942, and 1978.
The cycles may be in the process of synchronizing once again, with a likely impact on climate
very different from what has been observed over the last several decades.

12. Where Are We Headed During The Coming Century?

The cool phase of PDO is now entrenched.
We have shown how the two ocean oscillations drive climate shifts.
The PDO leads the way and its effect is later amplified by the AMO.
Each time this has occurred in the past century, global temperatures have remained cool for about 30 years.

No statistically significant global warming has taken place since 1998 (see UAH Satellite-Based Global Temperature Record),
and cooling has occurred during the past several years (Hanna and Cappelen, 2003).
A very likely reason for global cooling over the past decade is the switch of the Pacific Ocean from its warm mode
(where it has been from 1977 to 1998) to its cool mode in 1999.
Each time this has occurred in the past century, global temperatures have remained cool for about 30 years.
Thus, the current sea surface temperatures not only explain why we have had stasis or global cooling for the past 10 years,
but also should assure that cooler temperatures will continue for several more decades.
There will be brief bounces upwards with periodic El Ninos, as we saw in late 2009 and early 2010,
but they will give way to cooling as the favored La Nina state returns.
With a predominant La Nina tendency, the net result should be cooling.

12.1. Predictions Based on Past Climate Patterns

The past is the key to understanding the future.
Past warming and cooling cycles over the past 500 years were used by Easterbrook (2001, 2005, 2006a,b, 2007, 2008a,b,c)
to accurately predict the cooling phase that is now happening.
Establishment of cool Pacific sea surface temperatures since 1999 indicates that the cool phase will persist for the next several decades.
We can look to past natural climatic cycles as a basis for predicting future climate changes.
The climatic fluctuations over the past few hundred years suggest ~30-year climatic cycles of global warming and cooling,
on a general warming trend from the Little Ice Age cool period.
If the trend continues as it has for the past several centuries,
global temperatures for the coming century might look like those in Fig. 23.

The left side of Fig. 23 is the warming/cooling history of the past century.
The right side of the graph shows that we have entered a global cooling phase that fits the historic pattern very well.
The switch to the PDO cool mode virtually assures cooling global climate for several decades.

Three possible projections are shown in Fig. 24:
(1) moderate cooling (similar to the 1945-1977 cooling);
(2) deeper cooling (similar to the 1880-1915 cooling); or
(3) severe cooling (similar to the 1790-1820 cooling).

Only time will tell which of these will be the case, but at the moment,
the Sun is behaving very similarly to the way it did during the Dalton Minimum (sunspot cycles 4/5), which was a very cold time.
This is based on the similarity of sunspot cycle 23 to cycle 4 (which immediately preceded the Dalton Minimum).

As the global climate and solar variation reveal themselves in a way not seen in the past 200 years,
we will surely attain a much better understanding of what causes global warming and cooling. Time will tell.
If the climate continues its ocean cycle cooling and the Sun behaves in a manner not witnessed since 1800,
we can be sure that climate changes are dominated by the Sun and sea and that atmospheric CO2 has a very small role in climate changes.
If the same climatic patterns, cyclic warming and cooling, that occurred over the past 500 years continue,
we can expect several decades of moderate to severe global cooling.

Note:
No statistically significant global warming has taken place since 1998 (see the UAH Satellite-Based Global Temperature Record),
but the switch to a predominant La Niña period forecasted in Fig. 23 for ~2013, and the start of global cooling,
forecasted in Fig. 24 for ~2014, have not happened yet. [March 27, 2015]

The Intergovernmental Panel on Climate Change (IPCC):

"IPCC predicts rapid, exponential CO2 growth that is not occurring."

"The IPCC assume CO2 concentration will rise exponentially from today's 385 parts per million to reach 730 to 1,020 ppm,
central estimate 836 ppm, by 2100."
"However, for seven years, CO2 concentration has been rising in a straight line towards just 570 ppm by 2100."
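The contrast between the two growth assumptions can be sketched numerically. In the toy comparison below, the start year (2009) and the ~2 ppm/yr linear rate are illustrative assumptions consistent with the figures quoted above (385 ppm "today", ~570 ppm by 2100 on a straight line, 836 ppm as the IPCC central estimate); the exponential rate is simply back-solved to hit 836 ppm in 2100.

```python
# Sketch comparing the two CO2-growth assumptions contrasted above:
# a straight-line continuation of the recent rise versus an exponential
# path reaching the quoted IPCC central estimate of 836 ppm by 2100.
# Start year and linear rate are illustrative assumptions, not source data.

START_YEAR, START_PPM = 2009, 385
END_YEAR = 2100
LINEAR_RATE = 2.0  # ppm per year, roughly the recent observed rise

def linear_co2(year):
    """Straight-line extrapolation from the start year."""
    return START_PPM + LINEAR_RATE * (year - START_YEAR)

def exponential_co2(year, target_ppm=836):
    """Constant-percentage growth chosen to reach target_ppm in 2100."""
    annual_factor = (target_ppm / START_PPM) ** (1 / (END_YEAR - START_YEAR))
    return START_PPM * annual_factor ** (year - START_YEAR)

for year in (2009, 2050, 2100):
    print(year, round(linear_co2(year)), round(exponential_co2(year)))
```

The linear path lands near 570 ppm in 2100, matching the figure quoted in the text, while the exponential path ends at 836 ppm by construction.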

"Since 1980 global temperature has risen at only 2.7°F (1.5°C)/century, not 6°F (3.4°C) as IPCC predicts."

"Sea level rose just 8 inches (20 cm) in the 20th century, and has been rising since 1993 at a very modest 1 ft/century (30.5 cm/century)."

"The observed increase in global mean surface temperature over the industrial era
is less than 40% of that expected from observed increases in long-lived greenhouse gases together with
the best-estimate equilibrium climate sensitivity
given by the 2007 Assessment Report of the Intergovernmental Panel on Climate Change (IPCC)."

"Based on current model results, we predict:
An average rate of increase of global mean temperature during the next century of about 0.3°C per decade
(with an uncertainty range of 0.2-0.5°C per decade) assuming the IPCC Scenario A (Business-as-Usual) emissions of greenhouse gases."

"They predicted that if our emissions stayed the same, temperatures would rise by 0.3°C per decade,
and would be at the very least 0.2, and the most 0.5.
Even by the most generous rehash of the data, the highest rate they can find is 0.18°C per decade which is likely an overestimate,
and in any case, is below the very least estimate, despite the world's emissions of CO2 continuing ever higher."

IPCC 2012, Special Report on Managing the Risks of Extreme Events and Disasters to Advance Climate Change Adaptation (SREX).
Summary for Policymakers. (drafted 18 November 2011, published 29 March 2012)

Part D. Future Climate Extremes, Impacts, and Disaster Losses

"Projected changes in climate extremes under different emissions scenarios
generally do not strongly diverge in the coming two to three decades,
but these signals are relatively small compared to natural climate variability over this time frame.
Even the sign of projected changes in some climate extremes over this time frame is uncertain."

"The IPCC plays a very influential role in the world, and it is imperative that its operations be unimpeachable.
Yet the oversight mechanisms of the IPCC simply do not appear to be adequate to assure this."

"This report reviews the IPCC procedures in detail and points out a number of weaknesses.
Principally, the IPCC Bureau has a great deal of arbitrary power over the content and conclusions of the assessment reports.
It faces little restraint in the review process due to weaknesses in the current rules.
And the government delegates who comprise the plenary Panel provide what appears to be largely passive and ineffective oversight."

Ross R. McKitrick is Professor of Economics at the University of Guelph in Ontario, Canada.
He is a Senior Fellow of the Fraser Institute and a member of the Academic Advisory Council of The Global Warming Policy Foundation (GWPF).

The instrumental data is spatially and temporally inadequate.
Surface weather data is virtually non-existent and unevenly distributed for 85 percent of the world's surface.
There are virtually none for 70 percent of the oceans.
On land, there is virtually no data for the 19 percent that is mountains, the 20 percent desert, 20 percent boreal forest, 20 percent grasslands,
and 6 percent tropical rain forest.
In order to "fill in" the gaps, the Goddard Institute for Space Studies (GISS)
made the ridiculous claim that a single station temperature is representative of a region 1,200 km in radius.

Most surface stations are concentrated in eastern North America and Western Europe
and became the early evidence for human induced global warming.
IPCC advocates ignored, for a long time, the fact that these stations are most affected by the urban heat island effect (UHIE).

There is a consistent revision of the record to lower historic readings. This increases the gradient of supposed warming.
The other telltale sign is that virtually all adjustments occur before the UAH satellite temperature record began in 1979.
20th century temperature trends begin with warming from 1900 to 1940, cooling from 1940 to 1980,
warming from 1980 to 1998 and a slight cooling trend to 2014.
If we accept overall warming from 1900, which is reasonable as the Earth emerges from the Little Ice Age (LIA),
then the highest temperatures will occur in the most recent record.
Identifying that 2014 was fractionally warmer than any other in the record does not change the trend of the "pause".
It does not enhance the CO2 causation claim.

Since 2009, NASA's Operation IceBridge has flown over Greenland more than one hundred times with a wide variety of instruments,
including radar, and generated vast quantities of data, adding to the work from many other missions.
This has allowed researchers to generate a three dimensional map depicting the age of the ice throughout the Greenland Ice sheet.

This 3D age map shows that three distinct climate periods are evident within the ice sheet:
the Holocene (shown in green), the last ice age (blue), and the Eemian (red).

The top layers from the Holocene Period formed during the last 11.7 thousand years and are fairly flat and uniform,
though the thickness varies depending on how much snowfall occurred.

"Physical, mathematical and observational grounds are employed to show that
there is no physically meaningful global temperature for the Earth in the context of the issue of global warming.
While it is always possible to construct statistics for any given set of local temperature data,
an infinite range of such statistics is mathematically permissible
if physical principles provide no explicit basis for choosing among them.
Distinct and equally valid statistical rules can and do show opposite trends when applied to
the results of computations from physical models and real data in the atmosphere.
A given temperature field can be interpreted as both "warming" and "cooling" simultaneously,
making the concept of warming in the context of the issue of global warming physically ill-posed."

"There is no global temperature.
The reasons lie in the properties of the equation of state governing local thermodynamic equilibrium,
and the implications cannot be avoided by substituting statistics for physics."

The Berkeley Earth Land + Ocean Data
anomaly dataset shows little global average temperature increase since 1998.
It shows a warming from 1910 to 1940 of 0.45°C, then a pause to 1975, and a warming to 1998 of 0.55°C.
Note the upward steps caused by El Niño: The Pacific Climate Shift of 1976, the 1986/87/88 El Niño and the 1997/98 El Niño.
Then the 2009/10 El Niño and the 2014/15 strong El Niño.

Berkeley Earth Monthly Global Temperature Anomaly Index, 1850 to present.
Annual Average (black) with 95% uncertainty (grey) and Ten-Year Average (red).
[Berkeley Earth land values combined with interpolated HadSST ocean values]
[Above ice air temperatures used when and where sea ice is present]

The Berkeley Earth Surface Temperature Study has created a preliminary merged data set by combining 1.6 billion temperature reports
from 16 preexisting data archives.
Whenever possible, we have used raw data rather than previously homogenized or edited data.

The NOAA National Centers for Environmental Information (NCEI), formerly National Climatic Data Center (NCDC),
Annual Global Mean Surface Temperature Anomalies over Land & Ocean database
shows a 0.3°C cooling from 1880 to 1910, a 0.6°C warming to 1944, then a 0.4°C cooling to 1956.
Then it shows a 0.8°C warming from 1956 to 2005 and a 0.2°C cooling from 2005 to 2011.
Then a 0.25°C warming to 2015.
The warmest years shown are 1998, 2005, 2010 and 2014, all close to a 0.6°C anomaly. Then a 0.9°C peak anomaly in 2015.
The record minimum anomaly is close to -0.4°C in 1910.
Note the upward steps caused by El Niño: The Pacific Climate Shift of 1976, the 1986/87/88 El Niño and the 1997/98 El Niño.
Then the 2009/10 El Niño and the 2014/15 strong El Niño.

NCDC have introduced a new method for calculating state (but not national), temperatures in the USA.
The new method makes the past cooler, creating a false impression of present warming at the state level.
The national figures remain unaffected, because they were already being calculated under the new system,
which creates a similar false impression.

Climatologists have long been aware of the poor state of global surface temperature records
and considerable effort has been put into adjusting the raw data to correct known errors and biases.

These adjustments are not insignificant.
For example it has been noted that in the temperature series prepared by NOAA for the USA,
the adjusted data exhibits a much larger warming trend than the raw data.

It has also been noted that over the years changes to the data have often tended to cool the early part of the record
and to warm more recent years, increasing the apparent warming trend.

Although the reasons for the adjustments that are made to the raw data are understood in broad terms,
for many of the global temperature series the details are obscure
and it has proved difficult for outsiders to determine whether they are valid and applied consistently.

For all these reasons, the global surface temperature records have been the subject of considerable and ongoing controversy.

In order to try to provide some clarity on the scientific issues,
the Global Warming Policy Foundation has invited a panel of experts to investigate and report on these controversies.

The panel features experts in physics, climatology and statistics and will be chaired by Professor Terence Kealey,
the former vice-chancellor of the University of Buckingham.

The U.S. Historical Climatology Network (USHCN)
is a high-quality moderate sized data set of monthly averaged maximum, minimum, and mean temperature
and total monthly precipitation developed to assist in the detection of regional climate change.
The USHCN is comprised of 1,221 high-quality stations from the U.S. Cooperative Observing Network within the 48 contiguous United States.

Currently all data adjustments in the USHCN are based on the use of metadata.
However, station histories are often incomplete, or changes that can cause a time series discontinuity,
such as replacing a broken thermometer with one that is calibrated differently, are not routinely entered into station history files.
Because of this we are developing another step in the processing that will apply a time series discontinuity adjustment scheme.
This methodology does not use station histories and identifies discontinuities in a station's time series
using a homogeneous reference series developed from surrounding stations.

The USHCN adjustment procedures are applied in stepwise fashion, so that the effect of each adjustment is cumulative.
The data set containing the final adjustment procedure (urbanization adjustments) also contains all of the previous adjustments.

The cumulative effect of all adjustments is approximately a one-half degree Fahrenheit [0.3°C]
warming in the annual time series over a 50-year period from the 1940's until the last decade of the century.

Note that these 0.3°C amount to half of the 0.6°C warming since 1940 in the NCDC temperature time series.
This should be valid for all of the global temperature time series that share data, adjustments and homogenization methods with NOAA-NCDC:
CRU, GISS, JMA and BEST.

The HadCRUT4 time series from the Met Office, the UK's National Weather Service,
shows the combined global land and marine surface annual temperature record from 1850 to 2014.
It shows a slight cooling from 2003 to 2013, and also a cooling of -0.1°C from 1940 to 1975,
with a minimum anomaly of some -0.5°C in 1910, after some -0.3°C of cooling from 1878.
Note the upward steps caused by El Niño: The Pacific Climate Shift of 1976, the 1986/87/88 El Niño and the 1997/98 El Niño.
Then the 2014/15 strong El Niño.
Note that 1998 is tied with 2005, 2010, and 2014 as the warmest year in this time series.
The accuracy with which we can measure the global average temperature of 2010 is around 0.1°C.
(See HadCRUT4 FAQ)

Global surface air temperature anomalies (-0.8 to +0.8°C) from 1850 to 2014 (1961-90 mean)
It shows an increment in average temperature from 1910 to 1941 of some 0.5°C
It shows an increment in average temperature from 1975 to 2003 of some 0.6°C

Calculating the global mean as the mean of the northern and southern hemisphere averages
helps prevent the value becoming dominated by the Northern hemisphere, where there are more observations.
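
A toy example (with invented anomaly values, not real data) shows the difference between pooling all observations and averaging the hemispheres first:

```python
# Illustrative anomalies in degrees C; the NH has far more observations than the SH.
nh_anomalies = [0.52, 0.48, 0.55, 0.50, 0.51, 0.49]   # many NH grid cells
sh_anomalies = [0.30, 0.34]                            # few SH grid cells

# Naive pooling lets the data-rich Northern Hemisphere dominate:
pooled = sum(nh_anomalies + sh_anomalies) / len(nh_anomalies + sh_anomalies)

# Averaging the two hemispheric means first gives each hemisphere equal weight:
nh_mean = sum(nh_anomalies) / len(nh_anomalies)
sh_mean = sum(sh_anomalies) / len(sh_anomalies)
global_mean = (nh_mean + sh_mean) / 2.0

print(f"pooled: {pooled:.3f}  hemispheric: {global_mean:.3f}")
```

With these made-up numbers the pooled mean sits well above the hemispheric mean, which is exactly the bias the procedure is designed to avoid.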

The red bars show the global annual average near surface temperature anomalies from 1850 to 2014.
The error bars show the 95% uncertainty range on the annual averages.
The thick blue line shows the annual values after smoothing with a 21 point binomial filter.
The dashed portion of the smoothed line indicates where it is influenced by the treatment of the end points.
The thin blue lines show the 95% uncertainty on the smoothed curve.
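
The "21 point binomial filter" behind the thick blue line weights each year by binomial coefficients, normalized to sum to one. A minimal sketch of how such a filter can be built (my own illustration, not the Met Office code):

```python
from math import comb

# 21-point binomial filter: weights proportional to C(20, k), k = 0..20
n = 21
raw = [comb(n - 1, k) for k in range(n)]
total = sum(raw)                      # 2**20
weights = [w / total for w in raw]    # normalized to sum to 1

def smooth(series):
    """Smooth the interior points of an annual series with the 21-point filter."""
    half = n // 2
    return [
        sum(w * series[i + k - half] for k, w in enumerate(weights))
        for i in range(half, len(series) - half)
    ]

# Sanity check: a constant series passes through unchanged.
print(smooth([0.5] * 30))
```

The filter loses 10 points at each end of the record, which is why the smoothed curve is shown dashed near the endpoints, where some end-point treatment must be substituted.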

The HadCRUT4 time series from the Climatic Research Unit, University of East Anglia (UK)
shows the combined global land and marine surface annual temperature record from 1850 to 2015.
It shows a 0.19°C warming since 1998, after warming some 0.6°C since 1975.
Note the upward steps caused by El Niño: The Pacific Climate Shift of 1976, the 1986/87/88 El Niño and the 1997/98 El Niño.
Then the 2014/15 strong El Niño.

Global surface air temperature anomalies (-0.6 to +0.8°C) from 1850 to 2015 (1961-90 mean)
It shows an increment in temperature from 1910 to 1941 of some 0.5°C,
an anomaly of +0.56°C in 2014 (equal warmest on record since 1998),
and a peak anomaly of +0.75°C in 2015 (warmest on record).
It also shows a cooling of -0.1°C from 1941 to 1975.

The value for 2014 [0.56°C], given uncertainties discussed in Morice et al. (2012), is not distinguishable from the years 2010 (0.555°C),
2005 (0.543°C) and 1998 (0.535°C).
From Info sheet #1
(February 2015, Dr. Phil Jones, Climatic Research Unit, .pdf)

The NASA Goddard Institute for Space Studies
GISS Surface Temperature Analysis
monthly anomaly dataset shows little temperature increase in their Global Land-Ocean Temperature Index five-year Running Mean since 1998.
It shows an increase of about 0.4°C from 1910 to 1940, a pause to 1970, and an increase of about 0.6°C from 1970 to 2002.
Note the upward steps caused by El Niño: The Pacific Climate Shift of 1976, the 1986/87/88 El Niño, and the 1997/98 El Niño.
Then the 2014/15 strong El Niño.

Due to the 0.1°C measurement uncertainty, the years 1998, 2005, 2010 and 2014 are not distinguishable in temperature.

The annual mean anomalies of the Hadley Centre Central England Temperature (HadCET)
dataset show a decline of some 0.5°C from 2003 to 2012 (red line, 10-year running mean).
2006 was the warmest year on record for the minimum HadCET database.
The mean, minimum and maximum datasets are updated monthly.
These daily and monthly temperatures
are representative of a roughly triangular area of the United Kingdom enclosed by Lancashire, London and Bristol.
The monthly series, which begins in 1659, is the longest available instrumental record of temperature in the world.
The daily series begins in 1772.

This paper is, as intended, a work in progress: a compilation of what is current and important
about the data sets used for formulating and implementing unprecedented policy decisions
seeking a radical transformation of our society and institutions.

Recent revelations from the Climategate whistleblower emails,
originating from the Climatic Research Unit at the University of East Anglia,
followed by the candid admission by Phil Jones, the director of the CRU, in a BBC interview
that his "surface temperature data are in such disarray they probably cannot be verified or replicated",
certainly should raise questions about the quality of global data.

Just as the Medieval Warm Period was an obstacle to those trying to suggest that today's temperature is exceptional,
and the UN and its supporters tried to abolish it with the "hockey-stick" graph,
the warmer temperatures in the 1930s and 1940s were another inconvenient fact that needed to be "fixed".

In each of the databases, the land temperatures from that period were simply adjusted downward,
making it look as though the rate of warming in the 20th century was higher than it was,
and making it look as though today's temperatures were unprecedented in at least 150 years.

Climategate has sparked a flurry of examinations of the global datasets not only at CRU, NASA, and NOAA,
but in various countries throughout the world.
Though the Hadley Centre implied their data was in agreement with other datasets and was thus trustworthy,
the truth is that the other data centers and the individual countries involved
were forced to work with degraded data, and each appears to be involved in data manipulation.

Should you believe NOAA/NASA/HADLEY rankings for month and year? Definitively NO!
Climate change is real, there are cooling and warming periods that can be shown to correlate nicely with solar and ocean cycles.
You can trust in the data that shows there has been warming from 1979 to 1998, just as there was warming around 1920 to 1940.
But there has been cooling from 1940 to the late 1970s and since 2001.
It is the long term trend on which this cyclical pattern is superimposed that is exaggerated.

These factors all lead to significant uncertainty and a tendency for overestimation of century-scale temperature trends.
An obvious conclusion from all findings above and the case studies that follow
is that the global data bases are seriously flawed and can no longer be trusted to assess climate trends.
And, consequently, such surface data should not be used for decision making.

U.S. Temperature trends show a spurious doubling due to NOAA station siting problems and post measurement adjustments:

An area and distance weighted analysis
of the impacts of station exposure on the U.S. Historical Climatology Network temperatures and temperature trends

"A reanalysis of U.S. surface station temperatures has been performed using the recently WMO-approved Siting Classification System
devised by METEO-France's Michel Leroy.
The new siting classification more accurately characterizes the quality of the location
in terms of monitoring long-term spatially representative surface temperature trends.
The new analysis demonstrates that reported 1979-2008 U.S. temperature trends are spuriously doubled,
with 92% of that over-estimation resulting from erroneous NOAA adjustments of well-sited stations upward."

"The new improved assessment, for the years 1979 to 2008,
yields a trend of +0.155°C per decade from the high quality sites,
a +0.248°C per decade trend for poorly sited locations,
and a trend of +0.309°C per decade after NOAA adjusts the data."
"This issue of station siting quality
is expected to be an issue with respect to the monitoring of land surface temperature
throughout the Global Historical Climate Network and in the BEST network."

"The new rating method employed finds that station siting does indeed have a significant effect on temperature trends."

Comparison - All Rated Stations in the Continental U.S.
What the compliant thermometers (Class 1&2) say: +0.155°C/decade
What the non-compliant thermometers (Class 3,4,5) say: +0.248°C/decade
What the NOAA final adjusted data says: +0.309°C/decade
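
Per-decade trends such as these are slopes from an ordinary least-squares fit to the anomaly series. A minimal sketch with a synthetic series (the station data itself is not reproduced here):

```python
def trend_per_decade(years, anomalies):
    """Ordinary least-squares slope of anomaly vs. year, rescaled to degrees per decade."""
    n = len(years)
    ybar = sum(years) / n
    abar = sum(anomalies) / n
    slope = (sum((y - ybar) * (a - abar) for y, a in zip(years, anomalies))
             / sum((y - ybar) ** 2 for y in years))  # degrees per year
    return slope * 10.0

# Synthetic 1979-2008 series warming at exactly +0.155 C/decade:
years = list(range(1979, 2009))
anomalies = [0.0155 * (y - 1979) for y in years]
print(round(trend_per_decade(years, anomalies), 3))  # 0.155
```

The same fit applied to raw versus adjusted series is what produces the three different numbers quoted above.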

Class 1&2 (Compliant):
Heat sinks cover under 10% of the area within a 30-meter radius of the sensor,
under 1% within 5 meters, and under 5% of an annulus from 5 to 10 meters.

Class 3 (Non-Compliant):
Heat sinks cover over 10% of area within a 30-meter radius but under 10% within 10 meters and under 5% within 5 meters.

Class 4 (Non-Compliant):
Heat sinks cover from 10% to 50% of area within a 10-meter radius of sensor but under 30% within 3 meters.

Class 5 (Non-Compliant):
Heat sinks cover 50% or more of the area within a 10-meter radius of the sensor, or over 30% within 3 meters.

"Since 1979,
NOAA satellites have been carrying instruments which measure the natural microwave thermal emissions from oxygen in the atmosphere.
The intensity of the signals these microwave radiometers measure at different microwave frequencies
is directly proportional to the temperature of different, deep layers of the atmosphere.
Every month, John Christy and I update global temperature datasets that represent the piecing together of the temperature data
from a total of fourteen instruments flying on different satellites over the years."

"As of early 2011, our most stable instrument for this monitoring
was the Advanced Microwave Sounding Unit (AMSU-A) flying on NASA's Aqua satellite and providing data since late 2002."

"As of June 2013, the Advanced Microwave Sounding Unit (AMSU-A) flying on NASA's Aqua satellite has been removed from the processing
due to spurious warming and replaced by the average of the NOAA-15, NOAA-18, NOAA-19, and Metop-A AMSUs."

"Version 6 of the UAH MSU/AMSU global satellite temperature dataset is by far the most extensive revision of the procedures and computer code
we have ever produced in over 25 years of global temperature monitoring.
The two most significant changes from an end-user perspective are
(1) a decrease in the global-average lower tropospheric (LT) temperature trend from +0.140 C/decade to +0.114 C/decade
(Dec. '78 through Mar. '15);
and (2) the geographic distribution of the LT trends, including higher spatial resolution."
"In the early part of the record, Version 6 has somewhat faster warming than Version 5.6,
but then the latter part of the record has reduced (or even eliminated) warming,
producing results closer to the behavior of the RSS satellite dataset.
This is partly due to our new diurnal drift adjustment, especially for the NOAA-15 satellite.
Even though our approach to that adjustment is empirical, it is interesting to see that it gives similar results to the RSS approach,
which is based upon climate model calculations of the diurnal cycle in temperature."
From
Version 6.0 of the UAH Temperature Dataset Released: New LT Trend = +0.11 C/decade
(Roy W. Spencer, John R. Christy, and William D. Braswell. April 28th, 2015)

"The graphic shown below represents the latest update; updates are usually made within the first week of every month."
"Contrary to some reports, the satellite measurements are not calibrated in any way with
the global surface-based thermometer records of temperature.
They instead use their own on-board precision redundant platinum resistance thermometers (PRTs)
calibrated to a laboratory reference standard before launch."

The Version 6.0 global average lower tropospheric temperature (LT) anomaly for December, 2015 is +0.44 deg. C,
up from the November, 2015 value of +0.33 deg. C.

This makes 2015 the third warmest year globally (+0.27 deg. C) in the satellite record (since 1979),
behind 1998 (+0.48 deg. C) and 2010 (+0.34 deg. C).
Since 2016 should be warmer than 2015 with the current El Niño,
there is a good chance 2016 will end up as a record warm year...it all depends upon how quickly El Niño wanes later in the year.

"UAH v5.5 Global Temp. Update for September 2012: +0.34 deg. C."
"As discussed in my post from yesterday, the spurious warming in Aqua AMSU channel 5
has resulted in the need for revisions to the UAH global lower tropospheric temperature (LT) product."
"Rather than issuing an early release of Version 6, which has been in the works for about a year now, we decided to do something simpler:
remove Aqua AMSU after a certain date, and replace it with the average of NOAA-15 and NOAA-18 AMSU data.
Even though the two NOAA satellites have experienced diurnal drifts in their orbits,
we have found that those drifts are in opposite directions and approximately cancel. (The drifts will be corrected for in Version 6.0)."
"The new interim dataset, Version 5.5, has a September, 2012 global lower tropospheric temperature anomaly of +0.34 deg. C."
See
UAH V5.5 Global Temp. Update for September, 2012: +0.34 deg. C
(October 5th, 2012)
(Roy Spencer, Ph. D., Principal Research Scientist at the University of Alabama in Huntsville - UAH)

For those tracking our
daily updates of global temperatures at the Discover website,
remember that only 2 "channels" can be trusted for comparing different years to each other,
both being the only ones posted there from NASA's AQUA satellite:
1) only ch05 [14,000 ft/4.4 Km/600 mb] data should be used for tracking tropospheric temperatures,
2) the global-average "sea surface" temperatures are from AMSR-E on AQUA, and should be accurate.
["Channels" 5 and 9 allow comparing against the 1979-1998 average]

The temperature trend for RSS MSU lower tropospheric global mean from 1979 to 2002 was 1.46°C per century.
The temperature trend for RSS MSU lower tropospheric global mean from 2002 to 2014.92 was -0.59°C per century.
Note that global warming stopped in 2002 for the REMSS record, after peaking in 1998.

London, 15 March: A new report written by Dr. David Whitehouse and published today by the Global Warming Policy Foundation
concludes that there has been no statistically significant increase in annual global temperatures since 1997.

After reviewing the scientific literature
the report concludes that the standstill is an empirical fact and a reality that challenges current climate models.
During the time that the Earth's global temperature has remained static
the atmospheric composition of carbon dioxide has increased from 370 to 390 ppm.

"The standstill is a reality and is not the result of cherry-picking start and end points.
Its commencement can be seen clearly in the data, and it continues to this day", said Dr. David Whitehouse, the author of the new report.

The report shows that the temperature standstill has been a much discussed topic in peer-reviewed scientific literature for years,
but that this scientific debate has neither been followed by most of the media, nor acknowledged by climate campaigners,
scientific societies and prominent scientists.

The report also surveys how those few journalists who have looked at the issue have been reporting the standstill,
with many far too ready to dismiss it or lacking a sense of journalistic inquiry, preferring to report squabbles rather than the science.

"If the standstill continues for a few more years it will mean that no one who has just reached adulthood, or younger,
will have witnessed the Earth get warmer during their lifetime", said the report's author, Dr. David Whitehouse.

In his foreword, Lord Turnbull, former Cabinet Secretary and Head of the Home Civil Service, commented:

"Dr. Whitehouse is a man who deserves to be listened to.
He has consistently followed an approach of examining observations rather than projections of large scale computer models,
which are too often cited as 'evidence'.
He looks dispassionately at the data, trying to establish what message it tells us, rather than using it to confirm a pre-held view."

The term 'global' should mean that every region shows the same trend.
The term 'global temperature' should be well defined and the relevant data made public.

Much attention has been given to the possible effect of the increase of 'heat retention gases' in our atmosphere,
the atmospheric 'greenhouse effect', and to its possible cause.
A number of geologists and other Earth-science researchers have concluded that
the atmospheric 'greenhouse effect' is real and that its increase has to be man-made,
driven by CO2 in particular, a product of the recent industrial proliferation and its fossil-fuel energy demands.

But carbon dioxide is plant food: plants use it to produce their energy and store it in their bodies.
All animals produce it when they breathe,
and it is present in the interior of the Earth, surfacing during volcanic eruptions.
The main store of available CO2 is in the oceans, which give it off into the atmosphere when their temperatures rise.

Other Earth-scientists have concluded that because, long before today's industrial age,
there have been elevated levels of temperatures and 'heat retention gases' in our atmosphere repeating in a cyclical fashion,
the Earth must now be doing what it always has done; warming and then cooling,
the polar ice caps advancing and then retreating cyclically under the influence of the Sun, Earth's orbit and the ocean currents.
They point to the most recent global warming period, around the Middle Ages, when a warmer climate gave northern Europe
mild winters and very warm summers: the "Medieval Warm Period", from approximately A.D. 1000 to A.D. 1350.
Then the Earth's climate was in a cool period from A.D. 1400 to about A.D. 1860, named the "Little Ice Age".
The "Holocene Maximum" was the warmest period in human history, from some 7,500 to some 4,000 years ago.
Then the Earth cooled again till around year 1000.

Beginning about 18,000 years ago the Earth started warming up. Some 8,000 years ago the bridge between Asia and North America was submerged.
During the past 750,000 years of Earth's history, Ice Ages have occurred at regular intervals, of approximately 100,000 years each.

The Earth seems to be always oscillating between a cooling period and a warming period.
This would indicate that the climate could be an oscillator regulated by both negative and positive feedbacks, cycling between two states.

There is some man-made contribution to the overall CO2 abundance
(Mauna Loa: 316 ppm in 1960, 325 ppm in 1970, 338 ppm in 1980, 354 ppm in 1990, 370 ppm in 2000, 390 ppm in 2010).
But, according to the IPCC,
CO2 would be just a part of the alleged driving force in the warming phase of the climate oscillation;
most of it would come from H2O (clouds and water vapor). [Assuming positive net feedback]
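
The decade-to-decade increments in those Mauna Loa figures can be read off directly:

```python
# Mauna Loa annual mean CO2 in ppm, as quoted above
mauna_loa = {1960: 316, 1970: 325, 1980: 338, 1990: 354, 2000: 370, 2010: 390}

decades = sorted(mauna_loa)
increments = [mauna_loa[b] - mauna_loa[a] for a, b in zip(decades, decades[1:])]
print(increments)  # [9, 13, 16, 16, 20] ppm per decade
```

The per-decade increments have grown from 9 to 20 ppm over the period, though, as noted earlier, the recent rise has been close to a straight line rather than the exponential growth the IPCC scenarios assume.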

In general, researchers find strong seasonal CO2 fluctuations throughout the Northern Hemisphere
and weaker fluctuations near the equator and in the Southern Hemisphere.
As plants begin to photosynthesize in the spring and summer,
they consume CO2 from the atmosphere and use it as a carbon source for growth and reproduction.
This causes the decrease in CO2 levels that begins every year in May.
Once winter arrives, plants save energy by decreasing photosynthesis.
With less photosynthesis, the dominant process is the exhalation of CO2 by the total ecosystem,
including bacteria, plants, and animals.

The "greenhouse effect" would be crucial to the survival of life on Earth:
without it, our present global average temperature of some 15°C (59°F) would instead be some -18°C (-0.4°F).
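
That ~-18°C figure is the standard effective radiating temperature from the Stefan-Boltzmann law. A sketch using commonly quoted values for the solar constant and planetary albedo (these inputs are my assumptions, not from the text above):

```python
# Effective radiating temperature of an Earth with no greenhouse effect:
# T = (S0 * (1 - albedo) / (4 * sigma)) ** 0.25
S0 = 1361.0       # solar constant, W/m^2
albedo = 0.30     # planetary albedo (fraction of sunlight reflected)
sigma = 5.670e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

T_eff = (S0 * (1.0 - albedo) / (4.0 * sigma)) ** 0.25  # about 254.6 K
print(f"{T_eff - 273.15:.1f} C")  # roughly -18.6 C, the figure quoted above
```

The factor of 4 is the ratio between the Earth's cross-section intercepting sunlight and its full spherical surface radiating heat away.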

"When global warming is discussed, the warming effect of greenhouse gases is obviously of prime interest.
But it is seldom if ever mentioned that about 50% of the surface warming influence of greenhouse gases
has been short-circuited by the cooling effects of weather."

"Even if water vapor feedback is positive,
an increase in the solar shading effect of clouds (negative cloud feedback) could more than overwhelm the positive water vapor feedback,
leading to little net warming."

"While it seems rather obvious intuitively that a warmer world will have more atmospheric water vapor,
and thus positive water vapor feedback, I've listed the first 5 reasons why this might not be the case."

Dr. Ferenc Miskolczi, a former contract researcher for NASA's Langley Research Center,
discovered a self-regulating mechanism, or "constant", that keeps Earth's greenhouse gases in equilibrium.
According to his equilibrium theory,
this constant cannot be altered by increases in emissions of CO2 or other atmospheric gases such as methane.

"The only thing my theory is telling us is that the nature of the greenhouse effect is such,
that under the conditions we have here on Earth,
the atmosphere will maximize its cooling by keeping its infrared optical depth - or infrared absorption - at a preferred critical value."

"With relatively simple computations using NOAA's annual mean temperature, H20 and CO2 time series,
I have shown that in the last 61 years, despite a 30 percent increase in the atmospheric CO2 concentration,
the cumulative atmospheric absorption of all greenhouse gases has not been changed and has remained constant.
There is no runaway greenhouse effect."

"The Earth's atmosphere differs in essence from that of Venus and Mars.
Our atmosphere is not totally cloud-covered, as is Venus: globally, about 40% of the sky is always clear.
Also we have huge ocean surfaces that serve as a practically unlimited reservoir of water vapor for the air."

"With the help of these two conditions, the Earth's atmosphere attains what the other two planets cannot:
a constant, maximized, saturated greenhouse effect,
so that adding more greenhouse gases to the mix will not increase the magnitude of the greenhouse effect and, therefore,
will not cause any further "global warming"."

"If Miskolczi is correct that the amount of thermal radiation emitted by an object (or layer of the atmosphere)
ALWAYS equals the amount absorbed, this necessarily implies something that no one else I know of believes:
that INFRARED RADIATIVE FLOWS BETWEEN IR ABSORBERS AND EMITTERS CANNOT CHANGE THEIR TEMPERATURE."

The Thermostat Hypothesis: How clouds and thunderstorms control the Earth's temperature

The Thunderstorm Thermostat Hypothesis is that tropical clouds and thunderstorms actively regulate the temperature of the earth.
This keeps the earth at an equilibrium temperature regardless of changes in the forcings.

Several kinds of evidence are presented to establish and elucidate the Thermostat Hypothesis
- historical temperature stability of the Earth, theoretical considerations, satellite photos,
and a description of the equilibrium mechanism.

"In this report*, we present three global surface climate records,
created from available data by NASA Goddard Institute for Space Studies [GISS], NOAA National Climatic Data Center [NCDC],
and the cooperative project of the U.K. Hadley Centre and the Climatic Research Unit [CRU] of the University of East Anglia (HadCRUT2v)."
These three analyses are led by Tom Karl (NCDC), Jim Hansen (GISS) and Phil Jones (CRU).

"... Our global temperature series tallies with those of other, completely independent,
groups of scientists working for NASA and the National Climate Data Centre in the United States, among others.
Even if you were to ignore our findings, theirs show the same results.
The facts speak for themselves; there is no need for anyone to manipulate them."

The differences among the three global surface temperature records
result from the analysis methodology used by each of the three groups. They are not "completely independent".
"The data sets are distinguished from one another by differences in the details of their construction."
"Since the three chosen data sets utilize many of the same raw observations, there is a degree of interdependence."
"The best estimate that has been reported is that 90-95% of the raw data in each of the analyses is the same."

The 1997 Conference on the World Climate Research Programme,
reporting to the Third Conference of the Parties of the United Nations Framework Convention on Climate Change,
concluded that the global capacity to observe the Earth's climate system is inadequate and deteriorating worldwide:
"Without action to reverse this decline and develop the Global Climate Observation System,
the ability to characterize climate change and variations over the next 25 years will be even less
than during the past quarter century."

There has clearly been some warming in recent decades, most notably 1979 to 1998.
However, the surface-station-based global data are seriously compromised by major station dropout.
There has been a clear bias towards removing higher-elevation, higher-latitude and rural stations.
The data suffer contamination by urbanization and other local factors such as land-use/land-cover changes, and improper siting.
There are missing data and uncertainties in ocean temperatures. These factors all lead to an overestimation of warming.

"A simple graph by Canadian statistician, Ross McKitrick puts this in picture form.
His graph shows that when many stations were selectively and suddenly eliminated from world temperature records,
reported global temperature immediately and instantly appeared to step up alarmingly
to higher levels in the 1990s and 2000s."

"Globally, 12,000 to 14,000 stations during 1970-1989 were reduced to less than 8,000 in year 1991,
further to less than 6,000 in year 2000 and to 1,500 now and mainly located at airports.
Stations were relocated from previous sites in forests and rural areas to urban sites.
Measurements in cold Siberia were eliminated after the collapse of Soviet [Union].
Weather stations were moved from north to south, from high altitudes to low altitudes,
all giving higher temperatures."

"90% of stations give 1-2°C too high temperatures, i.e. more than IPCC claim for AGW (Anthropogenic Global Warming).
During 1950 to 1989 with 12-14,000 stations, average temperature is around 10.0°C and 1990 to 2000 temperature is 11-12°C,
average around 11.5°C thus an increase of 1.5°C.
90% of all air temperature measurements are taken over land,
while land covers only 30% of the planet and the oceans cover 70%."

"The supposed gold standard in surface temperature data is that produced by Univ. of East Anglia, the so-called CRUTem3 dataset.
There has always been a lingering suspicion among skeptics that some portion of this IPCC official temperature record
contains some level of residual spurious warming due to the urban heat island effect.
Several published papers over the years have supported that suspicion."

"The Urban Heat Island (UHI) effect is familiar to most people:
towns and cities are typically warmer than surrounding rural areas due to the replacement of natural vegetation with manmade structures.
If that effect increases over time at thermometer sites,
there will be a spurious warming component to regional or global temperature trends computed from the data."

"Here I will show based upon unadjusted International Surface Hourly (ISH) data archived at NCDC
that the warming trend over the Northern Hemisphere, where virtually all of the thermometer data exist,
is a function of population density at the thermometer site."

"Depending upon how low in population density one extends the results,
the level of spurious warming in the CRUTem3 dataset ranges from 14% to 30% when 3 population density classes are considered,
and even 60% with 5 population classes."

"I find the above results to be quite compelling evidence for what Anthony Watts, Pat Michaels, Ross McKitrick, et al.,
have been emphasizing for years: that poor thermometer siting has likely led to spurious warming trends,
which has then inflated the official IPCC estimates of warming.
These results are roughly consistent with the McKitrick and Michaels (2007) study
which suggested as much as 50% of the reported surface warming since 1980 could be spurious."

See also Urban Heat Island Effect
(Professor John Christy from the University of Alabama in Huntsville. Video 06:43 ClimateClips.com)

Surface temperature uncertainty, quantified:

Sensor measurement uncertainty has never been fully considered in prior appraisals of global average surface air temperature.
The estimated average ±0.2 C station error has been incorrectly assessed as random,
and the systematic error from uncontrolled variables has been invariably neglected.
The systematic errors in measurements from three ideally sited and maintained temperature sensors are calculated herein.
Combined with the ±0.2 C average station error,
a representative lower-limit uncertainty of ±0.46 C was found for any global annual surface air temperature anomaly.
This ±0.46 C reveals that the global surface air temperature anomaly trend from 1880 through 2000
is statistically indistinguishable from 0 C,
and represents a lower limit of calibration uncertainty for climate models
and for any prospective physically justifiable proxy reconstruction of paleo-temperature.
The rate and magnitude of 20th century warming are thus unknowable,
and suggestions of an unprecedented trend in 20th century global air temperature are unsustainable.
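The abstract's numbers are consistent with combining independent error terms in quadrature (root-sum-square). A minimal sketch, assuming that combination rule (the paper's own method may differ):

```python
import math

# Root-sum-square combination of independent error terms
# (an assumption for illustration; values taken from the abstract above).
station_error = 0.2       # +/-0.2 C average station error
total_uncertainty = 0.46  # +/-0.46 C reported lower-limit uncertainty

# The systematic component implied by a quadrature combination:
systematic = math.sqrt(total_uncertainty**2 - station_error**2)
print(f"implied systematic error: +/-{systematic:.2f} C")
```

An anomaly trend smaller than this combined uncertainty cannot be distinguished from zero, which is the abstract's central point.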

"In 1988 the scientist James Hansen of the National Aeronautics and Space Administration (NASA)
announced to Congress (USA) and the world, "Global warming has begun".
He went on to report that, at least to his satisfaction,
he had seen the "signal" in the climate noise and that the earth was destined for global warming,
perhaps in the form of a runaway greenhouse effect.
Hansen later revised his remarks,
but his statement remained the starting point of widespread concerns over global warming.
That same year the Intergovernmental Panel on Climate Change (IPCC) was formed as a joint program
of the United Nations Environmental Program, the World Meteorological Organization,
and the International Council of Scientific Unions.
It has a mandate to prepare regular assessments of what is known and what should be done about anthropogenic climate change."

Retired senior NASA atmospheric scientist Dr. John S. Theon, the former supervisor of James Hansen,
has now publicly declared himself a skeptic, saying that Hansen "embarrassed NASA".
He violated NASA's official agency position on climate forecasting
("we did not know enough to forecast climate change or mankind's effect on it").
Hansen thus embarrassed NASA by coming out with his claims of global warming in 1988 in his testimony before Congress.
[January 15, 2009]

"More than 1,000 dissenting scientists from around the globe have now challenged man-made global warming claims
made by the United Nations Intergovernmental Panel on Climate Change (IPCC) and former Vice President Al Gore."

"49 former NASA scientists and astronauts sent a letter to NASA Administrator Charles Bolden last week
admonishing the agency for its role in advocating a high degree of certainty
that man-made CO2 is a major cause of climate change while neglecting empirical evidence that calls the theory into question."

Just how good are climate models at predicting regional patterns of climate change?
I had occasion to survey this literature as part of a recently completed research project on the subject.
The simple summary is that, with few exceptions, climate models not only fail to do better than random numbers,
in some cases they are actually worse.

"The way the problem is customarily presented to the public is seriously misleading.
The public is led to believe that the carbon dioxide problem has a single cause and a single consequence.
The single cause is fossil fuel burning, the single consequence is global warming.
In reality there are multiple causes and multiple consequences.
The atmospheric carbon dioxide that drives global warming is only the tail of the dog.
The dog that wags the tail is the global ecology: forests, farms and swamps, as well as power-stations, factories and automobiles.
And the increase of carbon dioxide in the atmosphere has other consequences that may be at least as important as global warming
- increasing crop yields and growth of forests, for example.
To handle the problem intelligently, we need to understand all the causes and all the consequences."

"The models solve the equations of fluid dynamics,
and they do a very good job of describing the fluid motions of the atmosphere and the oceans.
They do a very poor job of describing the clouds, the dust, the chemistry and the biology of fields and farms and forests.
They do not begin to describe the real world that we live in."

"The dramatic and threatening environmental changes announced for the next decades
are the result of models whose main drive factor of climatic changes is the increasing carbon dioxide in the atmosphere.
Although taken as a premise, the hypothesis does not have verifiable consistence."

"CO2 changes are closely related to temperature.
Warmer seasons or triennial phases are followed by an atmosphere that is rich in CO2,
reflecting the gas solving or exsolving from water, and not photosynthesis activity."

"Monthly changes have no correspondence
as would be expected if the warming was an important absorption-radiation effect of the CO2 increase.
The anthropogenic wasting of fossil fuel CO2 to the atmosphere shows no relation with the temperature changes even in an annual basis.
The absence of immediate relation between CO2 and temperature is evidence that rising its mix ratio in the atmosphere
will not imply more absorption and time residence of energy over the Earth surface.
This is explained because band absorption is nearly all done with historic CO2 values.
Unlike CO2, water vapor in the atmosphere is rising in tune with temperature changes, even in a monthly scale.
The rising energy absorption of vapor is reducing the outcoming long wave radiation window and amplifying warming regionally
and in a different way around the globe."

"The main conclusion one arrives at the analysis is that CO2 has not a causal relation with global warming
and it is not powerful enough to cause the historical changes in temperature that were observed."

Vatican City, April 27, 2007 (Zenit.org).-
Scientists might not have human behavior to blame for global warming,
according to the president of the World Federation of Scientists.

Antonio Zichichi, who is also a retired professor of advanced physics at the University of Bologna,
made this assertion today in an address delivered to an international congress sponsored by the Pontifical Council for Justice and Peace.

The conference, which ends today, is examining "Climate Change and Development".

Zichichi pointed out that human activity has less than a 10% impact on the environment.

He also said that the models used by the Intergovernmental Panel on Climate Change (IPCC) are incoherent
and invalid from a scientific point of view.
The U.N. commission was founded in 1988 to evaluate the risk of climate change brought on by humans.

Zichichi, who is also member of the Pontifical Academy of Sciences,
showed that the mathematical models used by the IPCC do not correspond to the criteria of the scientific method.

He said that the IPCC used "the method of 'forcing' to arrive at their conclusions that human activity produces meteorological variations".

The physicist affirmed that on the basis of actual scientific fact
"it is not possible to exclude the idea that climate changes can be due to natural causes",
and that it is plausible that "man is not to blame".

To that end, Zichichi explained how the motor of meteorology depends on natural phenomena.
He gave as an example the
"energy sent by the Sun and volcanic activity that spits out lava and enormous quantities of substances in the atmosphere".

He also reminded those present that 500,000 years ago the Earth lost the North and South Poles four times.
The poles disappeared and reformed four times, he said.

Zichichi said that in the end he is not convinced that global warming is caused
by the increase of emissions of "greenhouse gases" produced through human activity.

Climate changes, he said, depend in a significant way on the fluctuation of cosmic rays.

Meteorology and Climate: Problems and Expectations
Problems with the Mathematical Modelling for "Climatic Changes"

In fact, the present mathematical models are far from being satisfactory.
The public at large wishes to know if it is true that human activities are creating a huge perturbation
of the climate characteristics of our globe.

To answer this question, the United Nations instituted a permanent committee composed of 2500 scientists from the world over,
the IPCC, which has been at work for the last few years and has led the public to believe - as said before -
that science has understood all about climate.
If that were true, then climatologically the destiny of our planet would be free of uncertainties and under the rigorous control of science.
But it's not this way.

When von Neumann, half a century ago, started it all, the mathematical models describing the climate were two-dimensional.
It was the brilliant collaborator of von Neumann, the very young Tsung Dao Lee, Fermi's favourite pupil and a Nobel Laureate,
who introduced the 'third dimension' in the mathematics of climate.
Without this third dimension, 'turbulence', the fundamental property of all models, could not exist.

The father of 'turbulence' participated in the Erice Seminars dedicated to the mathematical models used by the IPCC and found them wanting.
We're talking here of mathematical models whose results have consequences costing billions of dollars
and involve the responsibility of all the governments in the world.

It is necessary to bring these basic themes back to the scientific laboratories where they belong,
taking them away from the hands of those who use them to satisfy ambitions that have nothing to do with scientific truth.
The public at large wishes to know what conclusions, based on scientific rigour,
can result from the analysis of the measurements already taken.

"The popular vision of an approaching apocalypse caused by global warming has no scientific foundation, says Patrick J. Michaels.
Those who warn of a catastrophic greenhouse effect -- such as former Vice President Al Gore --
can justify neither their fears nor their blueprints for dramatically interfering with the U.S. and world economies.
Sound and Fury criticizes "science by sound and bite"
and congressional show trials complete with testimony that has not been peer-reviewed according to scientific standards.
Michaels shows that the slight warming over the last century has been far less than the prophets of the apocalypse would expect
-- throwing the reliability of their computer climate models into doubt --
that most of it happened before industry's massive carbon dioxide emissions began,
and that most of the warming is at night, when it produces benign effects such as longer growing seasons.
In other words, the warming that has resulted from natural climatic processes is beneficial."

"Patrick J. Michaels
is Distinguished Senior Fellow in the School of Public Policy at George Mason University
and senior fellow in environmental studies at the Cato Institute.
He is past president of the American Association of State Climatologists,
winner of the American Library Association's worldwide competition for public service writing,
and an author of the 2003 climate science "Paper of the Year", awarded by the Association of American Geographers."

NASA's Earth Observing System (EOS) Aqua Satellite was launched on May 4, 2002.
The Advanced Microwave Scanning Radiometer for EOS (AMSR-E) was one of the six sensors aboard Aqua.
AMSR-E was developed by the Japan Aerospace Exploration Agency (JAXA).
On October 4, 2011, AMSR-E ended 9+ Years of global observations due to mechanical failure.
See
AMSR-E Ends 9+ Years of Global Observations
(Roy W. Spencer, Ph. D., Principal Research Scientist at the University of Alabama in Huntsville - UAH).
See AMSR-E problems; maps not updated since Oct. 4 '11.

See IPCC and Antarctica
(Professor John R. Christy, The University of Alabama in Huntsville. Video 03:49 ClimateClips.com)

Daily AMSR2 sea ice maps:
The new satellite "Shizuku" (GCOM-W1) that carries AMSR2 (the successor of AMSR-E) has been launched successfully on May 18, 2012.
It has been delivering data since August 2012.
On Januray 25, 2013, the calibrated brightness temperature data have been released to the public.
Starting on 26 January 2013, we produce daily sea ice concentration maps from these data.
Note that thorough calibration of the AMSR2/ARTIST Sea Ice (ASI) data has not been finished yet.

Note that a maximum extent of Arctic sea ice was reached in 1979, and for the Antarctic in 2014.
The minimum extent of sea ice for the Arctic was in 2007 and for the Antarctic it was in 1993.

The Arctic sea ice approximately triples in size during a year, from summer to winter.
Arctic sea ice extent is directly dependent on winds and currents, not just on temperatures.

Daily updated sea ice extents from the Special Sensor Microwave Imager/Sounder (SSMIS)
on the Defense Meteorological Satellite Program (DMSP) satellites.
The graphs show time series for each hemisphere (solid blue line),
the 1981-2010 average (solid grey line),
and ±2 standard deviations (light gray area), which serve as an estimate of 95% of the expected range of natural variability.
The graphs also include a line for a selected earlier year, for comparison (dashed green line).
Total area, in millions of square kilometers, of at least 15% floating ice concentration:

Monthly Sea Ice Extent Anomaly Graphs, since 1979:
These graphs show monthly ice extent anomalies plotted as a time series of percent difference between the extent for the month in question
and the mean for that month based on the January 1981 to December 2010 data.
The anomaly data points are plotted as plus signs and the trend line is plotted with a dashed grey line, and its slope is calculated.
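The anomaly and trend computation described above can be sketched in a few lines of Python; the extents and baseline below are hypothetical, not NSIDC data:

```python
def monthly_anomaly_percent(extent, baseline_mean):
    """Percent difference between a month's extent and the 1981-2010
    mean for that month (the baseline the graphs use)."""
    return 100.0 * (extent - baseline_mean) / baseline_mean

def trend_slope(xs, ys):
    """Ordinary least-squares slope of the anomaly time series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

# Hypothetical September extents (million km^2) against a 6.5 baseline:
years = [2010, 2011, 2012, 2013]
extents = [4.9, 4.6, 3.6, 5.3]
anoms = [monthly_anomaly_percent(e, 6.5) for e in extents]
print([round(a, 1) for a in anoms])
print(f"trend: {trend_slope(years, anoms):.2f} % per year")
```

Each plus sign on the published graphs corresponds to one such anomaly value; the dashed grey trend line is the least-squares fit.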

Arctic sea ice extent measurements started in 1953, and extent has been decreasing since peaking in 1970.
Satellite measurements started in 1979.
The years 2007 and 2012 set new record lows, and the 2012 minimum dropped to 3.62 million square kilometers.
The minimum extents for 2013 through 2015 were higher than in 2012, but still below the long-term average.
See SOTC: Sea Ice (National Snow and Ice Data Center - NSIDC)

Sea ice forms when ocean water freezes. Because the oceans are salty, this occurs at about -1.8°C (28.8°F).
In contrast to fresh water, the salt in ocean water causes the density of the water to increase as it nears the freezing point,
so very cold ocean water tends to sink.
As a result, sea ice forms slowly compared to freshwater ice,
because salt water sinks away from the cold surface before it cools enough to freeze.
Because the oceans are so deep, it takes longer to reach the freezing point, and generally,
the top 100 to 150 meters (300 to 450 feet) of water must be cooled to the freezing temperature for ice to form.
See All About Sea Ice (National Snow and Ice Data Center - NSIDC)

The Arctic Sea Ice Volume Anomaly measurements have been recovering since the minimum of 2012 (see
graphic).

"The April 2015 volume was 24,200 km3, close to the April 2010 value.
The April 2015 volume was 26% below the maximum April ice volume in 1979 and 13% below the 1979-2014 mean,
and about 1 standard deviation above the long term trend.
The June 2015 volume was 18,500 km3, 900 km3 above the 2014 June value."

"September 2015 volume was 5,800 km3, 1,100 km3 below the 2014 value.
September volume was 65% below the maximum September ice volume in 1979,
49% below the 1979-2014 mean, and almost exactly on the long term trend line.
September 2015 had the 5th lowest ice volume on record, while it was the 4th lowest in extent.
Ice loss during the 2015 melt season was remarkable considering that ice volume through May of 2015
was still considerably larger than the corresponding 2014 values.
Volume loss over the 2015 melt season was the 3rd largest on record."

"Average ice thickness in September 2015 over the PIOMAS domain was just slightly below the 2014 value,
suggesting that the volume loss relative to 2014 came largely from areas of thinner ice."

The Antarctic sea ice approximately doubles in size during a year, from summer to winter, due to the sea ice that forms around the coasts.
During the winter months it becomes so cold that the sea surrounding Antarctica freezes for hundreds of km off-shore.
This ice breaks up to form pack-ice which, under the action of winds and currents, is constantly changing form and distribution.
About 98% of Antarctica is permanently covered by the Antarctic ice sheet, the largest single mass of ice on Earth,
a sheet of ice averaging at least 1.6 km (1.0 mi) thick.
It covers an area of about 14.6 million km2 and contains 25-30 million km3 of ice.
The continent has about 90% of the world's ice (and thereby about 70% of the world's fresh water).
In some places the ice is over 4 km deep.
The ice flows continuously from the high elevations to the sea, breaking off to form massive icebergs.
The amount of precipitation in Antarctica is so small that it is classed as a desert region (polar desert).

Global warming theory predicts that rising levels of CO2 will gradually warm the air and cause an increasing loss of sea ice.
As temperatures rise, ice nearer the equator should be the first to disappear,
and over the coming decades ice closer to the poles would be the last to melt. However, that is not what is actually being observed.
Antarctic sea ice is mostly located outside the Antarctic Circle and, according to global warming theory, should be the first to melt.
Yet Antarctic sea ice has been increasing and expanding towards the equator, contradicting the models.

Sea level trends of the world's oceans are shown in figure a.
The two curves represent the overall sea level variations
averaged within the Southern Ocean south of 40° S and global ocean, respectively.
The most remarkable feature is a large regional difference in sea level trend.
The North Pacific and equatorial Pacific exhibit the most spectacular east-west seesaw pattern,
with strong positive trends in the western side and strong negative trends in the eastern side.
The Atlantic Ocean shows the most homogeneous field and is associated with weak positive trends in general,
while in the Indian Ocean negative trends are dominant,
except for positive trends in the Indonesian throughflow region and west of Australia.
The Southern Ocean south of 40° S shows noticeable positive trends in most places,
with one notable exception in the Pacific Antarctic Basin, where there is a broad region of strong negative trends.

The Southern Ocean experienced a sharp rise in sea level during the 1997-1998 ENSO period (see figure b).
A similar sea level rise is also observed for the global ocean,
although the amplitude there is only half that of the Southern Ocean.
Over the 1993-2000 period, the mean sea level trend of the Southern Ocean is estimated at 2.34±0.34 mm/yr,
compared to 1.21±0.15 mm/yr for the global ocean.
The latter value is close to the lower bound of the IPCC (Intergovernmental Panel on Climate Change)
global trend range over the last century (1-2 mm/yr)
and is also not significantly different from the estimate of Cazenave et al. [1998]
over the period 1993-mid 1997 (1.3±0.15 mm/yr).
Globally, no dramatic sea level rising trend resembling the exponential concentration of CO2 in the atmosphere
is observed during the past century.

(a) Map showing sea level trends (mm/yr) calculated at each T/P crossover point over the period 1993-2000.
Before the trend calculation, mean seasonal variations were eliminated from the monthly time series at each data point
and then low-pass filtered using a Gaussian filter with a cut-off at six months.
(b) Mean sea level variations (mm) averaged for the whole Southern Ocean south of 40° S (green)
and for the entire global ocean (red). Area-dependent weights were applied during the averaging process.

"Long-term mean sea level change is a variable of considerable interest in the studies of global climate change."

"Since August 1992 the satellite altimeters have been measuring sea level on a global basis with unprecedented accuracy.
The TOPEX/POSEIDON (T/P) satellite mission provided observations of sea level change from 1992 until 2005.
Jason-1, launched in late 2001 as the successor to T/P,
continues this record by providing an estimate of global mean sea level every 10 days with an uncertainty of 3-4 mm."

Since 1993, measurements from the TOPEX and Jason series of satellite radar altimeters have allowed estimates of global mean sea level.
These measurements are continuously calibrated against a network of tide gauges.
When seasonal and other variations are subtracted, they allow estimation of the global mean sea level rate.

According to the Sea Level Research Group, University of Colorado, the mean rate of global sea level rise is 3.3±0.4 mm/yr.
[Includes a
"global mean glacial isostatic adjustment (GIA)"
correction of 0.3 mm/yr. The GIA uncertainty is at least 50 percent.]

The correction for glacial isostatic adjustment (GIA)
accounts for the fact that the ocean basins are getting slightly larger since the end of the last glacial cycle.
GIA is not caused by current glacier melt,
but by the rebound of the Earth from the several kilometer thick ice sheets that covered much of North America and Europe
around 20,000 years ago.

There is a strong correlation between the Global Mean Sea Level (GMSL) and the Multivariate ENSO Index (MEI),
with the GMSL often lagging changes in the MEI:

Global mean sea level variations from TOPEX, Jason-1, and Jason-2 with respect to 1993-2002 mean, plotted every 10 days (color-coded dots).
The red line is a linear fit of the smoothed variations (60-day Hanning filter)
with GIA applied and with annual and semi-annual signals removed from 1993.0 to 2015.33
showing a global mean sea level rise estimate of 3.21±0.4 mm/yr.

A series of satellite missions that started with TOPEX/Poseidon (T/P) in 1992
and continued with Jason-1 (2001-2013) and Jason-2 (2008-present)
estimate global mean sea level every 10 days with an uncertainty of 3-4 mm.

Clear observational measurements in the field indicate that sea level is not rising in the Maldives,
Bangladesh, Tuvalu, Vanuatu, and French Guyana.
From the coasts of French Guyana and Surinam there is an excellent sea level record covering multiple 18.6-year tidal cycles.
It exhibits variations around a stable zero level over the last 50 years.
For the same area, satellite altimetry gives a sea level rise of 3.0 mm/year.
The tide-gauge at Korsør in the Great Belt (the strait between the main Danish islands of Zealand and Funen), for example,
is located at the hinge between uplift and subsidence for the last 8,000 years.
This tide-gauge shows no sea level rise in the last 50-60 years.

The spectrum of proposed rates of present-day sea level changes ranges from 0.0 mm/year,
according to observational facts from a number of key sites all over the world,
to 3.2 mm/year, according to calibrated satellite altimetry.

Tide-gauges were installed at harbor constructions to measure the changes in tidal level and long-term sea level changes.
Most tide-gauges are installed on unstable harbor constructions or landing piers.
Therefore, tide-gauge records are bound to exaggerate sea level rise.
The IPCC authors take the liberty to select what they call "representative" records
for their reconstruction of the centennial sea level trend.
This, of course, implies that their personal view - that is, the IPCC scenario laid down from the beginning of the project -
is imposed in the selection and identification of their "representative" records.
We start to smell another "sea-level-gate".

The mean of all the 159 NOAA sites gives a rate of 0.5 mm/year to 0.6 mm/year.
A better approach, however, is to exclude those sites that represent uplifted and subsided areas.
This leaves 68 sites of reasonable stability (still with the possibility of an exaggeration of the rate of change, as discussed above).
These sites give a present rate of sea level rise in the order of 1.0 (±1.0) mm/year.
This is far below the rates given by satellite altimetry, and the smell of a "sea-level-gate" gets stronger.

Renowned oceanographic expert Nils-Axel Mörner has studied sea level and its effects on coastal areas for some 45 years.
Recently retired as director of the Paleogeophysics and Geodynamics Department at Stockholm University,
Mörner is past president (1999-2003) of the INQUA Commission on Sea Level Changes and Coastal Evolution,
and leader of the Maldives Sea Level Project.

JPL admits that satellite measurement of the Earth has issues because a stable Terrestrial Reference Frame
was never established for any of the satellite programs.
It's like setting out to do a terrestrial survey without having an accurate benchmark first.
The lack of a stable Terrestrial Reference Frame puts all of the space-based geodetic data into question.

Hollywood and the media have helped create a popular perception that humans are causing dramatic sea level rises by man-made global warming.
This perception comes from an exaggeration of more modest, though still dramatic,
computer model predictions of 1-2 metre rises by the end of the 21st century.
However, the actual experimental data shows, at most, a slow and modest increase in sea levels,
which seems completely unrelated to CO2 concentrations.

The main estimates of long-term sea level changes are based on data from various tidal gauges located across the globe.
These estimates apparently suggest a sea level rise of about 1 to 3 mm a year since records began.
This works out to about 10-30 cm (4-12 inches) per century, or about a 1-foot rise every 100-300 years,
hardly the scary rates implied by science fiction films like The Day After Tomorrow (2004) or Waterworld (1995).
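The arithmetic behind these figures can be made explicit (a quick check, not from the original):

```python
# Convert a sea level rate in mm/yr into the per-century figures quoted above.
for rate_mm_per_yr in (1.0, 3.0):
    cm_per_century = rate_mm_per_yr * 100 / 10      # 100 years of mm, in cm
    inches_per_century = cm_per_century / 2.54      # 1 inch = 2.54 cm
    years_per_foot = 30.48 / (rate_mm_per_yr / 10)  # 1 foot = 30.48 cm
    print(f"{rate_mm_per_yr} mm/yr -> {cm_per_century:.0f} cm "
          f"({inches_per_century:.1f} in) per century; one foot every "
          f"{years_per_foot:.0f} years")
```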

Importantly, the rate still seems to be about the same as it was at the end of the 19th century,
even though carbon dioxide emissions are much higher now than they were during the 19th century.

Moreover, there are a number of problems in using the tidal gauge data which have not been resolved yet.
So, despite claims to the contrary, it is still unclear if there has actually been any long term trend!
In this essay, we will summarise what is actually known about current sea level trends.

Willem de Lange and Bob Carter provide a succinct summary of the primary scientific issues
relevant to devising cost-effective policies regarding sea-level change and show that adaptation is more cost-effective than mitigation.

The long-term tide-gauge data record a 20th century average global sea-level rise of about +1-2 mm/y.
It is established by many studies, too, that over the last 150 years global sea-level has been rising at an average rate of about 1.8 mm/y,
which is inferred to represent the slow continuation of a melting of the ice sheets that began about 17,000 years ago.

Current global sea-level policy, supported by many governments,
is to reduce the quantity of carbon dioxide in the atmosphere in order to slow a global warming that is apparently no longer happening,
in a vain attempt to reduce the rate of global sea-level rise.
This policy attempts to moderate a theoretical environmental variable, ignores local sea-level and coastal management realities,
is ineffectual in significantly reducing sea-level rise and is not cost effective compared to incremental adaptation.
Global sea-level policy as currently practiced by governments is therefore scientifically uncertain
and both financially and politically unsustainable.

The main effect of the Moon on the Earth is the tides, caused by its gravitational attraction.
The Sun's gravitational pull on the Earth is more than 177 times stronger than the Moon's,
but because the Sun is 390 times farther from the Earth than the Moon,
and the tide-generating force falls off with the cube of distance rather than the square,
the Sun's tide-generating force is only about half that of the Moon.
See The Tides ("RGO Leaflets", in ARVAL)
See Inconstant Moon (John Walker, Fourmilab)
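Both ratios can be checked in a few lines, since direct gravitational pull falls off as 1/d² while the tide-generating (differential) force falls off as 1/d³. The Sun/Moon mass ratio of roughly 2.7 × 10⁷ is an assumption added here; the distance ratio of 390 is from the text:

```python
# Compare the Sun's and the Moon's pull and tidal force on the Earth.
MASS_RATIO = 2.7e7  # M_sun / M_moon, assumed round figure
DIST_RATIO = 390.0  # d_sun / d_moon, from the text

gravity_ratio = MASS_RATIO / DIST_RATIO**2  # direct pull scales as M/d^2
tidal_ratio = MASS_RATIO / DIST_RATIO**3    # tidal force scales as M/d^3

print(f"Sun/Moon gravitational pull:    {gravity_ratio:.0f}x")
print(f"Sun/Moon tide-generating force: {tidal_ratio:.2f}x")
```

With these round figures the pull ratio comes out just above 177, and the tidal ratio near 0.46 - about half - matching the paragraph above.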

If the Earth were a perfect sphere without large continents,
all areas on the planet would experience two equally proportioned high and low tides every lunar day.
The large continents on the planet, however, block the westward passage of the tidal bulges as the Earth rotates.
Unable to move freely around the globe, these tides establish complex patterns within each ocean basin
that often differ greatly from tidal patterns of adjacent ocean basins or other regions of the same ocean basin.

Three basic tidal patterns occur along the Earth's major shorelines.
In general, most areas have two high tides and two low tides each day.
When the two highs and the two lows are about the same height, the pattern is called a semi-daily or semidiurnal tide.
If the high and low tides differ in height, the pattern is called a mixed semidiurnal tide.

Some areas, such as the Gulf of Mexico, have only one high and one low tide each day.
This is called a diurnal tide.
The U.S. West Coast and the Caribbean Sea tend to have mixed semidiurnal tides,
whereas a semidiurnal pattern is more typical of the East Coast.

"The mean sea level trend is 2.39 millimeters/year with a 95% confidence interval of +/- 0.43 mm/yr
based on monthly mean sea level data from 1931 to 1981 which is equivalent to a change of 0.78 feet in 100 years."
(0.3 meters = 1 foot)

"The mean sea level trend is 2.33 millimeters/year with a 95% confidence interval of +/- 0.15 mm/yr
based on monthly mean sea level data from 1913 to 2014 which is equivalent to a change of 0.77 feet in 100 years."
(0.3 meters = 1 foot)

Last 4 decades of Global and Northern Hemisphere Accumulated Cyclone Energy: 24-month running sums.
Note that the year indicated represents the value of ACE through the previous 24-months for the Northern Hemisphere
(bottom line/gray boxes) and the entire global ACE (top line/blue boxes).
The area in between represents the Southern Hemisphere total ACE.
[The graphic above is from December 31, 2015]

The Accumulated Cyclone Energy (ACE) index is the measure of total seasonal tropical storm activity used by NOAA.
The ACE is a wind energy index, defined as the sum of the squares of the maximum sustained surface wind speed (knots)
measured every six hours for all named storms while they are at least of tropical storm strength.
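Under that definition, a storm's ACE is the sum of its squared 6-hourly winds over all qualifying fixes; NOAA scales the sum by 10⁻⁴ to keep the index manageable. A minimal sketch, with invented wind values for illustration:

```python
# Accumulated Cyclone Energy: sum of squared 6-hourly maximum sustained
# winds (knots) while the storm is at least tropical-storm strength
# (>= 34 kt), scaled by 1e-4 as NOAA does.
TS_THRESHOLD_KT = 34

def ace(six_hourly_winds_kt):
    return 1e-4 * sum(v**2 for v in six_hourly_winds_kt
                      if v >= TS_THRESHOLD_KT)

# Hypothetical storm: intensifies to hurricane strength, then weakens.
winds = [30, 35, 45, 65, 80, 70, 50, 30]
print(f"ACE = {ace(winds):.2f} (x 10^4 kt^2)")
```

Fixes below tropical-storm strength (the two 30-kt entries here) contribute nothing; a season's total is just this sum taken over all named storms.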

In June 2011:
"Since 2006, Northern Hemisphere and global tropical cyclone ACE [Accumulated Cyclone Energy]
has decreased dramatically to the lowest levels since the late 1970s."
"During the past 6-years since Hurricane Katrina,
global tropical cyclone frequency and energy have decreased dramatically, and are currently at near-historical record lows."

"I think any good scientist ought to be a skeptic.", Freeman Dyson said.

"I just think they don't understand the climate," he said of climatologists. "Their computer models are full of fudge factors."

"The models are extremely oversimplified," he said.
"They don't represent the clouds in detail at all. They simply use a fudge factor to represent the clouds."

"It's certainly true that carbon dioxide is good for vegetation," Dyson said.
"About 15 percent of agricultural yields are due to CO2 we put in the atmosphere.
From that point of view, it's a real plus to burn coal and oil."

Statement of Prof. Zbigniew Jaworowski
Chairman, Scientific Council of Central Laboratory for Radiological Protection. Warsaw, Poland
for the Hearing before the US Senate Committee on Commerce, Science, and Transportation
March 19, 2004

"Determinations of CO2 in polar ice cores are commonly used for estimations of the pre-industrial CO2 atmospheric levels.
Perusal of these determinations convinced me
that glaciological studies are not able to provide a reliable reconstruction of CO2 concentrations in the ancient atmosphere.
This is because the ice cores do not fulfill the essential closed system criteria.
One of them is a lack of liquid water in ice,
which could dramatically change the chemical composition of the air bubbles trapped between the ice crystals.
This criterion is not met, as even the coldest Antarctic ice (down to -73°C) contains liquid water.
More than 20 physico-chemical processes, mostly related to the presence of liquid water,
contribute to the alteration of the original chemical composition of the air inclusions in polar ice."

"The basis of most of the IPCC conclusions on anthropogenic causes and on projections of climatic change
is the assumption of low level of CO2 in the pre-industrial atmosphere.
This assumption, based on glaciological studies, is false.
Therefore IPCC projections should not be used for national and global economic planning.
The climatically inefficient and economically disastrous Kyoto Protocol, based on IPCC projections,
was correctly defined by President George W. Bush as "fatally flawed".
This criticism was recently followed by the President of Russia Vladimir V. Putin.
I hope that their rational views might save the world
from enormous damage that could be induced by implementing recommendations based on distorted science."

"Though not a pollutant, it is nonetheless the case that carbon dioxide absorbs space-bound infrared radiation,
thereby increasing the energy available at Earth's surface for warming or increased evaporation (eg de Freitas, 2002).
Radiation theory thus accepted,
there remain four problems with turning an increase in atmospheric carbon dioxide into global warming alarmism.
First, the relationship between increasing carbon dioxide and increasing temperature is logarithmic,
which lessens the forcing effect of each successive increment of carbon dioxide.
Second, in increasing from perhaps 280 ppm in pre-industrial times to 380 ppm now,
carbon dioxide should already have produced 75 per cent of the theoretical warming of ~1°C
that would be caused by a doubling to 560 ppm (Lindzen, 2006);
as we move from 380 to 560 ppm, at most a trivial few tenths of a degree of warming remain in the system.
Claims of greater warming, such as those of the IPCC (2001),
are based upon arbitrary adjustments to the lambda value in the Stefan-Boltzmann equation,
and untested assumptions about positive feedbacks from water vapour.
Third, the ice core data show conclusively that, during natural climate cycling,
changes in temperature precede changes in carbon dioxide by an average 800 years or so
(Fischer et al, 1999; Indermuhle et al, 2000; Mudelsee, 2001; Caillon et al, 2003);
similarly, temperature change precedes carbon dioxide change, in this case by five months, during annual seasonal cycling
(Kuo, Lindberg and Thomson, 1990). And, fourth, Boucot, Xu and Scotese (2004)
have shown that over the Phanerozoic
little relationship exists between the atmospheric concentration of carbon dioxide and necessary warming,
including that extensive glaciation occurred between 444 and 353 million years ago
when atmospheric carbon dioxide was up to 17 times higher than today (Chumakov, 2004)."
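The logarithmic relationship in the first point is commonly written as ΔF ≈ 5.35 ln(C/C₀) W/m², the simplified expression of Myhre et al. (1998) - an addition here, not part of the quote. Successive equal increments of CO2 then add progressively less forcing:

```python
import math

def forcing(c, c0=280.0):
    """Simplified CO2 radiative forcing in W/m^2 (Myhre et al., 1998)."""
    return 5.35 * math.log(c / c0)

# Forcing added by each successive 70 ppm step from the pre-industrial
# level to a doubling: the increments shrink.
levels = [280, 350, 420, 490, 560]
for lo, hi in zip(levels, levels[1:]):
    print(f"{lo} -> {hi} ppm: +{forcing(hi) - forcing(lo):.2f} W/m^2")
```

With this expression the four equal 70-ppm steps add roughly 1.19, 0.98, 0.82 and 0.71 W/m² respectively; the temperature response that follows depends, as the quote notes, on the disputed feedback assumptions.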

"The current scientific reality is that the IPCC's hypothesis of dangerous global warming has been repeatedly tested, and fails.
Despite the expenditure of large sums of money over the last 25 years (more than $100 billion),
and great research effort by IPCC-related and other (independent) scientists,
to date no scientific study has established a certain link between changes in any significant environmental parameter
and human-caused carbon dioxide emissions."

"In contrast, the null hypothesis that the global climatic changes that we have observed over the last 150 years
(and continue to observe today) are natural in origin has yet to be disproven.
As summarized in the reports of the Nongovernmental International Panel on Climate Change
(NIPCC),
literally thousands of papers published in refereed journals contain facts or writings consistent with the null hypothesis,
and plausible natural explanations exist for all the post-1850 global climatic changes that have been described so far."

Professor Michael Mann plotted a graph in the late 1990s [1998] that showed global temperatures for the last 1,000 years.
It showed a sharp rise in temperature over the last 100 years as man-made CO2 emissions also increased,
creating the shape of a hockey stick and blurring the Medieval Warm Period.

The graph was used by Al Gore in his film 'An Inconvenient Truth' [2006]
and was prominently cited in 2001 by the United Nations body the Intergovernmental Panel on Climate Change (IPCC)
as evidence of the link between fossil fuel use and global warming.

But the graph was questioned by sceptics
who pointed out that it is impossible to know for certain the global temperature going back beyond modern times
because there were no accurate readings.

The issue became a central argument in the climate change debate and was dragged into the 'climategate' scandal,
as the sceptics accused Prof Mann and his supporters of exaggerating the extent of global warming.

However, speaking to the BBC recently, Prof Mann, a climatologist at Pennsylvania State University,
said he had always made clear there were "uncertainties" in his work.

"I always thought it was somewhat misplaced to make it a central icon of the climate change debate", he said.

"Shortly after its publication, the hockey stick and its main author, Michael Mann, came under attack from Steve McIntyre,
a retired statistician from Canada.
In a series of scientific papers and later on his blog, Climate Audit,
McIntyre took issue with the novel statistical procedures used by the hockey stick's authors.
He was able to demonstrate that the way they had extracted the temperature signal from the tree ring records
was biased so as to choose hockey-stick shaped graphs in preference to other shapes,
and criticised Mann for not publishing the cross validation R2,
a statistical measure of how well the temperature reconstruction correlated with actual temperature records.
He also showed that the appearance of the graph
was due solely to the use of an estimate of historic temperatures based on tree rings from bristlecone pines,
a species that was known to be problematic for this kind of reconstruction."

"We find that the proxies do not predict temperature significantly better than random series generated independently of temperature.
Furthermore,
various model specifications that perform similarly at predicting temperature produce extremely different historical backcasts."

"In sum, these results suggest that the ninety-three sequences that comprise the 1,000 year old proxy record
simply lack power to detect a sharp increase in temperature."

Climate History - Earth History:

600 Million Years of Temperature and Carbon Dioxide:

A slide from a presentation by Dr. David Archibald in Melbourne on February 5th 2011
Left vertical axis: Atmospheric Carbon Dioxide Concentration in parts per million in volume
Right vertical axis: Global Temperature Anomaly in °C
Over the last 150 million years, geological processes have taken 90% of the carbon dioxide out of the atmosphere.

Is carbon dioxide linked to global warming?
While the carbon dioxide heating effect is real and related to warming, it is minuscule and logarithmic.

The logical reason for temperature increase is the Sun.
Predicted solar activity can be used to predict the climate,
and the current prediction is for a 24-year cold period similar to that experienced at the beginning of the 19th century.

"Generally quoted uncertainty figures from studies going back to the start of the Holocene
have temperature uncertainties in the range of ±3.0°C.
Even in more recent time frames,
data quoted by the IPCC show temperature uncertainties that exceed the measured temperature increases for the last century.
In fact, the IPCC's projected increase falls within the uncertainty range of the data they based their predictions on."

"From studies of Antarctic ice cores going back half a million years, the average CO2 to temperature lag is 1,300±1,000 years.
Samples taken from around the end of the last glacial period indicate
that the CO2 levels did not begin to rise until after the warming began."

"In results published in Science,
a high-resolution deuterium profile is now available for the entire EPICA Dome C ice core.
This profile allowed the construction of a climate record that extends back to 800,000 years before the present.
The ice core has provided temperature data covering 11 glacial, and corresponding interglacial, periods.
The authors used an atmospheric global climate model (GCM) to calculate an improved temperature record for the entire interval,
finding temperatures during warm intervals as much as 8°F (4.5°C) warmer,
and, during cold intervals, as much as 18°F (10°C) lower, than preanthropogenic Holocene values."

Note that the Earth is currently (at the right in the graphic) in an Interglacial period that started more than 10,000 years ago;
the next step, on a geological scale, is one of cooling.
This will mean an increased need for food, energy and economic resources, especially for countries at medium to high latitudes,
both North and South of the Equator.
Countries at tropical latitudes will have to depend on their own food resources, grown locally, to survive.

In January 1998, the collaborative ice-drilling project between Russia,
the United States, and France at the Russian Vostok station in East Antarctica yielded the deepest ice core ever recovered,
reaching a depth of 3,623 m.
Preliminary data indicate the Vostok ice-core record extends through four climate cycles,
with ice slightly older than 400 kyr.
Isotopes of Hydrogen and Oxygen have been used to develop Earth temperature histories extending over 400,000 years.
Trapped gas bubbles record the history of atmospheric CO2 concentrations for over 400,000 years.

From Vostok Ice Core
(NOAA Paleoclimatology, with a link to the Vostok data)

By careful analysis of this historic ice core, they reconstructed trends of many climatic and environmental parameters,
including temperature and CO2 concentration, over a period of 420,000 years.

Over four glacial-interglacial cycles, the succession of changes through each cycle of glacial growth and termination was similar,
with atmospheric and climatic properties oscillating between fairly stable lower and upper bounds.
Surface temperature, for example, varied over a range of approximately 12°C,
while atmospheric CO2 concentration ranged from a low of 180 ppm to a high of 290 ppm.

The authors note that
"the new data confirm that the warmest temperature at stage 7.5 [238,000 years ago]
was slightly warmer than the Holocene [the current interglacial]."
They also note that the interglacials preceding and following the one at 238,000 years ago were warmer still.
In fact, from the graphs they present,
it can be seen that all of the four interglacials that preceded the Holocene were warmer than the current one,
and by an average temperature in excess of 2°C.

"The Holocene, which has already lasted 11,000 years, is, by far,
the longest stable warm period recorded in Antarctica during the past 420,000 years".

Temperature fluctuations over the past 17,000 years showing the abrupt cooling during the Younger Dryas.
The late Pleistocene cold glacial climate that built immense ice sheets terminated suddenly about 14,500 years ago [12,500 BC],
causing glaciers to melt dramatically.
About 12,800 years ago [10,800 BC], after about 2,000 years of fluctuating climate,
temperatures plunged suddenly and remained cool for 1,300 years.
About 11,500 years ago [9,500 BC], the climate again warmed suddenly and the Younger Dryas ended.
Also showing the Holocene Warm Period,
the Roman Warm Period and the Medieval Warm Period compared to today's small rise in average temperature.
Graphic by Don J. Easterbrook.

This graph shows the average of 18 non-tree ring proxies of temperature from 12 locations around the Northern Hemisphere,
published by Craig Loehle in 2007, and later
revised in 2008.
It clearly shows that natural climate variability happens, and these proxies coincide with known events in human history.

[It shows the Medieval Warm Period, the arrival and the end of the Viking colonization in Greenland, and the Little Ice Age]

Loehle also published in 2008 a paper
[Craig Loehle on the Divergence Problem]
that described why tree rings cannot be trusted as a proxy for past temperature variations.
Tree-ring data have what is called a "divergence problem" in the late 20th century:
the tree-ring data suggest cooling when in fact there has been warming.
This, by itself, should cast serious doubt on whether tree ring reconstructions
(such as Michael Mann's famous "hockey stick" curve) can be used to estimate past global temperature variability.

The Earth consolidated some 4,500 million years ago. About 1,100 million years ago, the supercontinent of Rodinia was assembled.
Rodinia split into 2 halves approximately 750 million years ago.
The global climate was cold during the Late Precambrian, some 650 million years ago.
The most popular hypothesis suggests that the Earth was completely frozen - oceans and all - like a giant snowball.

During the last 2 billion years the Earth's climate has alternated between a frigid "Ice House", like today's world,
and a steaming "Hot House", like the world of the dinosaurs.

Paleoclimate studies indicate that in the past billion years the Earth's absolute global mean surface temperature
has not varied by more than 3% (~8 K = ~8°C) either side of the 750-million-year mean (291 K = 18°C).

The Earth has been in an Ice House Climate for the last 30 million years.

When the Earth is in its "Ice House" climate mode, there is ice at the poles.
The polar ice sheet expands and contracts because of variations in the Earth's orbit (Milankovitch cycles).
The last expansion of the polar ice sheets took place about 18,000 years ago.

For the last 5 million years the Earth has been in a major Ice Age.
There have been only a few times in Earth's history when it has been as cold as it has been during the last 5 million years.

The climate during the Miocene was similar to today's climate, but warmer.
Well-defined climatic belts stretched from Pole to Equator; even so, there were palm trees and alligators in England and Northern Europe.
Australia was less arid than it is now.

See also We Are Living In Cold Times
(Dr. Jørgen Peder Steffensen. Centre for Ice and Climate, University of Copenhagen. Video 04:10 ClimateClips.com).

Limits on CO2 Climate Forcing from Recent Temperature Data of Earth:

"The global atmospheric temperature anomalies of Earth reached a maximum in 1998 which has not been exceeded during the subsequent 10 years.
The global anomalies are calculated from the average of climate effects occurring in the tropical and the extratropical latitude bands.
El Niño/La Niña effects in the tropical band are shown to explain the 1998 maximum
while variations in the background of the global anomalies largely come from climate effects in the northern extratropics.
These effects do not have the signature associated with CO2 climate forcing.
However, the data show a small underlying positive trend that is consistent with CO2 climate forcing with no-feedback."

"We examine tropospheric temperature trends of 67 runs from 22 'Climate of the 20th Century' model simulations
and try to reconcile them with the best available updated observations (in the tropics during the satellite era).
Model results and observed temperature trends are in disagreement in most of the tropical troposphere,
being separated by more than twice the uncertainty of the model mean.
In layers near 5 km, the modelled trend is 100 to 300% higher than observed,
and, above 8 km, modelled and observed trends have opposite signs.
These conclusions contrast strongly with those of recent publications based on essentially the same data."

Global Warming Advocacy Science: A Cross Examination
Dr. Jason Scott Johnston, Professor of Law and Coordinator, University of Pennsylvania Law School

"Legal scholarship has come to accept as true the various pronouncements of the Intergovernmental Panel on Climate Change (IPCC)
and other scientists who have been active in the movement for greenhouse gas (ghg) emission reductions to combat global warming.
The only criticism that legal scholars have had of the story told by this group of activist scientists
- what may be called the climate establishment
- is that it is too conservative in not paying enough attention to possible catastrophic harm
from potentially very high temperature increases."

"This paper departs from such faith in the climate establishment
by comparing the picture of climate science presented by the Intergovernmental Panel on Climate Change (IPCC)
and other global warming scientist advocates with the peer-edited scientific literature on climate change.
A review of the peer-edited literature reveals a systematic tendency of the climate establishment to engage
in a variety of stylized rhetorical techniques that seem to oversell what is actually known about climate change
while concealing fundamental uncertainties and open questions regarding many of the key processes involved in climate change.
Fundamental open questions include not only the size but the direction of feedback effects
that are responsible for the bulk of the temperature increase predicted to result from atmospheric greenhouse gas increases:
while climate models all presume that such feedback effects are on balance strongly positive,
more and more peer-edited scientific papers seem to suggest that feedback effects may be small or even negative.
The cross-examination conducted in this paper reveals many additional areas where the peer-edited literature
seems to conflict with the picture painted by establishment climate science,
ranging from the magnitude of 20th century surface temperature increases and their relation to past temperatures;
the possibility that inherent variability in the earth's non-linear climate system,
and not increases in CO2, may explain observed late 20th century warming;
the ability of climate models to actually explain past temperatures;
and, finally, substantial doubt about the methodological validity
of models used to make highly publicized predictions of global warming impacts such as species loss."

"Insofar as establishment climate science has glossed over and minimized such fundamental questions and uncertainties in climate science,
it has created widespread misimpressions that have serious consequences for optimal policy design.
Such misimpressions uniformly tend to support the case for rapid and costly decarbonization of the American economy,
yet they characterize the work of even the most rigorous legal scholars.
A more balanced and nuanced view of the existing state of climate science
supports much more gradual and easily reversible policies regarding greenhouse gas emission reduction,
and also urges a redirection in public funding of climate science
away from the continued subsidization of refinements of computer models
and toward increased spending on the development of standardized observational datasets
against which existing climate models can be tested."

See Global Warming Advocacy Science: A Cross Examination
(Dr. Jason Scott Johnston, Professor of Law and Coordinator, Program on Law and the Environment,
University of Pennsylvania - Law School. Institute for Law and Economic Research, Paper No. 10-08, May 1 '10)

This moment is a turning point in the climate change debate.
Not because the report released Monday addresses every concern raised by critics of the Intergovernmental Panel on Climate Change (IPCC),
but because it knocks the IPCC off its pedestal.

Those who challenge the IPCC's authority are often ignored.
Numerous science academies have blessed its efforts, so who are we to question?
This week those academies began to act like grownups in relation to this wayward child.
The report, authored by a committee assembled by the InterAcademy Council (a collection of science bodies from around the world),
blows smoking holes through just about everything the IPCC's chairman, Rajendra Pachauri, has been telling us.
[123-page report PDF, updated Feb. 2, 2011]

TRANSPARENCY
He boasts that his organization carries out its work with "complete transparency".
But this report says transparency is in short supply.
Some stages of the IPCC process, it finds,
"are poorly understood, even to many scientists and government representatives who participate in the process".

The report says the IPCC has never established any formal criteria for selecting its most senior personnel,
its lead authors, or other key participants.
Nor are there any guidelines about what scientific and technical information the IPCC should consider
when it carries out its literature review.
How these decisions have been made for the past two decades is, therefore,
anyone's guess - the very opposite of complete transparency.

The report says a preliminary outline is drawn up by a select group of individuals at the beginning of the IPCC process,
but how this happens - and who participates - is a mystery to those who aren't invited.
Nor does anything in the following sentence provide comfort to IPCC partisans:

The absence of a transparent author selection process or well-defined criteria for author selection
can raise questions of bias and undermine the confidence of scientists and others in the credibility of the assessment...

PEER-REVIEWED LITERATURE
In February 2008 Pachauri declared to a committee of the North Carolina legislature
(as he has in many other contexts before and since), that:

...we carry out an assessment of climate change based on peer-reviewed literature,
so everything that we look at and take into account in our assessments has to carry [the] credibility of peer-reviewed publications,
we don't settle for anything less than that. [bold added]

But the InterAcademy report matter-of-factly tells the world
that an analysis of the IPCC's third assessment report found only 84% of the source material cited by Working Group 1 was peer-reviewed,
only 59% cited by Working Group 2 was, and only 36% cited by Working Group 3 met this standard.
(An analysis of the IPCC's fourth assessment report references, organized by yours truly,
produced similar results.)

Procedures regarding the use of non-peer-reviewed literature are in place,
but the report says "it is clear that these procedures are not always followed".
The rules say non-peer-reviewed sources are supposed to be identified as such when listed among the IPCC's references.
Yet the InterAcademy report says it "found few instances of information flagged" in this manner. As in almost none.
According to my colleague Hilary Ostrov,
only 6 of the 5,587 non-peer-reviewed references in the 2007 IPCC report were properly identified.

In a nutshell, the IPCC doesn't follow its own procedures.
Or, in the more diplomatic phrasing of the report: "stronger mechanisms for enforcing [these procedures] are needed."

ROBUST PROCESSES
Pachauri also told the North Carolina lawmakers that the IPCC's "writing and review process is very robust, very vigorous".
Yet the InterAcademy report confirms that,
no matter how loudly the IPCC's expert reviewers and each chapter's review editors might protest,
the lead authors "have the final say on the content of their chapter".
In other words, the IPCC's vaunted review process amounts to window-dressing.

The InterAcademy committee observes that the IPCC's embarrassing Himalayan glacier error
could have been avoided had it merely listened to its own expert reviewers.
The mistake was noticed, but the IPCC "did not change the text".

In that instance alone, the IPCC system failed in three ways.
First, the IPCC authors chose to rely on an unsubstantiated claim in a non-peer-reviewed document.
Then these authors failed to take seriously the feedback from the IPCC's expert reviewers
- who pointed out that peer-reviewed material contained more cautious and equivocal conclusions.
Finally, the review editors for that chapter failed to ensure that the expert feedback was properly addressed.

Another area of concern relates to the fact that, despite the highly contested nature of the climate debate,
and that billions in expenditures around the world are profoundly influenced by the IPCC's findings,
this organization has no conflict-of-interest policy.

"The IPCC was established by politicians, its experts are selected by politicians, and its conclusions are negotiated by politicians."

From Chapter 17 - Cross-Examination:

"Much to [Dr. Jason Scott] Johnston's surprise, his own research discovered that,
"on virtually every major issue in climate change science",
IPCC reports "systematically conceal or minimize what appear to be fundamental scientific uncertainties"."

"For example, he devotes dozens of pages to explaining the shortcomings of climate models.
According to these models, increased CO2 will cause the air near the surface of the planet to heat up.
This effect is supposed to be especially pronounced in the atmosphere nearest the equator.
Johnston says this second point gives us an opportunity to empirically test whether the models get it right.
Buried within the pages of the crucial attribution chapter of the 2007 Climate Bible, the IPCC acknowledges there's a problem.
It admits (in none-too-clear language) that the extra heat isn't where the models say it should be.
The real world isn't behaving the way the models predict it will.
Johnston observes that this leaves two possibilities:
Either the real-world data is faulty "or something is wrong with the models".
Guess which explanation the Climate Bible chooses?
The authors of that chapter say the "probable explanation" is that
real temperature data gathered in the real world is "contaminated by errors".
While the IPCC may be content with a probable explanation,
the public surely deserves to be told that the climate models fail this important test.
But as Johnston points out, this fact isn't even mentioned in the Summary for Policymakers document for that section of the Climate Bible."

"Subtracting temperature trends at the surface from those in the free atmosphere
removes much of the common variability between these layers
and tests whether the model-predicted trends in tropospheric lapse rate are consistent
with those observed by radiosondes and satellites (Karl et al., 2006).
Since 1979, globally averaged modelled trends in tropospheric lapse rates are consistent with those observed.
However, this is not the case in the tropics,
where most models have more warming aloft than at the surface
while most observational estimates show more warming at the surface than in the troposphere (Karl et al., 2006)".
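The differencing test that the quoted passage describes - fit a linear trend to the surface series and to the tropospheric series, then compare them - can be sketched with an ordinary least-squares fit. The two series below are synthetic placeholders of my own, not real radiosonde or satellite data:

```python
import numpy as np

def decadal_trend(years, temps):
    """Least-squares linear trend of a temperature series, in deg C per decade."""
    slope_per_year = np.polyfit(years, temps, 1)[0]
    return slope_per_year * 10.0

# Synthetic anomaly series for illustration only -- NOT real observations.
# Both series share the same year-to-year wiggles, which the differencing
# removes, as the quoted passage describes.
years = np.arange(1979, 2009)
wiggles = 0.05 * np.sin(0.7 * years)
surface = 0.015 * (years - 1979) + wiggles       # ~0.15 deg C/decade
troposphere = 0.010 * (years - 1979) + wiggles   # ~0.10 deg C/decade

diff = decadal_trend(years, troposphere) - decadal_trend(years, surface)
# Models predict diff > 0 in the tropics (more warming aloft);
# the quoted observational estimates show the opposite sign.
print(f"tropospheric minus surface trend: {diff:+.3f} deg C/decade")
```

Because the shared wiggles are identical in both series, they cancel exactly in the difference of the fitted slopes, which is the point of the subtraction described in the quote.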

Have Atmospheric CO2 Increases Been Responsible for the Recent Large Upswing (since 1995)
in Atlantic Basin Major Hurricanes?

"The U.S. landfall of major hurricanes Dennis, Katrina, Rita and Wilma in 2005
and the four Southeast landfalling hurricanes of 2004 - Charley, Frances, Ivan and Jeanne,
raised questions about the possible role that global warming played in those two unusually destructive seasons.
In addition, three category 2 hurricanes (Dolly, Gustav and Ike)
pummeled the Gulf Coast in 2008 causing considerable devastation.
Some researchers have tried to link the rising CO2 levels
with SST [Sea Surface Temperatures] increases during the late 20th century
and say that this has brought on higher levels of hurricane intensity."

"These speculations that hurricane intensity has increased have been given much media attention;
however, we believe that they are not valid, given current observational data."

"There has, however, been a large increase in Atlantic basin major hurricane activity since 1995
in comparison with the prior 15-year period of 1980-1994 (22 major hurricanes)
and the prior quarter-century period of 1970-1994 (38 major hurricanes).
It has been tempting for many who do not have a strong background in hurricane knowledge
to jump on this recent 15-year increase in major hurricane activity as strong evidence of a human influence on hurricanes.
It should be noted, however, that the last 15-year active major hurricane period of 1995-2009 (56 major hurricanes)
has not been more active than the earlier 15-year period of 1950-1964 (57 major hurricanes)
when the Atlantic Ocean circulation conditions were similar to what has been observed in the last 15 years.
These conditions occurred even though atmospheric CO2 amounts were lower in the earlier period."

"Although global surface temperatures increased during the late 20th century,
there is no reliable data to indicate increased hurricane frequency or intensity
in any of the globe's other tropical cyclone basins since 1979.
Global Accumulated Cyclone Energy (ACE) shows significant year-to-year and decadal variability
over the past thirty years but no increasing trend.
Similarly, Klotzbach (2006) found no significant change in global TC activity during the period from 1986-2005."

"After some prolonged deliberation,
I have decided to withdraw from participating in the Fourth Assessment Report of the Intergovernmental Panel on Climate Change (IPCC).
I am withdrawing because I have come to view the part of the IPCC to which my expertise is relevant as having become politicized.
In addition, when I have raised my concerns to the IPCC leadership, their response was simply to dismiss my concerns."

....

"All previous and current research in the area of hurricane variability has shown no reliable,
long-term trend up in the frequency or intensity of tropical cyclones, either in the Atlantic or any other basin."

"Moreover, the evidence is quite strong and supported by the most recent credible studies
that any impact in the future from global warming upon hurricanes will likely be quite small."

....

"It is beyond me why my colleagues would utilize the media to push an unsupported agenda
that recent hurricane activity has been due to global warming.
Given Dr. Trenberth's role as the IPCC's Lead Author responsible for preparing the text on hurricanes,
his public statements so far outside of current scientific understanding
led me to concern that it would be very difficult for the IPCC process to proceed objectively
with regards to the assessment on hurricane activity."

....

Sincerely, Chris Landsea

Tornadoes:

Severe Weather 101: Tornadoes:

"A tornado is a narrow, violently rotating column of air that extends from the base of a thunderstorm to the ground.
Because wind is invisible, it is hard to see a tornado unless it forms a condensation funnel made up of water droplets, dust and debris.
Tornadoes are the most violent of all atmospheric storms."

"One of the main difficulties with tornado records is that a tornado, or evidence of a tornado, must have been observed.
Unlike rainfall or temperature, which may be measured by a fixed instrument, tornadoes are short-lived and very unpredictable.
If a tornado occurs in a place with few or no people, it is not likely to be documented.
Many significant tornadoes may not make it into the historical record
since Tornado Alley was very sparsely populated during the 20th century."

"Because a tornado is part of a severe convective storm, and these storms occur all over the Earth,
tornadoes are not limited to any specific geographic location.
In fact, tornadoes have been documented in every state of the United States,
and on every continent, with the exception of Antarctica (even there, a tornado occurrence is not impossible).
In fact, wherever the atmospheric conditions are exactly right, the occurrence of a tornadic storm is possible."

"However, some parts of the world are much more prone to tornadoes than others.
Globally, the middle latitudes, between about 30° and 50° North or South, provide the most favorable environment for tornadogenesis.
This is the region where cold, polar air meets against warmer, subtropical air,
often generating convective precipitation along the collision boundaries.
In addition, air in the mid-latitudes often flows at different speeds and directions at different levels of the troposphere,
facilitating the development of rotation within a storm cell.
Interestingly, the places that receive the most frequent tornadoes are also considered the most fertile agricultural zones of the world.
This is due in part to the high number of convective storms delivering needed precipitation to these areas.
Simply as a result of the large number of convective storms and the favorable environment,
the odds are increased that some of these storms will produce tornadoes."

"In the United States, there are two regions with a disproportionately high frequency of tornadoes.
Florida is one and "Tornado Alley" in the south-central U.S. is the other.
Florida has numerous tornadoes simply due to the high frequency of almost daily thunderstorms.
In addition, several tropical storms or hurricanes often impact the Florida peninsula each year.
When these tropical systems move ashore, the embedded convective storms in the rain bands often produce tornadoes.
However, despite the violent nature of a tropical storm or hurricane,
the tornadoes they spawn (some as water spouts) tend to be weaker than those produced by non-tropical thunderstorms."

"In addition, tornadoes occur throughout the year.
Because a tornado may occur at any time of the day or year somewhere in the U.S.,
there really is no national tornado "season" (as there is with Atlantic hurricanes).
Instead, each region may experience increased tornadic potential at different times of the year."

"With increased national Doppler radar coverage, increasing population, and greater attention to tornado reporting,
there has been an increase in the number of tornado reports over the past several decades.
This can create a misleading appearance of an increasing trend in tornado frequency."
"There has been little trend in the frequency of the stronger tornadoes over the past 55 years."

"Because most tornadoes are related to the strength of a thunderstorm,
and thunderstorms normally gain most of their energy from solar heating and latent heat released by the condensation of water vapor,
it is not surprising that most tornadoes occur in the afternoon and evening hours, with a minimum frequency around dawn."

Annual totals of U.S. tornadoes from NWS Local Storm Reports:
2011: 1,897.
2012: 1,116.
2013: 943.
2014: 1,055.
2015: 1,257.
U.S. annual averages (through 2014): last 30 years: 1,141; last 20 years: 1,239; last 10 years: 1,201.
Annual average number of tornadoes by state:
Florida: last 30 years: 59; last 20 years: 60; last 10 years: 41.

"In early November 1978 a microwave sensor aboard the National Oceanic and Atmospheric Administration's TIROS polar-orbiting satellite
started scanning the Earth's atmosphere."

"The 1997-1998 "El Niño of the century" made 1998 the hottest calendar year during the 28+ year record,
with an annual average temperature that was 0.47°C (0.85°F) warmer than normal."

"There is no scientific evidence to support the belief that Earth's climate is stable
and will not change if human activity does not intervene."

"While the approximately 0.14°C per decade of global warming seen in the satellite data
is minor compared to the scale of some past climate shifts, it reminds us that the natural processes of climate change have not stopped."

"The history of the paleoclimate data indicates that the climate is capable of significant changes
for reasons that are not understood."

"The current level of knowledge about the climate
doesn't provide the tools needed to predict when rapid natural climate changes will occur or what forms they might take.
This makes it impossible to say with high confidence how much human factors might influence climate change."

"The first thing is to do no harm.
With the threat of catastrophic climate change, many proposals have been put forward to limit energy use."

"A fundamental point that needs to be understood
is that if any of these proposals (including the Kyoto protocol) are implemented,
they will have an effect on the climate so small that it cannot be detected.
None of these proposals will change what the climate is doing enough to notice."

"Those are good reasons not to artificially force energy prices up.
While raising energy costs might damage the economy, it would disproportionately hurt the poor,
especially those people living on the world's social and economic fringes."

"While the extent of human impacts on global climate change remains uncertain,
research by our colleagues at UAH confirms that deforestation and land conversion
are changing regional weather patterns and the local climate over some parts of the world.
We should encourage and support the scientists and engineers who will develop new sources of low-cost energy."

"Ironically, actions that artificially inflate the cost of energy might hamper those efforts,
as healthy economies can better afford to find and develop alternative energy sources and cleaner energy technologies."

"In science, there is an art to simplifying complex problems so that they can be meaningfully analyzed.
If one oversimplifies, the analysis is meaningless.
If one doesn't simplify, then one often cannot proceed with the analysis.
When it comes to global warming due to the greenhouse effect,
it is clear that many approaches are highly oversimplified."

"Using basic theory, modeling results and observations, we can reasonably bound
the anthropogenic contributions to surface warming since 1979 to a third of the observed warming,
leading to a climate sensitivity too small to offer any significant measure of alarm -
assuming current observed surface and tropospheric trends and model depictions of greenhouse warming are correct."

"We next showed that the defense of the attribution of recent warming to man involves an observed warming
that is smaller than expected, and where the attribution, itself,
depends on relatively subjective claims concerning the ability of current models
to accurately portray natural unforced climate variability.
Thus, the claim that models cannot account for recent warming without external forcing
is held to imply the role of human forcing.
To be sure, current models can simulate the recent trend in surface temperature,
but only by invoking largely unknown properties of aerosols and ocean delay
in order to cancel most of the greenhouse warming.
Finally, we note substantial corroborating work showing low climate sensitivity."

"Ultimately, however,
one must recognize how small the difference is
between the estimation that the anthropogenic contribution to recent surface warming is on the order of 1/3,
and the iconic claim that it is likely that the human contribution is more than 1/2.
Alarm, we see, actually demands much more than the iconic statement itself.
It requires that greenhouse warming actually be larger than what has been observed,
that about half of it be cancelled by essentially unknown aerosols, and that the aerosols soon disappear.
Alarm does not stem directly from the iconic claim, but rather from the uncertainty in the claim,
which lumps together greenhouse gas additions and the cancelling aerosol contributions
(assuming that they indeed cancel warming),
and suggests that the sum is responsible for more than half of the observed surface warming.
What this paper attempts to do is point the way to a simple,
physically sound approach to reducing uncertainty and establishing estimates of climate sensitivity
that are focused and testable.
Such an approach would seem to be more comfortable for science
than the current emphasis on models testing models, large ranges of persistent uncertainty,
and reliance on alleged consensus.
Hopefully, this paper has also clarified
why significant doubt persists concerning the remarkably politicized issue of global warming alarm."

Man-made global warming has not been scientifically proven,
while significant reasons for considering this hypothesis as incorrect have been presented:

The Polish Academy of Sciences:

"The Earth's climate has predominantly been warmer than at present.
However, there has been some significant cooling that resulted in the development of extensive glaciations,
in some of which ice sheets even reached the tropics.
Therefore, any reliable forecasts of climate change, before discussion of prevention or neutralization,
should take into account evidence from the geological past when, obviously,
neither humans nor industry affected the Earth."

"During the last 400 thousand years - still without anthropogenic greenhouse influence -
the content of carbon dioxide in the air, as indicated by ice cores from Antarctica,
was, on four separate occasions, at a similar or even slightly higher level than at present."

"In the past millennium, after the warm Middle Ages, a cold period started by the end of the 13th century
and lasted up to the middle of the 19th century,
then gave way to another warm period, the one in which we are living now.
The phenomena observed today, specifically a temporary rise of global temperature,
just reflect a natural rhythm of climate change."

"Instrumental monitoring of climate parameters has been carried out for only slightly more than 200 years
and exclusively on some parts of the continents that constitute a small part of the Earth.
Several older measurement stations, once set up in suburbs, now lie, due to progressive urbanization,
in town centers, which results, among other effects, in increased values of the measured temperatures.
Profound examination of the oceans was initiated 40 years ago.
Reliable climatic models must not be based on such a short measurement data base.
Therefore, considerable restraint is desirable
if ascribing exclusive or predominant responsibility to man for increased emission of greenhouse gases.
The reality of such an arbitrary statement on human influence has not been demonstrated."

"Research experience in the Earth sciences suggests that simple explanations of natural phenomena,
based on partial observations only and without consideration of numerous factors important for individual processes in a geosystem,
lead generally to unreasonable simplification and misleading conclusions."

E-mails leaked out of the Climatic Research Unit (CRU) (University of East Anglia, UK)
on November 17, 2009, show scientists colluding to distort data to favor the man-made global warming hypothesis
and suppress opinion and scientific works opposing it.
Scientists from the Climatic Research Unit (CRU) at the University of East Anglia are lead authors of, and contributors to, the
IPCC Assessment Reports on Climate Change
(Intergovernmental Panel on Climate Change, UNEP).

These distorted data are the "physical" basis for "Global Warming" and "Climate Change".

"The only reasonable explanation for the archive being in this state is that the FOI Officer at the University was practising due diligence.
The UEA was collecting data that couldn't be sheltered and they created FOIA2009.zip."
[FOI = Freedom Of Information]

"It is most likely that the FOI Officer at the University put it on an anonymous ftp server
or that it resided on a shared folder that many people had access to and some curious individual looked at it."

"Occam's razor
concludes that "the simplest explanation or strategy tends to be the best one".
The simplest explanation in this case is that someone at UEA found it and released it to the wild
and the release of FOIA2009.zip wasn't because of some hacker, but because of a leak from UEA by a person with scruples."

"This is one of the darkest periods in the history of science.
Those who love science, and all it stands for, will be pained by what they read below.
However, the crisis is here, and cannot be avoided."

From
The Climategate Emails
(Edited and annotated by John P. Costella, Ph.D. The Lavoisier Group, March 2010, .pdf)

Is the science concerning the current concerns about climate change sound?
Many people, starting with the members of the UK House of Commons Science and Technology Committee,
had hoped this question would be answered during the inquiry process,
and there is a frequent refrain in the media that the investigations affirmed the science.
But the reality is that none of the inquiries actually investigated the science.

"Early this morning, history repeated itself.
FOIA.org has produced an enormous zip file of 5,000 additional emails
similar to those released two years ago in November 2009 and coined 'Climategate'.
There are almost 1/4 million additional emails locked behind a password, which the organization does not plan on releasing at this time."

"This website is provided as a research resource for mining the recently leaked climate communications.
Every effort has been made to redact personal contact information such as email addresses and telephone numbers.
The redaction algorithms are currently tuned to be quite stringent, and they will inadvertently obfuscate other details as well.
We will continue to tune the software to improve the quality of the results."

"This database was assembled in a very short space of time,
and at present only provides the most rudimentary tools for exploring this vast trove of material.
We will be improving the quality of the search tools and adding further metadata to the database over the course of the next few weeks."

"This is a searchable service of both ClimateGate I and II emails.
All full emails, telephone numbers and passwords have been redacted (replaced with ???).
Note: you can still search by them if you know them, they just won't show in the results."

"If you're wondering why this is on an Eco site
it's because we are interested in fact led research and development that leads to a better future for all;
ClimateGate is very indicative that at the very core of climate research
the high standards that we all expected for such core research are not being upheld."

"Behind the scenes, I've been playing with a new neat tool for hunting hypocrisy, corruption, bias and unprofessional behaviour
and I'm pleased to announce it's ready to share with the world.
The kudos for this all belongs to, as usual, a skilled volunteer. Thanks to EcoGuy for turning his rapid-fire coding ability onto this."

"On the EcoWho site he has helpfully placed all of Climategate I and II together into a combined searchable database.
It's fast, easy to scan, it copes with tricky search requests and provides a link to the full email from the results page of the search."

Richard S. Lindzen
Program in Atmospheres, Oceans, and Climate
Massachusetts Institute of Technology

Seminar at the House of Commons Committee Rooms
Westminster, London
22nd February 2012

Stated briefly, I will simply try to clarify what the debate over climate change is really about.
It most certainly is not about whether climate is changing: it always is.
It is not about whether CO2 is increasing: it clearly is.
It is not about whether the increase in CO2, by itself, will lead to some warming: it should.
The debate is simply over the matter of how much warming the increase in CO2 can lead to,
and the connection of such warming to the innumerable claimed catastrophes.
The evidence is that the increase in CO2 will lead to very little warming,
and that the connection of this minimal warming (or even significant warming) to the purported catastrophes is also minimal.
The arguments on which the catastrophic claims are made are extremely weak - and commonly acknowledged as such.
They are sometimes overtly dishonest.

Here are two statements that are completely agreed on by the IPCC. It is crucial to be aware of their implications.

1. A doubling of CO2, by itself, contributes only about 1°C to greenhouse warming.
All models project more warming, because, within models, there are positive feedbacks from water vapor and clouds,
and these feedbacks are considered by the IPCC to be uncertain.
2. If one assumes all warming over the past century is due to anthropogenic greenhouse forcing,
then the derived sensitivity of the climate to a doubling of CO2 is less than 1°C.
The higher sensitivity of existing models is made consistent with observed warming
by invoking unknown additional negative forcings from aerosols and solar variability as arbitrary adjustments.

Given the above, the notion that alarming warming is 'settled science' should be offensive to any sentient individual,
though to be sure, the above is hardly emphasized by the IPCC.
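Point 1 above - roughly 1°C of direct greenhouse warming per CO2 doubling - can be reproduced from the standard logarithmic forcing approximation. The formula and constants below are common textbook values added here as an illustration; they are not part of Lindzen's seminar text:

```python
import math

def forcing_wm2(c, c0):
    """Standard logarithmic approximation for CO2 radiative forcing
    (Myhre et al. 1998): dF = 5.35 * ln(C/C0), in W/m^2."""
    return 5.35 * math.log(c / c0)

def no_feedback_warming(c, c0, planck_response=0.3):
    """Warming in deg C with no feedbacks, using a Planck-only climate
    response of ~0.3 K per (W/m^2).  Both constants are textbook values,
    not taken from the seminar."""
    return planck_response * forcing_wm2(c, c0)

# A doubling of CO2, e.g. 280 ppm -> 560 ppm:
print(round(no_feedback_warming(560, 280), 2))  # about 1.1 deg C
```

The forcing for a doubling comes out near 3.7 W/m², and multiplying by the Planck-only response gives the roughly 1°C figure that both points above rely on; everything beyond that depends on the feedbacks.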

Points on which all sides agree:

Carbon dioxide has been increasing.

There is a greenhouse effect.

There has been a doubling of equivalent CO2 over the past 150 years.

There has very probably been about 0.8°C warming in the past 150 years.

Increasing CO2 alone should cause some warming (about 1°C for each doubling).

None of these points is controversial among serious climate scientists, and none of them implies alarm.
Indeed, the actual warming is consistent with less than 1°C warming for a doubling.

Unfortunately, denial of these facts has made the public presentation of the science by those promoting alarm much easier.
They merely have to defend these trivially true points;
declare that it is only a matter of well-known physics;
and relegate the real basis for alarm to a peripheral footnote
- even as they slyly acknowledge that this basis is subject to great uncertainty.

Quite apart from the science itself,
there are numerous reasons why an intelligent observer should be suspicious of the presentation of alarm.

1. The claim of 'incontrovertibility'. Science is never incontrovertible.
2. Arguing from 'authority' in lieu of scientific reasoning and data or even elementary logic.
3. Use of term 'global warming' without either definition or quantification.
4. Identification of complex phenomena with multiple causes with global warming and even as 'proof' of global warming.
5. Conflation of existence of climate change with anthropogenic climate change.

Perhaps we should stop accepting the term, 'skeptic'. Skepticism implies doubts about a plausible proposition.
Current global warming alarm hardly represents a plausible proposition.
Twenty years of repetition and escalation of claims does not make it more plausible.
Quite the contrary, the failure to improve the case over 20 years makes the case even less plausible
as does the evidence from climategate and other instances of overt cheating.

In the meantime, while I avoid making forecasts for tenths of a degree change in globally averaged temperature anomaly,
I am quite willing to state that unprecedented climate catastrophes are not on the horizon
though in several thousand years we may return to an ice age.

27 Feb. 2012, Dr. David M.W. Evans
Mathematician and engineer, with six university degrees including a PhD from Stanford University in electrical engineering.

Who Are You Going To Believe - The Government Climate Scientists or The Data?

We check the main predictions of the climate models against the best and latest data.
Fortunately the climate models got all their major predictions wrong.
Why? Every serious skeptical scientist has been consistently saying essentially the same thing for over 20 years,
yet most people have never heard the message - here it is, put simply enough for any lay reader willing to pay attention.

What the Government Climate Scientists Say

If the CO2 level doubles (as it is on course to do by about 2070 to 2100),
the climate models estimate the temperature increase due to that extra CO2 will be about 1.1°C × 3 = 3.3°C.

The direct effect of CO2 is well-established physics, based on laboratory results, and known for over a century.

Feedbacks are due to the ways the Earth reacts to the direct warming effect of the CO2.
The threefold amplification by feedbacks is based on the assumption, or guess, made around 1980,
that more warming due to CO2 will cause more evaporation from the oceans
and that this extra water vapor will in turn lead to even more heat trapping because water vapor is the main greenhouse gas.
And extra heat will cause even more evaporation, and so on.
This amplification is built into all the climate models.
The amount of amplification is estimated by assuming that nearly all the industrial-age warming is due to our CO2.

The government climate scientists and the media often tell us about the direct effect of the CO2,
but rarely admit that two thirds of their projected temperature increases are due to amplification by feedbacks.

What the Skeptics Say

If the CO2 level doubles, skeptics estimate that the temperature increase due to that extra CO2
will be about 1.1°C × 0.5 = 0.6°C.

The serious skeptical scientists have always agreed with the government climate scientists about the direct effect of CO2.
The argument is entirely about the feedbacks.

The feedbacks dampen or reduce the direct effect of the extra CO2, cutting it roughly in half.
The main feedbacks involve evaporation, water vapor, and clouds.
In particular, water vapor condenses into clouds,
so extra water vapor due to the direct warming effect of extra CO2 will cause extra clouds,
which reflect sunlight back out to space and cool the earth, thereby reducing the overall warming.

There are literally thousands of feedbacks, each of which either reinforces or opposes the direct warming effect of the extra CO2.
Almost every long-lived system is governed by net feedback that dampens its response to a perturbation.
If a system instead reacts to a perturbation by amplifying it,
the system is likely to reach a tipping point and become unstable
(like the electronic squeal that erupts when a microphone gets too close to its speakers).
The earth's climate is long-lived and stable - it has never gone into runaway greenhouse,
unlike Venus - which strongly suggests that the feedbacks dampen temperature perturbations such as that from extra CO2.
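The amplify-or-dampen argument above follows the standard steady-state formula from feedback analysis, total = direct / (1 − f). Mapping the document's ×3 and ×0.5 factors onto a net feedback fraction f is my illustration; neither side states it in these terms:

```python
def total_warming(direct, f):
    """Steady-state output of a linear feedback loop: total = direct / (1 - f).
    f > 0 amplifies, f < 0 dampens, and f -> 1 has no stable answer
    (the runaway / microphone-squeal case described in the text)."""
    if f >= 1:
        raise ValueError("net feedback f >= 1: no stable steady state")
    return direct / (1 - f)

direct = 1.1  # deg C per CO2 doubling -- the direct effect both sides accept

# The models' threefold amplification corresponds to f = 2/3:
print(round(total_warming(direct, 2 / 3), 2))   # 3.3
# The skeptics' halving corresponds to f = -1:
print(round(total_warming(direct, -1.0), 2))    # 0.55
```

As f approaches 1 the response diverges, which is the tipping-point instability the paragraph describes; a long-lived stable system, on this argument, should sit well below that threshold.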

What the Data Says

The climate models have been essentially the same for 30 years now,
maintaining roughly the same sensitivity to extra CO2 even while they got more detailed with more computer power.

How well have the climate models predicted the temperature?

Does the data better support the climate models or the skeptics' view?

Air Temperatures

One of the earliest and most politically important predictions was presented to the US Congress in 1988 by Dr James Hansen,
the "father of global warming":

Hansen's predictions to the US Congress in 1988, compared to the subsequent temperatures as measured by NASA satellites.

Hansen's climate model clearly exaggerated future temperature rises.

In particular, his climate model predicted that if human CO2 emissions were cut back drastically starting in 1988,
such that by year 2000 the CO2 level was not rising at all, we would get his scenario C.
But in reality the temperature did not even rise this much, even though our CO2 emissions strongly increased
- which suggests that the climate models greatly overestimate the effect of CO2 emissions.

A more considered prediction by the climate models was made in 1990 in the IPCC's First Assessment Report:

Predictions of the IPCC's First Assessment Report in 1990, compared to the subsequent temperatures as measured by NASA satellites.

It's 20 years now, and the average rate of increase in reality is below the lowest trend in the range predicted by the IPCC.

Ocean Temperatures

The oceans hold the vast bulk of the heat in the climate system.
We've only been measuring ocean temperature properly since mid-2003, when the Argo system became operational.
In Argo, a buoy duck-dives down to a depth of 2,000 meters, measures temperatures as it very slowly ascends,
then radios the results back to headquarters via satellite. Over three thousand Argo buoys constantly patrol all the oceans of the world.

Climate model predictions of ocean temperature, versus the measurements by Argo.
The unit of the vertical axis is 10²² Joules (about 0.01°C).

The ocean temperature has been basically flat since we started measuring it properly,
and not warming as quickly as the climate models predict.

Atmospheric Hotspot

The climate models predict a particular pattern of atmospheric warming during periods of global warming;
the most prominent change they predict is a warming in the tropics about 10 km up, the "hotspot".

The hotspot is the sign of the amplification in their theory.
The theory says the hotspot is caused by extra evaporation,
and by extra water vapor pushing the warmer wetter lower troposphere up into volume previously occupied by cool dry air.
The presence of a hotspot would indicate amplification is occurring, and vice versa.

We have been measuring atmospheric temperatures with weather balloons since the 1960s.
Millions of weather balloons have built up a good picture of atmospheric temperatures over the last few decades,
including the warming period from the late 70's to the late 90's.
This important and pivotal data was not released publicly by the climate establishment until 2006, and then in an obscure place.
Here it is:

On the left is the data collected by millions of weather balloons.
On the right is what the climate models say was happening.
The theory (as per the climate models) is incompatible with the observations.
In both diagrams the horizontal axis shows latitude, and the right vertical axis shows height in kilometers.

In reality there was no hotspot, not even a small one.
So in reality there is no amplification - the amplification does not exist.

Outgoing Radiation

The climate models predict that when the surface of the Earth warms,
less heat is radiated from the Earth into space (on a weekly or monthly time scale).
This is because, according to the theory, the warmer surface causes more evaporation and thus there is more heat-trapping water vapor.
This is the heat-trapping mechanism that is responsible for the assumed amplification.

Satellites have been measuring the radiation emitted from the Earth for the last two decades.
A major study has linked the changes in temperature on the earth's surface with the changes in the outgoing radiation.
Here are the results:

Outgoing radiation from Earth against sea surface temperature, as measured by the ERBE satellites and as "predicted" by 11 climate models.
The slopes of the graphs for the climate models are opposite to the slope of the graph for the observed data.

This shows that in reality the Earth gives off more heat when its surface is warmer.
This is the opposite of what the climate models predict.
This shows that the climate models trap heat too aggressively, and that their assumed amplification does not exist.

Conclusions

The air and ocean temperature data shows that the climate models overestimate temperature rises.
The climate establishment suggests that cooling due to undetected aerosols might be responsible for the failure of the models to date,
but this excuse is wearing thin - it continues not to warm as much as they said it would, or in the way they said it would.
On the other hand, the rise in air temperature has been greater than the skeptics say could be due to CO2.
The skeptics' explanation is that the rise is mainly due to other forces
- and they point out that the world has been in a fairly steady warming trend of about 0.5°C per century since 1680
(with alternating ~30-year periods of warming and mild cooling),
whereas the vast bulk of all human CO2 emissions came after 1945.

The climate models get all of these observations wrong.
The missing hotspot and outgoing radiation data both, independently, prove that the amplification in the climate models is not present.
Without the amplification, the climate model temperature predictions would be cut by at least two thirds,
which would explain why they overestimated the recent air and ocean temperature increases. Therefore:

The climate models are fundamentally flawed. Their assumed threefold amplification by feedbacks does not in fact exist.

The climate models overestimate temperature rises due to CO2 by at least a factor of three.
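The "cut by two thirds" and "factor of three" statements are the same arithmetic seen from two sides. A minimal sketch of that arithmetic (the 1.2°C no-feedback figure below is an assumption for illustration, not a number from this article):

```python
# Illustrative arithmetic for the article's claim (assumed numbers):
# if the models multiply a no-feedback warming by ~3 via feedbacks,
# then removing that amplification cuts their predictions by two thirds.

no_feedback_warming = 1.2   # C per CO2 doubling, assumed no-feedback value
amplification = 3.0         # threefold feedback multiplier, per the article

model_prediction = no_feedback_warming * amplification   # about 3.6 C
without_feedbacks = model_prediction / amplification     # back to about 1.2 C

reduction = 1.0 - without_feedbacks / model_prediction
print(f"prediction cut by {reduction:.0%}")  # prediction cut by 67%
```

The point of the sketch is only that a threefold amplification and a two-thirds reduction are one and the same claim.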

The skeptical view is compatible with the data.

This is an unusual political issue, because there is a right and a wrong answer and everyone will know which it is eventually.
People are going ahead and emitting CO2 anyway, so we are doing the experiment:
either the world heats up by several degrees by 2050, or it doesn't.

The complete article, with references, in PDF format is at
The Skeptic's Case,
at
Science Speak
(a scientific modeling and mathematical research company).

The Gas of Life

The era of chlorophyll dominance is referred to as the Great Oxidation; it happened about 2.5 billion years ago.
The ocean's dissolved iron rusted out [of the solution], producing our planet's iron ore deposits and releasing oxygen.
Chlorophyll is still the mechanism controlling the CO2 and O2 abundance.

All life forms basically originated through photosynthesis.
Chemically our hemoglobin and chlorophyll are quite similar, suggesting a common origin; this is supported by a common DNA code.

Whereas animals do not photosynthesize, their plant foods do.
Beef, chicken or fish feed off photosynthetic products.
It is mainly trace minerals that supplement [the] photo-source.

CO2 is literally the gas of life for all macro life forms we encounter.
The existence of extremophiles suggests very early non-solar energy sources.

Demonizing CO2 started with the plan for peaceful use of atomic energy.
The big dream in 1946 was that atomic energy would be so cheap
that electricity would never again need to be metered.
The attribution of increased CO2 to fossil fuel burning was born then.

Atomic energy advocates wanted to save Earth from runaway greenhouse heating like [on] Venus.
A conservation ethic developed to conserve the finite petroleum for the future and
anti-pollution and anti-growth advocates added voices to the anti-CO2 theme.

All earthly macro life forms are photo-synthetically derived from CO2,
either directly or indirectly by chlorophyll that absorbs solar photons.
We are here not at the whim of a deity but by evolution of CO2 derivatives.

For the original and a discussion, see
The Gas of Life (Watts Up With That?, February 29, 2012)

Observed and Modeled Global Temperature Evolution, 1951-2013:

Figure 1. Observed global average temperature evolution, 1951-2013, as compiled by the U.K.'s Hadley Centre (black line),
and the average temperature change projected by a collection of climate models used in the IPCC Fifth Assessment Report
which have a climate sensitivity greater than 3.0°C (red line) and a collection of models with climate sensitivities less than 3.0°C (blue line)
(climate model data source: KNMI Climate Explorer).

The hundreds of millions of dollars that have gone into the expensive climate modelling enterprise
have all but destroyed governmental funding of research into natural sources of climate change.
For years the modelers have maintained that there is no such thing as natural climate change... yet they now, ironically,
have to invoke natural climate forces to explain why surface warming has essentially stopped in the last 15 years!

Agreement in early years between climate models and observations led modelers to believe their assumed forcings (mostly CO2)
and sensitivity were correct ...

... but the "pause" in warming now suggests they neglected sources of natural warming,
used a model sensitivity that was too high (to make up the difference),
and now the models are too sensitive and thus predict too much warming.

The "pause" in global warming is becoming increasingly difficult for the climate establishment to ignore, which is a good thing.
They are now coming up with reasons why there has been a "pause"
(a term I dislike because it implies knowledge of future warming, which no one has), and spinning it as if it is bad news for us.

But when they assume that natural climate variations can cause a cooling influence,
they are also admitting there can be natural sources of warming.

A natural change in ocean circulation is the leading potential explanation for the pause.
Due to the huge temperature difference between surface waters and deep water,
any small change in ocean overturning can result in either warming or cooling of surface temperatures.
If the ocean were isothermal with depth, such a mechanism would not exist.
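A toy two-layer mixing calculation illustrates the point (all numbers below are assumptions for illustration, not values from the article): because the warm mixed layer sits above much colder deep water, even a fraction-of-a-percent change in how much deep water is stirred into the surface layer shifts the surface temperature measurably.

```python
# Toy two-layer ocean mixing sketch (all values assumed, for illustration):
# a warm surface mixed layer over much colder deep water. A small change
# in the fraction of deep water mixed upward shifts surface temperature,
# precisely because the layer temperature difference is large.

T_SURFACE = 15.0  # C, typical mixed-layer temperature (assumed)
T_DEEP = 3.0      # C, typical deep-water temperature (assumed)

def surface_after_mixing(mix_fraction):
    """Surface temperature after replacing mix_fraction of the
    mixed layer with deep water (simple linear mixing)."""
    return (1.0 - mix_fraction) * T_SURFACE + mix_fraction * T_DEEP

baseline = surface_after_mixing(0.010)  # 1.0% of the layer overturned
enhanced = surface_after_mixing(0.015)  # 1.5% overturned

# An extra half-percent of mixing cools the surface by ~0.06 C here;
# were the ocean isothermal (T_DEEP == T_SURFACE), the effect would vanish.
print(round(baseline - enhanced, 3))
```

The sketch also shows the converse stated above: slightly *less* overturning warms the surface by the same amount, so the mechanism can push surface temperatures in either direction.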

The point of this post is to remind people of what I have stated before:
to the extent that a change in ocean circulation has negated anthropogenic warming in the last 15+ years,
an opposite change likely enhanced warming during the 1970s to 1990s.

You can't have one without the other. Natural fluctuations in ocean vertical circulation are cyclical.
You can't attribute the recent warming hiatus to natural forcings
without also addressing the role of potential natural forcings in causing the previous warming period.
At best, it betrays a bias in reasoning; at worst, it is logically inconsistent.

Science is not based on models but on authentic measurements. Models must be based on science, not the other way around.

In Earth's atmosphere, CO2 is only about 0.04% by volume; surely not enough to cause a catastrophic greenhouse effect.
According to some global climatologists and the IPCC climate models,
a strong positive water-vapor feedback would amplify the CO2 effect and make it much more potent,
but this theoretical effect has not been measured in practice. It might be very small or even negative.

Understanding that a trace amount of CO2 cannot be a main cause of a catastrophic atmospheric greenhouse effect
means we are more in control of the quality of the air.
We are more responsible for our planet regarding the atmospheric pollution we cause,
and pollution must be minimized for the water and the ground too, and extensive deforestation must cease.
CO2 is not a pollutant; it is the gas of life on Earth!