Some readers may remember the 1961 film “The Day the Earth Caught Fire”. It could be viewed as the original “climate alarmist” film as it contains all of the plot elements of our current climate alarmism scenarios: exaggerated images of a dying planet, a mainstream media newspaper reporter, technology that is feared, the Met Office, and last but not least, junk science.

A new study out of MIT predicts “a 90% probability that worldwide surface temperatures will rise at least 9 degrees by 2100.”
This is more than twice what was expected in 2003. The Telegraph reports:

“Global warming of 7C ‘could kill billions this century’. Global temperatures could rise by more than 7C this century, killing billions of people and leaving the world on the brink of total collapse, according to new research.”

A similar 2003 study had predicted a mere, but still significant, 4 degree increase in global temperatures by 2100, but those models weren’t nearly as comprehensive, and they didn’t take economic factors into consideration.

So what has changed since 2003 to cause the scientists at MIT’s “Centre for Global Climate Change” to believe the world is going to boil over this century and send billions of us directly to a toasty demise, much like that in our featured movie?

Antarctic ice has broken the record for greatest extent ever recorded: http://arctic.atmos.uiuc.edu/cryosphere/IMAGES/current.area.south.jpg
January 2008 broke the record for the most snow-covered area ever measured in the Northern Hemisphere: http://climate.rutgers.edu/snowcover/png/monthlyanom/nhland01.png

I added a red line below showing the reported projected rise in temperatures from the MIT models, compared with the actual observed temperature trends since the previous 2003 report. Their projections show a correlation with observations of essentially zero.

Given that the observed trends are exactly opposite of what the MIT models predicted, one has to ask what they have observed since 2003 to more than double their warming estimates, and where their 90% confidence value comes from.
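
A "correlation of essentially zero" claim of this kind can be checked numerically. The sketch below uses entirely made-up numbers (they are neither the MIT projections nor the observed record) simply to illustrate how one would correlate a steadily rising projected series against a flat-to-cooling observed series:

```python
# Illustrative only: hypothetical projected vs. observed anomalies, 2003-2008.
years = [2003, 2004, 2005, 2006, 2007, 2008]
projected = [0.00, 0.04, 0.08, 0.12, 0.16, 0.20]    # steady rise (hypothetical)
observed = [0.10, 0.05, 0.12, 0.02, -0.04, -0.08]   # slight cooling (hypothetical)

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

r = pearson(projected, observed)
print(round(r, 2))  # strongly negative for these invented series
```

With these invented numbers the correlation comes out strongly negative, which is the quantitative form of "the observed trends are exactly opposite what the models predicted"; the actual figure would of course depend on the real data series used.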

The study, carried out in unprecedented detail, projected that without “rapid and massive action” temperatures worldwide will increase by as much as 7.4C (13.3F) by 2100, from levels seen in 2000.

This study has a strong scent of GIGO (garbage in, garbage out). MIT has one of the world’s preeminent climatologists, Dr. Richard Lindzen, in its Department of Earth, Atmospheric and Planetary Sciences. I wonder if the scientists at the “Centre for Global Climate Change” checked with him before firing this remarkable piece off to the press.

During the Phanerozoic, CO2 levels have at times been more than 1,500% higher than present, yet temperatures have never been more than 10C higher than present. So how does a projected 30% increase in CO2 produce a 7C temperature rise in their models? During the late Ordovician there was an ice age, even with CO2 at roughly 1,000% of current levels. Hopefully the newspaper headlines don’t accurately represent the content of the article.
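
For context, the standard textbook simplification (not something the article states) is that equilibrium warming scales with the logarithm of the CO2 concentration ratio, ΔT = S · log2(C/C0), where S is the climate sensitivity per doubling. A quick sketch, with the sensitivity values chosen purely for illustration:

```python
import math

def warming(c_ratio, sensitivity_per_doubling):
    """Equilibrium warming under the common logarithmic approximation:
    dT = S * log2(C / C0)."""
    return sensitivity_per_doubling * math.log2(c_ratio)

# A 30% CO2 increase is only a fraction of one doubling:
doublings = math.log2(1.3)  # about 0.38 doublings
for s in (1.5, 3.0, 4.5):   # illustrative sensitivities, degrees C per doubling
    print(s, round(warming(1.3, s), 2))
```

Under this approximation, even a high assumed sensitivity of 4.5C per doubling yields well under 2C of warming for a 30% CO2 increase, which is why the 7C headline figure must rest on much larger assumed emissions growth or feedbacks than a simple 30% concentration rise.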

The United States should not have used the atomic bomb to stop the Japanese militarist threat during World War II, because it was unnecessary to inflict untold suffering on hundreds of thousands of people in Hiroshima and Nagasaki. Given that the United States is nowadays a champion of nuclear non-proliferation, it would be ironic if the nation continued to endorse the logic of using an atomic bomb to end a war. Certainly, the atomic bombs used during World War II – Little Boy (dropped on Hiroshima) and Fat Man (detonated over Nagasaki three days after the Hiroshima bombing) – were deadly, to say the least.[1] The bombs served to terrify the Japanese people and thereby brought the war to a quicker end. However, today the United States knows that the cost paid by the Japanese people was enormous. It should not have happened. What if it happened in our homeland? The photographs that have come out of Hiroshima and Nagasaki are enough to convince us that the bombing was unnecessary (see Appendix). If Mr. Truman were asked his opinion today, he might agree, although he might add that it was necessary to demonstrate the potency of those bombs for the world to stop using them altogether.

While it is a fact that the world has not used atomic bombs in war since the Hiroshima and Nagasaki attacks, it remains true that it was unnecessary to use them in the first place. It was unnecessary because we knew all along that those bombs were terribly dangerous. Indeed, the Hiroshima and Nagasaki bombings were a crime against humanity, and it is essential to stop such crimes. Thankfully, the U.S. has since recognized its mistake and today acts as a spokesperson for ‘freedom from nuclear proliferation and explosions,’ powers that Mr. Truman had thought were equivalent to harnessing the energies of the universe, if not the powers of God, as at the Big Bang. Even as the atomic bombing of 1945 marked a revolution for humanity and for the marriage between technology and human beings, it was a “terrible” disaster. In the words of the then-unapologetic Mr. Truman, the extent of the disaster was also expected:

We have discovered the most terrible bomb in the history of the world. It may be the fire destruction prophesied in the Euphrates Valley Era, after Noah and his fabulous Ark. Anyway we “think” we have found the way to cause a disintegration of the atom. An experiment in the New Mexico desert was startling – to put it mildly. Thirteen pounds of the explosive caused the complete disintegration of a steel tower 60 feet high, created a crater 6 […]

The experiment did not need to be repeated on the lives of countless civilians, who ended up losing their lives to Mr. Truman’s whim. The United States could simply have shown the Japanese the New Mexico desert test and warned them with it; science allows such demonstrations to serve as warnings. In any event, Mr. Truman did succeed in warning the Japanese.[3] As a matter of fact, the Americans promised the Japanese more ruin from the air if they failed to concede after the Hiroshima explosion. Would it not have been reasonable for the U.S. to wait more than three days before bombing Nagasaki as well – for the effects of the bomb to show themselves in full in Hiroshima, or for the Japanese simply to survey their damage and surrender? The effects of the bomb were plain from the very first day.[4] Unfortunately, the Japanese did not concede until after the Nagasaki bombing.[5]

According to the Americans, bombing Hiroshima and Nagasaki terrified the Japanese into surrender. However, it can reasonably be argued that the United States should instead have used its actual scientific test of the nuclear weapon (in the New Mexico desert) to scare the Japanese. The U.S. could easily have reported the test in the Japanese press. Furthermore, the U.S. should not have bombed Nagasaki after Hiroshima, given that the effects of the bomb on Hiroshima were horrific. The United States is a nation whose people stand by God through their world-famous Declaration of Independence and Constitution. It is obvious from news reports of the Hiroshima bombing alone that the attack cried out for the help of God. In actual fact, the attack was a miserable failure for the United States, because it stopped all sense of normal life in Hiroshima in the twinkling of an eye. Much like 9/11, the Hiroshima bombing was warning enough, even if we give the U.S. the benefit of the doubt by assuming that the scientific experiment alone could not have sufficed to warn the Japanese. The U.S. should not have gone forward with the Nagasaki bombing after inflicting a disaster similar to 9/11, but greater in magnitude. It was an inhumane mistake.

Fortunately, however, the United States is now wise enough to avoid such disasters in the present and the future. The world knows that the nation is capable of inflicting such a disaster, and other countries are developing similar military power in a race to rule the world. All the same, everybody now understands that it is atrocious to use atomic bombs on other human beings like ourselves. It is not only inhumane but also foolish to use nuclear weapons when experience (including Hiroshima) has clearly shown the immensity of the damage they can inflict. It is, moreover, a terrible mistake even to be developing such weapons. Even though they may serve as warning measures, or might conceivably find some use in a future ice age, atomic bombs are atrocious to employ against people. Lastly, it is essential to realize that it is never necessary to be violent and horrible. Rather, the concepts of peace, love, and brotherhood – all emotional appeals – plus countless varieties of logical appeal can keep us on the path of peace and prosperity. In fact, the relationship between the U.S. and Japan as it exists today is evidence that this realization has occurred.

The Sun’s magnetic poles will remain as they are now, with the north magnetic pole pointing through the Sun’s southern hemisphere, until the year 2012 when they will reverse again. This transition happens, as far as we know, at the peak of every 11-year sunspot cycle — like clockwork.

Earth’s magnetic field also flips, but with less regularity. Consecutive reversals are spaced anywhere from 5,000 years to 50 million years apart. The last reversal happened 740,000 years ago. Some researchers think our planet is overdue for another one, but nobody knows exactly when the next reversal might occur.

Like the plot of a sci-fi B movie, something weird is happening deep underground where the constant spin of Earth’s liquid metallic core generates an invisible magnetic force field that shields our planet from harmful radiation in space. Gradually, the field is growing weaker. Could we be heading for a demagnetized doomsday that will leave us defenseless against the lethal effects of solar wind and cosmic rays? “Magnetic Storm” looks into our potentially unsettling magnetic future.

Scientists studying the problem are looking everywhere from Mars, which suffered a magnetic crisis four billion years ago and has been devoid of a magnetic field, an appreciable atmosphere, and possibly life ever since, to a laboratory at the University of Maryland, where a team headed by physicist Dan Lathrop has re-created the molten iron dynamo at Earth’s core by using 240 pounds of highly explosive molten sodium. The most visible signs of Earth’s magnetic field are auroras, which are caused by charged particles from space interacting with the atmosphere as they flow into the north and south magnetic poles.

But the warning signs of a declining field are subtler—though they are evident in every clay dish that was ever fired. During high-temperature baking, iron minerals in clay record the exact state of Earth’s magnetic field at that precise moment. By examining pots from prehistory to modern times, geologist John Shaw of the University of Liverpool in England has discovered just how dramatically the field has changed. “When we plot the results from the ceramics,” he notes, “we see a rapid fall as we come toward the present day. The rate of change is higher over the last 300 years than it has been for any time in the past 5,000 years. It’s going from a strong field down to a weak field, and it’s doing so very quickly.”

At the present rate, Earth’s magnetic field could be gone within a few centuries, exposing the planet to the relentless blast of charged particles from space with unpredictable consequences for the atmosphere and life. Other possibilities: the field could stop weakening and begin to strengthen, or it could weaken to the point that it suddenly flips polarity—that is, compasses begin to point to the South Magnetic Pole.
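
The "at the present rate" extrapolation is simple to sketch. Assuming, purely for illustration, that the field keeps losing a fixed fraction of today's strength each century (the real decline rate, and whether it stays constant, are exactly what is uncertain), a linear projection gives:

```python
def centuries_to_zero(decline_per_century):
    """Linear extrapolation: centuries until the field reaches zero,
    given the fraction of today's strength lost each century."""
    return 1.0 / decline_per_century

# Illustrative rates only -- not measured values:
for rate in (0.05, 0.10, 0.25):
    print(rate, centuries_to_zero(rate))
```

The point of the sketch is that "gone within a few centuries" requires a decline rate of tens of percent per century; at slower assumed rates the same linear logic pushes the zero-crossing out toward millennia.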

An even older record of Earth’s fluctuating field than the one Shaw refers to shows a more complicated picture. Ancient lava flows from the Hawaiian Islands reveal both the strength of the field when the lava cooled and its orientation—the direction of magnetic north and south. “When we go back about 700,000 years,” says geologist Mike Fuller of the University of Hawaii, “we find an incredible phenomenon. Suddenly the rocks are magnetized backwards. Instead of them being magnetized to the north like today’s field, they are magnetized to the south.”

Such a reversal of polarity seems to happen every 250,000 years on average, making us long overdue for another swap between the north and south magnetic poles. Scientist Gary Glatzmaier of the University of California at Santa Cruz has actually observed such reversals, as they occur in computer simulations. These virtual events show striking similarities to the current behavior of Earth’s magnetic field and suggest we are about to experience another reversal, though it will take centuries to unfold.

Some researchers believe we are already in the transition phase, with growing areas of magnetic anomaly—where field lines are moving the wrong way—signaling an ever weaker and chaotic state for our protective shield.

Geophysicist Rob Coe, also of the University of California at Santa Cruz, may have even found a lava record in Oregon that charts the magnetic mayhem that ensues during a period of reversal. The picture that emerges may not be up to Hollywood disaster standards, but considering that human civilization has never had to cope with such a situation before, it could be an interesting and challenging time.