paperburn1 wrote:You make one incredibly important point. Currently we are in a state where our population density is out of proportion with our available energy. If history has shown us anything, it's that when you reach this level of development, your civilization must find a denser source of energy or fall back to a more sustainable level. Maybe this is the real answer to the Fermi paradox.

We are nowhere close to exceeding our supply of available energy. Anyone on this forum of all places should know that. Fission alone would enable us to continue our rate of expansion for another few millennia. The only issue, and it was a short-term one, was that political entities back in the 70's decided thorium wasn't strategically useful for a national energy policy. That is now changing as we speak, not because those political entities have decided otherwise, but because other countries have finally caught up and are themselves pursuing that technology. It's now a piss-or-get-off-the-pot moment for western countries, as very soon everyone will be buying their power from China, and that creates a whole other level of strategic problems. If the political powers in western nations don't fix their sh!t soon, they won't be there anymore to enjoy the perks they have become accustomed to.

Whatever is going on in climate - it does not look like CO2 is the driver. In any case, it also looks like a 10 degC rise would not be catastrophic.

It's cute seeing the green whackos trying to justify the debunked "Global Alarmism!!!" theory. At this point they are just trying to keep themselves convinced. It's been shown to be nothing but a power grab from the left in order to take control of energy production and thus civilization, one that was resisted long enough for reality to debunk the theory. But hey, if reality disagrees with you, just change the measurements; we all know it's for a good cause, so no harm done!

Remember all that "missing heat" that the left whackos tried to say was hiding deep in the ocean, waiting to jump out and surprise us? Turns out it wasn't hiding at all but was there all along; we just weren't reading the thermometer right.

We narrowly missed a new ice age, and now we won’t see one for a long time

Before fossil fuels rendered this moot, conditions were nearly right.

Since people are often naturally curious about the future of the ice age cycle, the reality bears repeating: we broke it.

Recorded human history has played out within one type of climate—an interglacial period. During the glacial periods of the last million years (commonly referred to as “ice ages”), great ice sheets grew to cover Canada and some points south, as well as Northern Europe and much of Russia.

In the 1970s, we learned there was a consistent 100,000-year heartbeat to this back-and-forth cycle governed by subtle patterns in Earth's orbit. The thing is, it's about time for the next heartbeat. We're at the part of the cycle where the interglacial period should be wrapping up and the slow but inexorable descent into another ice age should be beginning.

But that hasn’t happened, and it’s not going to any time soon. Our current breakneck emissions of greenhouse gases will see to that. Still, the scientific question is worth asking: what, exactly, does it take to start off an ice age?

We are currently at a low point in summer sunlight reaching the northern high latitude region, which is how the orbital cycles turn into glacial cycles. Because there are several orbital cycles involved, the peaks and valleys in that sunlight are complex—it’s not as simple as a sine wave oscillating between a constant high and a constant low. But there were two interglacial periods in the last million years (one 400,000 years ago and one 800,000 years ago) with a similar combination of orbital cycles. Both crossed the threshold into an ice age when they hit this low point in sunlight.

To compare those two time periods with the present day, Potsdam Institute for Climate Impact Research scientists Andrey Ganopolski, Ricarda Winkelmann, and Hans Joachim Schellnhuber ran a pile of model simulations. They found a consistent relationship between the amount of sunlight and the concentration of atmospheric CO2 that allowed simulations to start an ice age.

At the end of the interglacial 800,000 years ago, the low in sunlight was about the same as today, but CO2 was at 240 parts per million—prior to the industrial revolution, our CO2 concentration was 280 parts per million. At the end of the interglacial 400,000 years ago, CO2 was also about 280 parts per million, but an ice age started because the low point in sunlight was just a bit lower. The researchers conclude that we narrowly missed an ice age off-ramp in the past few thousand years because CO2 was just a touch too high. Lower the concentration by just 40 parts per million in the model, and ice sheets would already be growing by now—though the fossil fuel revolution would still be dictating a planetary U-turn.
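The threshold relationship described above lends itself to a small sketch. This is a minimal illustration only: the logarithmic form and the constants `a` and `b` are made-up assumptions, not the paper's fitted curve, and `critical_insolation` / `ice_age_possible` are hypothetical names. Only the qualitative behavior, that higher CO2 lowers the insolation minimum needed to start an ice age, comes from the article.

```python
import math

# Hypothetical sketch: glacial inception requires northern summer insolation
# to fall below a critical value, and that critical value drops as CO2 rises.
# The logarithmic form and the constants a and b are assumptions for
# illustration, not the fitted relation from the Nature paper.
def critical_insolation(co2_ppm, a=466.0, b=77.0):
    """Assumed critical 65N summer insolation (W/m^2) for a given CO2 (ppm)."""
    return a - b * math.log(co2_ppm / 280.0)

def ice_age_possible(insolation_wm2, co2_ppm):
    """True if insolation is low enough for glacial inception at this CO2 level."""
    return insolation_wm2 < critical_insolation(co2_ppm)

# The same insolation low that starts an ice age at 240 ppm fails at 280 ppm:
low = critical_insolation(240.0) - 1.0
print(ice_age_possible(low, 240.0))  # True
print(ice_age_possible(low, 280.0))  # False
```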

There are a couple of interesting things to note about this. One is that it's possible humans were responsible for higher CO2 concentrations even before the industrial revolution. There's a debate among climate scientists about whether the advent of agriculture and deforestation had a significant impact thousands of years ago, forestalling the beginning of an ice age as a result.

Whether or not humans were responsible for CO2 being slightly too high, there’s another interesting implication. With CO2 at 280 parts per million, the next opportunity to cross into an ice age is about 50,000 years away in the models. That would make the present interglacial period longer than any in the last million years.

That is, however, somewhat academic given our current massive-scale experiment with the climate system. To investigate more relevant scenarios, the researchers ran simulations of three futures: one in which we basically stop emitting CO2 now, one in which we emit double what we have so far, and one where we triple it. (If we do nothing to reduce emissions, we’ll hit quadruple by the year 2100.) In the low-emissions scenario, we skip any real ice sheet growth for at least 50,000 years. In the high-emissions scenario, there’s basically no chance of dropping into any kind of ice age within the next 100,000 years, which was as long as the simulations ran. That’s because it takes a very long time for CO2 concentrations to naturally decline.
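The long tail in CO2 removal that keeps the high-emissions scenario ice-free can be sketched as a toy multi-exponential decay. The fractions and e-folding times below are invented placeholders for illustration; real carbon-cycle models resolve ocean uptake and silicate weathering explicitly.

```python
import math

# Illustrative assumption: a CO2 pulse decays as a sum of exponentials with
# very different timescales.  The (fraction, e-folding time) pairs are made
# up to show the shape of the curve, not taken from the simulations.
def excess_co2_fraction(t_years,
                        terms=((0.3, 300.0), (0.4, 5_000.0), (0.3, 100_000.0))):
    """Fraction of an initial CO2 pulse still airborne after t years."""
    return sum(f * math.exp(-t_years / tau) for f, tau in terms)

# Even after 100,000 years, a sizeable fraction of the pulse remains:
for t in (100, 1_000, 10_000, 100_000):
    print(f"{t:>7} yr: {excess_co2_fraction(t):.2f}")
```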

Most of these conclusions have been reached by one or another study in the past, but the sunlight/CO2 relationship that sets the ice age threshold is new, and shows how close we came in the last few millennia. Since people are often naturally curious about the future of the ice age cycle, the reality bears repeating: we broke it.

Now they say 2015 is the hottest year on the books, when most likely it was really sometime in the '30's. They've adjusted the temperature record to agree with AGW theory, and without access to the unadjusted original data, who could tell?

TDPerk wrote:The whole idea is utter crap. Now they say 2015 is the hottest year on the books, when most likely it was really sometime in the '30's. They've adjusted the temperature record to agree with AGW theory, and without access to the unadjusted original data, who could tell?

More from the same/related theme:

Carbon emissions 'postpone ice age'


The next ice age may have been delayed by over 50,000 years because of the greenhouse gases put in the atmosphere by humans, scientists in Germany say.

They analysed the trigger conditions for a glaciation, like the one that gripped Earth over 12,000 years ago. The shape of the planet's orbit around the Sun would be conducive now, they find, but the amount of carbon dioxide currently in the air is far too high. Earth is set for a prolonged warm phase, they tell the journal Nature.

"In theory, the next ice age could be even further into the future, but there is no real practical importance in discussing whether it starts in 50,000 or 100,000 years from now," Andrey Ganopolski from the Potsdam Institute for Climate Impact Research said.

"The important thing is that it is an illustration that we have a geological power now. We can change the natural sequence of events for tens of thousands of years," he told BBC News.

Earth has been through a cycle of ice ages and warm periods over the past 2.5 million years, referred to as the Quaternary Period.

This has seen ice sheets come and go. At its maximum extent, the last glaciation witnessed a big freeze spread over much of North America, northern Europe, Russia and Asia.

In the south, a vast expanse of what are now Chile and Argentina were also iced up.

Planet rock

A fundamental parameter determining what dips Earth into an ice age is the changing nature of its orbit around the Sun.

The passage around the star is not a perfect circle and over time our planet's axis of rotation also rocks back and forth.

These movements alter the amount of solar radiation falling on the Earth's surface, and if a critical threshold is reached in mid latitudes in the Northern Hemisphere then a glaciation can be initiated.

Dr Ganopolski and colleagues confirm this in their modelling, but also show the role played by the concentration of greenhouse gases in the atmosphere.

And one of their findings is that Earth probably missed the inception of a new ice age by only a narrow margin a few hundred years ago, just before the industrial revolution took hold.

"We are now in a period when our (northern) summer is furthest from the Sun," the Potsdam researcher explained.

"Under normal circumstances, the interglacial would be terminated, and a new ice age would start. So, in principle, we are in the perfect conditions from an astronomical point of view. If we had a CO2 concentration of 240 parts per million (200 years ago) then an ice age could start, but luckily we had a concentration that was higher, 280ppm." Today, industrial society has taken that concentration to over 400ppm.

Fast metabolism

The team says that an interglacial climate would probably have been sustained anyway for at least 20,000 years, and, very probably, for 50,000 years, even if CO2 had stayed at its eighteenth century level.

But the almost 500 gigatonnes of carbon that has been released since the Industrial Revolution means we will likely miss the next best astronomical entry point into a glaciation, and with a further 500 gigatonnes of emissions the "probability of glacial inception during the next 100,000 years is notably reduced", the scientists say in their Nature paper.

Add a further 500 Gt C on top of that and the next ice age is virtually guaranteed to be delayed beyond the next 100,000 years.

Commenting on the study, Prof Eric Wolff from the University of Cambridge, UK, said: "There have been previous papers suggesting that the next ice age is many tens of thousands of years away, and that the combination of seasonal solar energy at the latitude where an ice sheet would form, plus CO2, is what determines the onset of an ice age. But this paper goes much further towards quantifying where the limits are.

"It represents a nice confirmation that there is a relatively simple way of estimating the combination of insolation and CO2 to start an ice age," he told the Science Media Centre.

And Prof Chris Rapley, from University College London, added: "This is an interesting result that provides further evidence that we have entered a new geological [Epoch] - 'The Anthropocene' - in which human actions are affecting the very metabolism of the planet."

Silliness. Scientists aren't in the business of making up false data in order to get paid, and the ones who are get exposed and discredited. Conspiracy theories are the operation of human pattern-recognition instincts modulated by the desire not to believe something the human in question doesn't like.

Get over it.

We need a directorate of science, and we need it to be voted on only by scientists. You don't get to vote on reality. Get over it. Elected officials that deny the findings of the Science Directorate are subject to immediate impeachment for incompetence.

I have been reading a lot on geoengineering tests and it looks like it would be a very easy and inexpensive way to lower the amount of CO2 in the air if we wanted to try.

LOHAFEX was not the first experiment of its kind. In 2000 and 2004, comparable amounts of iron sulfate were discharged from the same ship (the EisenEx experiment). 10 to 20 percent of the algal bloom died off and sank to the sea floor. This removed carbon from the atmosphere, which is the intended carbon 'sink'. As expected, iron fertilization led to development of a bloom during LOHAFEX, but the chlorophyll increase within the fertilized patch, an indicator of biomass, was smaller than in previous experiments. The algal bloom also stimulated the growth of zooplankton that feed on the algae. The zooplankton in turn are consumed by higher organisms. Thus, ocean fertilization with iron also contributes to the carbon-fixing marine biomass of fish species which have been removed from the ocean by over-fishing.

Haida Gwaii: In July 2012, the Haida Salmon Restoration Corporation dispersed 100 short tons (91 t) of iron sulphate dust into the Pacific Ocean several hundred miles west of the islands of Haida Gwaii. The Old Massett Village Council financed this project as a salmon enhancement project with $2.5 million in village funds. The concept was that the formerly iron-deficient waters would produce more phytoplankton that would in turn serve as a "pasture" to feed salmon. Then-CEO Russ George hoped to sell carbon offsets to recover the costs. The project was plagued by charges of unscientific procedures and recklessness. George contended that 100 tons of iron is negligible compared to what naturally enters the ocean. The 2013 salmon runs defied all expectations, more than quadrupling from 50 million to 226 million fish. On 15 July 2014, the oceanographic data gathered during the project were made publicly available under the ODbL license.

https://en.wikipedia.org/wiki/Ocean_fertilization

paperburn1 wrote:I have been reading a lot on Geo engineering tests and it looks like it would be a very easy and inexpensive way to lower the amount of CO2 in the air if we wanted to try.

Not only reduce CO2 but, even more importantly, greatly increase the food supply by producing lots of high-value fish protein; I believe it is called open-ocean aqua-farming or something. Which of course is why they (the climate change people) hate it intensely; they would rather, IMHO, see people starve than be saved by some evil capitalist "techno-fix". Imagine all the profits that would be made in greatly increasing fish yields. The other reason is that "easy" and "inexpensive" fixes don't fit the agenda of the climate change fanatics... they want "solutions" that are difficult and very expensive and anti-capitalist; they want people to suffer, to punish them/us for our greedy, selfish, profligate waste of Gaia's resources. Better to serve their agenda of an eventual world socialist leftist government. I wonder if that could be why our polywell, which got a glowing review as to potential back in 2013, has been apparently effectively mothballed in the three years since; is it possible, I wonder, that the Navy was pressured by someone high up in the Obama administration to basically sit on it? They would rather pour more money into Solyndra or something. Wonder if Trump knows about Polywell?

Here are 10 of the many scientific problems with the assumption human activity is causing “global warming” or “climate change”:

1. Temperature records from around the world do not support the assumption that today’s temperatures are unusual.

The all-time high temperature record for the world was set in 1913, while the all-time cold temperature record was set in 1983. By continent, all but one set their all-time high temperature record more recently than their all-time cold temperature records. In the United States, which has more weather stations than any other location in the world, more state cold temperature records were set recently than hot temperature records. When the temperature records for each state were considered for each month of the year, a total of 600 data points (50 states x 12 months), again the cold temperature records were set in far greater numbers more recently, and the hot temperature records longer ago. This is directly contradictory to what would be expected if global warming were real.
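The 600-point tally described above amounts to a simple count over (state, month) record pairs. A sketch of that bookkeeping follows; the three sample rows are invented placeholders, not real record dates.

```python
# Count how many record-high and record-low years in a table of
# (hot_record_year, cold_record_year) pairs fall on or after a cutoff.
def count_recent(records, cutoff_year):
    """records: list of (hot_record_year, cold_record_year) pairs."""
    hot = sum(1 for h, _ in records if h >= cutoff_year)
    cold = sum(1 for _, c in records if c >= cutoff_year)
    return hot, cold

# Invented placeholder rows; a real analysis would use all 600 state/month pairs.
sample = [(1936, 1985), (1913, 1994), (1954, 1983)]
print(count_recent(sample, 1970))  # (0, 3)
```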

2. Satellite temperature data does not support the assumption that temperatures are rising rapidly:

Starting at the end of 1978, satellites began to collect temperature data from around the globe. For the next 20 years, until 1998, the global average temperature remained unchanged, in direct contradiction to the earth-bound weather station data, which indicated "unprecedented" temperature increases. 1998 was a strong El Nino year with high temperatures, after which temperatures returned to pre-1998 levels, where they remained until 2001. In 2001 there was a sudden jump in the global temperature of about 0.3 degrees centigrade, which then remained at about that level for the next 14 years, with a very slight overall decrease in global temperatures during that time.

3. Current temperatures are always compared to the temperatures of the 1980’s, but for many parts of the world the 1980’s was the coldest decade of the last 100+ years:

If the current temperatures are compared to those of the 1930's, one would find nothing remarkable. For many places around the world, the 1930's were the warmest decade of the last 100 years, including those found in Greenland. Comparing today's temperatures to the 1980's is like comparing our summer temperatures to those in April, rather than those of last summer. It is obvious why the global warming community does this, and it is very misleading (if not deceptive).

4. The world experienced a significant cooling trend between 1940 and 1980:

Many places around the world experienced a quite significant and persistent cooling trend, to the point where scientists began to wonder if the world was beginning to slide into a new ice age. For example, Greenland experienced some of its coldest years in 120 years during the 1980's, as was the case in many other places around the world. During that same 40-year period, the CO2 levels around the world increased by 17%, which is a very significant increase. If global temperatures decreased by such a significant amount over 40 years while atmospheric CO2 increased by such a large amount, we can only reach two conclusions: 1. There must be a weak correlation, at best, between atmospheric CO2 and global temperatures; 2. There must be stronger factors driving climate and temperature than atmospheric CO2.

5. Urban heat island effect skews the temperature data of a significant number of weather stations:

It has been shown that nighttime temperatures recorded by many weather stations have been artificially raised by the expulsion of radiant heat collected and stored during the daytime by concrete and brick structures such as houses, buildings, and roads, and also by cars. Although the land area of the cities and large towns containing these weather stations makes up only a very small fraction of the total land area, their influence on the global average temperature data is significant. Since the daytime and nighttime temperatures are combined to form an average, these artificially raised nighttime temperatures skew the average data. When one looks only at daytime temperatures from larger urban areas, the "drastic global warming" is no longer visible. (This can also be seen when looking at nearby rural weather station data, which is more indicative of the true climate of that area.)
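The averaging mechanism described above is simple arithmetic: a station's daily mean is conventionally (Tmax + Tmin) / 2, so warmth retained overnight raises the mean even when daytime highs are unchanged. The temperatures below are made up purely for illustration.

```python
# A station's daily mean temperature is conventionally the midpoint of the
# daily high and low, so an inflated nighttime low raises the mean even if
# the daytime high never changes.
def daily_mean(tmax, tmin):
    return (tmax + tmin) / 2.0

# Made-up numbers: same daytime high, warmer urban night.
rural = daily_mean(tmax=30.0, tmin=15.0)
urban = daily_mean(tmax=30.0, tmin=19.0)
print(urban - rural)  # 2.0 degC of apparent warming from the night alone
```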

6. There is a natural inverse relationship between global temperatures and atmospheric CO2 levels:

Contrary to what would be assumed when listening to global warming banter or watching An Inconvenient Truth, higher temperatures increase atmospheric CO2 levels and lower temperatures decrease them, not the other way around. Any college freshman chemistry student knows that the solubility of CO2 decreases with increasing temperature, and thus Earth's oceans will release large amounts of CO2 to the atmosphere when the water is warmer and absorb more CO2 when the water is colder. That is why the CO2 level during the ice ages was so much lower than the levels today. That doesn't take away the fact that we are artificially raising atmospheric CO2 levels, but just because we do, that doesn't mean this will cause temperatures to increase in any significant way. The 40-year cooling period between 1940 and 1980 appears to support that premise. What we can conclude is that the ice ages were not caused by changes in atmospheric CO2 levels and that other, stronger factors were involved in these very large climate changes.
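The solubility argument can be made quantitative with Henry's law and its van 't Hoff temperature dependence. kH = 0.034 mol/(L*atm) at 298.15 K and a temperature coefficient of about 2400 K are standard textbook values for CO2 in water; the sketch below just evaluates that relation at two temperatures.

```python
import math

# Henry's law constant for CO2 in water with the van 't Hoff temperature
# correction.  kh_ref and c are standard textbook values for CO2.
def henry_constant_co2(temp_k, kh_ref=0.034, t_ref=298.15, c=2400.0):
    """Henry's law solubility constant for CO2 in water, mol/(L*atm)."""
    return kh_ref * math.exp(c * (1.0 / temp_k - 1.0 / t_ref))

cold = henry_constant_co2(278.15)   # 5 C water
warm = henry_constant_co2(298.15)   # 25 C water
print(f"5 C:  {cold:.4f} mol/(L*atm)")
print(f"25 C: {warm:.4f} mol/(L*atm)")
# Colder water dissolves roughly 1.8x more CO2 at the same partial pressure.
print(f"ratio: {cold / warm:.2f}")
```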

7. The CO2 cannot, from a scientific perspective, be the cause of significant global temperature changes:

The CO2 molecule is a linear molecule and thus has only limited natural vibrational frequencies, which in turn give this molecule only a limited capability of absorbing radiation that is radiated from the Earth's surface. The three main wavelengths that can be absorbed by CO2 are 4.26 micrometers, 7.2 micrometers, and 15.0 micrometers. Of those three, only the 15-micrometer band is significant, because it falls right in the range of the infrared frequencies emitted by Earth. However, the H2O molecule, which is much more prevalent in the Earth's atmosphere, and which is a bent molecule and thus has many more vibrational modes, absorbs many more of the frequencies emitted by the Earth, including to some extent the radiation absorbed by CO2. It turns out that between water vapor and CO2, nearly all of the radiation that can be absorbed by CO2 is already being absorbed. Thus, increasing the CO2 levels should have a very minimal impact on the atmosphere's ability to retain heat radiated from the Earth. That explains why there appears to be a very weak correlation at best between CO2 levels and global temperatures, and why, after CO2 levels have increased by 40% since the beginning of the industrial revolution, the global average temperature has increased only 0.8 degrees centigrade, even if we want to attribute all of that increase to atmospheric CO2 and none of it to natural causes.
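One can check where those bands sit relative to Earth's thermal emission by evaluating the Planck spectral radiance of a roughly 288 K blackbody (Earth's mean surface temperature) at each wavelength. With standard physical constants, the 4.26-micrometer band lands far out on the short-wavelength tail, while the 7.2- and 15-micrometer bands both lie near the emission peak; it is the water-vapor overlap mentioned in the paragraph, not the radiance alone, that discounts the 7.2-micrometer band.

```python
import math

H = 6.62607015e-34   # Planck constant, J*s
C = 2.99792458e8     # speed of light, m/s
KB = 1.380649e-23    # Boltzmann constant, J/K

def planck(wavelength_m, temp_k):
    """Planck spectral radiance B(lambda, T) in W / (m^2 * sr * m)."""
    x = H * C / (wavelength_m * KB * temp_k)
    return (2.0 * H * C ** 2 / wavelength_m ** 5) / math.expm1(x)

# Radiance of a 288 K blackbody at the three CO2 absorption wavelengths:
for um in (4.26, 7.2, 15.0):
    print(f"{um:5.2f} um: {planck(um * 1e-6, 288.0):.3e} W/(m^2 sr m)")
```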

8. There have been many periods during our recent history that a warmer climate was prevalent long before the industrial revolution:

Even in the 1990 IPCC report, a chart appeared that showed the medieval warm period as having had warmer temperatures than those currently being experienced. But it is hard to convince people about global warming with that information, so five years later a new graph was presented, now known as the famous hockey stick graph, which did away with the medieval warm period. Yet the evidence is overwhelming at so many levels that warmer periods existed on Earth during the medieval warm period, as well as during Roman times and other periods of the last 10,000 years. There is plenty of evidence in the Dutch archives showing that over the centuries, parts of the Netherlands disappeared beneath the water during these warm periods, only to appear again when the climate turned colder. The famous Belgian city of Brugge, once known as the "Venice of the North," was a sea port during the warm period that set Europe free from the dark ages (when temperatures were much colder), but when temperatures began to drop with the onset of the little ice age, the ocean receded, and now Brugge is ten miles away from the coastline. Likewise, during the medieval warm period the Vikings settled in Iceland and Greenland, and even along the coast of Canada, where they enjoyed the warmer temperatures, until the climate turned cold again, after which they perished from Greenland and Iceland became ice-locked again during the bitter cold winters. The camps promoting global warming have been systematically erasing mention of these events in order to bolster the notion that today's climate is unusual compared to our recent history.

9. Glaciers have been melting for more than 150 years:

The notion of melting glaciers as proof positive that global warming is real has no real scientific basis. Glaciers have been melting for over 150 years. It is no secret that glaciers advanced to unprecedented levels in recent human history during the period known as the Little Ice Age. Many villages in the French, Swiss, and Italian Alps saw their homes threatened and fields destroyed by these large ice masses. Pleas went out to local bishops, and even the Pope in Rome, to come and pray in front of these glaciers in the hope of stopping their unrelenting advance. Around 1850, the climate returned to more "normal" temperatures and the glaciers began to recede. But then between 1940 and 1980, as the temperatures declined again, most of the glaciers halted their retreat and began to expand again, until warmer weather at the end of the last century caused them to continue the retreat they started 150 years earlier. Furthermore, we now know that many of the glaciers around the world did not exist 4000 to 6000 years ago. As a case in point, there is a glacier in the far north of Greenland, above the large ice sheet covering most of the island, called the Hans Tausen Glacier. It is 50 miles long, 30 miles wide, and up to 1000 feet thick. A Scandinavian research team bored ice cores all the way to the bottom and discovered that 4000 years ago this glacier did not exist. It was so warm 4000 years ago that many of the glaciers around the world didn't exist, but they have returned because of the onset of colder weather. Today's temperatures are much lower than those that were predominant during the Holocene, as substantiated by the many cores that have been dug from Greenland's ice sheet.

10. “Data adjustment” is used to continue the perception of global warming:

For the first several years of my research I relied on the climate data banks of NASA and GISS, two of the most prestigious scientific bodies of our country. After years of painstaking gathering of data, and relentless graphing of that data, I discovered that I was not looking at the originally gathered data, but at data that had been "adjusted" for what were deemed "scientific reasons." Unadjusted data is simply not available from these data banks. Fortunately, I was able to find the original weather station data from over 7000 weather stations around the world in the KNMI database (Royal Dutch Meteorological Institute). There I was able to review both the adjusted and unadjusted data, as well as the breakout of the daytime and nighttime data. The results were astounding. I found that data from many stations around the world had been systematically "adjusted" to make it seem that global warming was happening when, in fact, for many places around the world the opposite was true. What follows are a few of the myriad examples of this data adjustment. When I present my material during presentations at local colleges, these are the charts that have some of the greatest impact on the opinions of the students, especially when they realize that there is a concerted effort to misrepresent what is actually happening. Another amazing result was that when graphing only the daily highs from around the country, a very different picture arises from the historical temperature data.

There are many more specific areas that I have researched, and for which I have compiled data and presentation material, equally compelling at exposing the fallacies of global warming. A new twist has swept the global warming movement lately, especially since they had to admit that their own data showed a "hiatus" in the warming, as illustrated in the 2014 IPCC report; their data showed an actual cooling over the last 10 years. The new term, "climate change," is now taking over, such that unusual events of any kind, like the record snowfall in Boston, can be blamed on the burning of fossil fuels without offering any concrete scientific data as to how one could cause the other.