Posted
by
Unknown Lameron Monday June 23, 2014 @07:15PM
from the but-the-koch-bros-say-it's-a-lie dept.

Freshly Exhumed (105597) writes with news that NOAA's latest global climate analysis is showing things are getting hotter. From the article: Driven by exceptionally warm ocean waters, Earth smashed a record for heat in May and is likely to keep on breaking high temperature marks, experts say. The National Oceanic and Atmospheric Administration Monday said May's average temperature on Earth of 15.54 C beat the old record set four years ago. In April, the globe tied the 2010 record for that month. Records go back to 1880. Experts say there's a good chance global heat records will keep falling, especially next year because an El Nino weather event is brewing on top of man-made global warming. An El Nino is a warming of the eastern tropical Pacific Ocean that alters climate worldwide and usually spikes global temperatures.

People seem to have this reality problem, when they can't figure out that Drudge is an aggregator and posts news from all sides of the aisle, from Alex Jones to Mother Jones and everything in between. I guess reality has a factual bias for you.

Thanks for pointing that out. It's a good thing Christopher Booker (author of your "story") is here to report on NOAA data fabrication.

AND ALSO to let us know that second-hand smoke and lung cancer are not related, asbestos poses no risks, and the /. favorite... get ready, it's a doozy... Evolution is based on BS assumptions and BLIND FAITH and Intelligent Design is the truth! If you include that last bit when you quote him it will be much easier to keep the stories straight. You're welcome!

Holy smokes, are you president of the Christopher Booker fan club?? From his Wikipedia page: "...he speaks truth to power...". From the 2 minutes of research I've done on him, I'd say he speaks truthiness to power. He makes his living as a professional contrarian. All of his published views are the opposite of the scientific consensus on a vast variety of subjects. It's possible he could be smarter than all climate scientists, cancer scientists, infectious disease scientists, and evolutionary biologists. But there is a much higher likelihood that he is just full of shit.

I used to set lead type in my parents print shop as a kid, and I'm perfectly fine, even after setting lead type in my parents print shop, it had no effect on me, just like setting lead type in my parents print shop.

This is the same guy [wikipedia.org] that denies the link between passive smoking and cancer, and that asbestos is dangerous. I should introduce him to my friend with mesothelioma [wikipedia.org]. He certainly can't be relied on for the facts.

The question is not only if the climate is changing, but if it's directly related to CO2. Robert Essenhigh's point is quite interesting. http://bit.ly/11IsUri [bit.ly]

It was quite interesting until he failed to explain how heat produces CO2, after claiming that it was easily explainable; when he claimed that a ~5% increase in CO2 release from burning fossil fuel for energy was "statistical noise" and implied that it was the extent of industrial production of CO2, he became a denialist liar. There are numerous other industrial sources of CO2; for example, the production and curing of concrete alone (not accounting for the CO2 release of burning the energy, already accounted for here) accounts for approximately 2.5% of our CO2 emissions. Iron and steel production are likewise carbon-intensive processes, even putting aside the energy consumption. He also doesn't back up his statement that only two possible causes deserve explanation, nor what the four possible causes are, etc etc. He also blames the entire thermal forcing on water vapor, but relative humidity (the only kind of humidity he mentions in the linked page) is decreasing due to rising temperatures.

tl;dr: Essenhigh is trivializing human CO2 production, which exceeds volcanism, and also failing to back up his statements.

1. Decide your position.
2. When findings are presented that call into question your position, find any flaw with said findings. Preferably one that can't be verified.
3. Present the flaw as proof that the findings are not only invalid, but through being invalidated prove your position.

Surveys in which the technology, sensor placement, and encroachment of cities (even if handled super carefully) introduce error bars on the same order as the signals from multi-decade records are... mostly... useless.

They always result in the same tired predictable rumblings of fools who see what they want.

All the while very important and relatively uncontested facts such as continued decrease of ocean pH and sea level rise are summarily ignored.

Pay attention:
1) Visible light hits the earth. Falsifiable, and tested.
2) When visible light strikes something, IR is generated. Falsifiable, and tested.
3) CO2 is transparent to visible light. Falsifiable, and tested.
4) CO2 absorbs energy from IR. Falsifiable, and tested.
5) CO2 in the atmosphere is increasing. Falsifiable, and tested.
6) The VAST majority of excess CO2 in the air is generated by humans. Falsifiable, and tested.

That's it. That is global warming. If you disagree with that, then you need to prove where the science is wrong. I look forward to your Nobel Prize-winning paper. If you read that and still think it doesn't impact the climate (climate change), then you need to show where the absorbed energy is going.
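To put a rough number on the mechanism described above: a widely cited simplified expression for CO2 radiative forcing is dF = 5.35 * ln(C/C0) W/m^2. A minimal sketch, with the caveat that the 280 ppm pre-industrial baseline, the 5.35 coefficient, and the ~400 ppm figure are standard literature values rather than anything stated in the comment:

```python
import math

def co2_forcing(c_ppm, c0_ppm=280.0):
    """Approximate radiative forcing (W/m^2) from raising CO2 from
    c0_ppm to c_ppm, using the simplified logarithmic expression
    dF = 5.35 * ln(C/C0). Coefficients are standard literature
    values, not derived here."""
    return 5.35 * math.log(c_ppm / c0_ppm)

# Forcing at roughly the 2014 level (~400 ppm) relative to pre-industrial:
print(round(co2_forcing(400.0), 2))  # about 1.91 W/m^2
```

The logarithm is why each additional ppm of CO2 matters a bit less than the last, which is part of what the absorption-band arguments elsewhere in this thread are really about.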

Some of you are very disappointing, falling into ad hom attacks and bad science. Science that can trivially be checked. But no, some of you morons keep spouting the same crap. AGW is a scientific fact.

The big spoiler that makes models a bit more difficult is water vapor. It is both more prevalent than CO2 in the atmosphere and more variable in the amount (CO2 is pretty uniform, water vapour varies a lot based on location, time of year, etc). Also when you look at the absorption spectra, water vapour absorbs larger bands of IR than CO2, particularly in the thermal IR region.

That's not the only extra bit of complexity, but it is one that confounds the situation.

Now before you fly off the handle and start screaming and ranting: I'm not challenging the validity of the theory here, I'm just saying you are oversimplifying things. Going after people for being stupid, but then showing ignorance of the complexity of the issue is rather silly.

The problem is that it is complex. If it were a simple system, we'd likely have a very accurate model for it. The complexity is precisely why despite general agreement on the theoretical mechanism of action there are such wide error bars on the predictions.

Water vapor isn't a spoiler - the bands that it absorbs are different from the bands that CO2 absorbs. That's all there is to it really. There's so much water in the atmosphere that water vapor bands are essentially entirely absorbed. We can't reduce the amount of water in the atmosphere, and we wouldn't want to even if we could, so any gains that we make have to be outside the water absorption spectrum.

And how much warming does that "simple" phenomenon cause? It's interesting how much effort is devoted to debating the radiative model and how little to whether global warming is a serious threat or not, or if it is, what, if anything, we should do about it.

'6) The VAST majority of excess CO2 in the air is generated by humans. Speculative and not agreed'

Nice try, unless you're sitting on a supermassive volcano that nobody knows about, one that's been building for the last 50 years or so. But that's the easy thing about being an anti-science troll... you just move on to the next canard, no matter if it's one that's been debunked a hundred times over. Because those debunkings haven't been reposted in this conversation. And when they are... you just move on to

The carbon increase in the atmosphere can a) be compared with the amount of fossil fuels burned and b) has a different isotope composition, because 14C has a half-life of only a few thousand years, so fossil carbon is practically devoid of 14C. Not speculative at all; falsifiable and tested.
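The decay arithmetic behind the 14C point is simple to sketch; the ~5730-year half-life is the standard literature value, and the time figures below are my own illustrations, not numbers from the comment:

```python
# Fraction of original carbon-14 remaining after t years, given its
# ~5730-year half-life. Fossil fuels are millions of years old, so
# any 14C they once contained is long gone.
def c14_fraction(t_years, half_life=5730.0):
    return 0.5 ** (t_years / half_life)

print(c14_fraction(50_000))     # ~0.0024: already near the detection floor
print(c14_fraction(1_000_000))  # effectively zero: fossil carbon has no 14C
```

This is why a falling 14C/12C ratio in atmospheric CO2 (the Suess effect) fingerprints fossil fuels as the source of the increase.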

Even assuming that the CO2 is natural, the forcing would still be a problem. The idea that the natural CO2 cycle is little-studied is lunatic. Aside from laboratory experiments on CO2 absorption spectra [nist.gov] measuring the "global scale CO2 cycle" is practically the entirety of climate science.

Why is it that the global warming deniers can't decide whether warming isn't happening, it is happening but it isn't human-caused, or it is happening, it is human-caused, but it isn't economical to do anything about it? It can't be all 3, yet the deniers can't seem to get their story straight.

The truth is that it's the 3rd option. Deniers first argue that it isn't happening. When science proves them wrong, they then argue that it is happening but isn't human caused. When science proves them wrong again, they fall back to their real position that despite it existing and being human caused, it isn't worth doing anything about because that would take work and cost money. It's very dishonest.

I am not so sure you understand what "smash" means. The combined average temperature anomaly over global land and ocean surfaces in 2010 was 0.72C. The same figure for this year is 0.74C. I would hardly call 0.02C smashing.

To answer my own question, definitely not. But realizing that there's a bit of variability in the data set, this certainly qualifies as an inflammatory description of expected data.
https://www.ncdc.noaa.gov/cag/... [noaa.gov]

When El Nino leads to a new record high temperature by a large margin (for argument's sake, in 2015), the denialists will quietly adopt this as their new standard for 'normal' and in 2025 they'll be saying "warming is a hoax because temperatures haven't risen on average since 2015."

Absolute rubbish! This 'story' goes alllll the way back to 2007 when they realized the measurements were incorrect. Goddard is recycling old news to spread FUD. He's an idiot for not doing the proper research!

I believe I will believe the science being done by scientists. Granted, they are from Texas and Texas is an oil state, but it's up for peer review in the journal "Proceedings of the National Academy of Sciences" and I have yet to find anyone challenging it.

The problem is you take something like the paper you cited and act as if it's the only thing causing ice melt rather than one factor among several. Also, do you have any idea how long those geothermal heat sources have been active? Unless they just became active in the last decade or two how can you say they have much to do with the current melting?

There is nothing in that paper about increasing geothermal flux. Nothing.

Fortunately, we have other places where we can look. It's well known that volcanoes are not constant in behavior or nature. They have spurts of activity. This is universal behavior for any volcano we can observe, even for relatively well-behaved volcanoes like Mauna Loa or Stromboli.

So for volcanic activity in Antarctica, we would, like any other volcanic activity on the planet, expect it to have periods of greater and lesser activity. So while it hasn't been shown that there is increasing geothermal flu

Look at the record since 1880: it rises, levels, rises, levels, and so on. If there wasn't man-made global warming then there would be a rise and a return. Look, the global warming science is fairly simple. Certain actors in the industry are paid to intentionally make it seem confusing. Pay attention:
1) Visible light hits the earth. Falsifiable, and tested.
2) When visible light strikes something, IR is generated. Falsifiable, and tested.
3) CO2 is

Ummm, not so much. Indications are that warming/cooling had been going on for billions of years before man started burning anything more than trees, and the levels of CO2 in the atmosphere don't seem to have anything to do with those cycles.

Major logical fallacy here. If we know that A causes C, it doesn't automatically mean that B doesn't cause C as well.

The warming and cooling of the past million years was largely driven by orbital changes collectively known as Milankovitch Cycles. [wikipedia.org] But those cycles operate on scales of 10,000 to 100,000 years. The changes in them over the past 200 years have almost zero effect and what effect they do have is trending toward cooling after they reached a peak about 10,000 years ago.

There are multiple carbon cycles around us. One is the obvious, atmospheric one, where rotting plants and animals release CO2 and growing plants sequester it again, on a timescale of dozens of years. Another is a long term one, in which carbon contained in marine sediments gets transported deeper in the tectonic subduction zones, the hydrated carbon-containing rocks then get into areas of high temperature where their hydration decreases the melting point of those rocks and the surrounding environment to the extent that the rocks melt, form magma, and volcanism and CO2 outgassing ensues. That happens on a timescale of millions, or tens of millions of years.

There seems to be a number of feedback loops in this latter process that increase the absorption of CO2 if there is too much of it in the atmosphere and decrease its absorption if there is too little of it, while the volcanoes dump the CO2 back at some random but I guess roughly stable rate (dictated by how fast the tectonic plates deliver the carbon to be outgassed, and this process doesn't care much about what happens on the surface).

And here's the problem: The carbon we're digging up and burning right now had been slowly deposited from the atmosphere into coal and oil over a long period before we started mining it, while these feedback mechanisms had enough time to continuously keep the atmospheric CO2 at some level using these deep crust reservoirs. But now we've been burning the coal and oil quickly, so even though the total amount of carbon in the air, water, continental rocks, marine rocks and generally, the Earth's crust stays the same, and while these feedback loops are still in effect, we're pumping it from continental rocks (coal, really) into the atmosphere way faster than these long-term cycles can re-deposit it back into the deeper crust.

(But I'm Not A Geologist, so if there is one around and I got it wrong, please correct me on anything.)

When exactly did China and Russia tell you they want to ensure our mutual assured destruction? Ever heard of the Kyoto treaty? I don't think it was China or Russia that caused that to fail.

China agreed to it because they were exempted from everything, and they are now the largest producer of CO2 on the planet. Besides, it was actually the Kyoto protocol itself [ucar.edu] that failed, regardless of who did and didn't sign up for it.

That depends on what you mean by modified. The data have been adjusted to account for a sorts of things that cause problems with the temperature record such as changes in instrument, changes in weather station location, changes in the time of day of observations, changes in the environment around the weather stations. Scientists wouldn't be doing their job if they didn't account for and make adjustments for those issues.

If you think they modified the temperature records just to produce a result they wanted then it's up to you to prove it scientifically.

So you accept that the data is modified. Problem for your clean argument is that it seems that the data has been modified in such a way that increases the global warming trend, by increasing the bias of the data upwards as it gets newer. Without "correction" there is no global warming (or it is minimal at best). I would LOVE to see how that is justified.... I suspect that what we have is circular logic by the activists, mainly because the RAW data is not usually used by most of these studies...

Warming did not stop for 15 years, for chrissakes, and you're either a liar or a moron for restating it.

Not really. Notice the word "warmists" there? It's used as a tribal identifier. In other words, symbolset's post is actually a boast against a perceived other tribe - no different than "your mom's fat". The actual content of the message is not about your mother's body composition, nor Earth's climate, but rather "this is our territory!". It's only an unfortunate accident of evolution that we use the same mechanism for establishing dominance than we use for problem-solving.

It's quite fascinating how much of human communication is utterly unrelated to its nominal content. And it also explains why these discussions tend to degenerate into poo-flinging contests in short order.

What has significantly changed in the oceans so that they do not play a role any other year except when it appears to be cooling or the warming lapsed?

Here is the problem with blaming the oceans. The effects we see aren't new. They were present in the past, and they will be present in the future. Separating them out when it is convenient is misleading and outright dishonest, because you do not separate the effects at other times. It is like saying 2+2 is 4 except when you have less than 5 and some weird unknow

You are way off base there. You have to separate the ocean data because its temperature changes MUCH slower than everything else; ask your neighborhood chemist about specific heat. The problem is that because of this it has a longer lasting effect than the ground or air temps, because it is going to retain that heat into the fall and winter months, keeping those temperatures higher.

Ok, then separate the oceans from the other readings and claims of warming. That is the point: either it is built in, meaning you walk outside and take the temp and it is accurate despite the oceans, or it is not and needs to be separated completely.

The problem is that the oceans also contribute to the recorded heat. Only considering them separate when it is convenient or when the temp readings are inconvenient is misleading bordering on fraud.

What has significantly changed in the oceans so that they do not play a role any other year except when it appears to be cooling or the warming lapsed?

Sigh...

Are you on about that again? Nothing has significantly changed in the oceans. They still do what they do. What's changed is the level of data we have about them since the Argo floats were deployed starting in 2002.

Global warming = energy captured by excess CO2.
Climate change = how global warming impacts the climate.
Climate disruption = economic change due to climate change.
The first two came into use at almost the same time. They are all related but different things. No one is changing anything. I understand that the media confuses the terms, and some pundits use that as some sort of ad hom attack, but you need to be better than that.

Google "Frank Luntz memo"; that's where this stupid "change of name" thing originated. The term "global warming" was coined in the seventies to describe observations of climate change; the phrase "climate change" goes back to the 50's. People in the 1950's were not looking for global warming, they were trying to explain the ice ages, which cannot be done without understanding the role of CO2. They detected AGW as a consequence of that investigation. It was not until the 1950's that spectroscopes (for heat-seeking missiles) became good enough to separate the H2O and CO2 absorption spectra. Until that time the H2O absorption spectrum was assumed to overlap the CO2 absorption spectrum, and thus CO2 was thought to have no effect on climate.

Nobody in their right mind is saying that it wasn't cooler 200 years ago; there are enough accounts of both the Thames and Hudson rivers routinely freezing over back then, even if you completely ignore temperature readings. However, thermometers back then were nowhere near as accurate as they are now, and the idea of using those records to get a global temperature that's accurate to 0.1 degree Celsius from measurements that were, at best, accurate to half a degree is just plain ludicrous.

and the idea of using those records to get a global temperature that's accurate to 0.1 degree Celsius from measurements that were, at best, accurate to half a degree is just plain ludicrous.

The central limit theorem is ridiculous, who knew? Seriously, for what you claim to be true you must demonstrate that the errors of measurement are not normally distributed across the independent samples. Pretty sure there's a Nobel Prize waiting for whoever saves the world by solving that "conundrum".

It hardly seems credible that with all that world record cold virtually everywhere (except for the Pacific El Nino event), that May could have ALSO been a "record warm" month. It just doesn't add up. Just like so many of NOAA's other figures.... When you have huge areas of the globe showing normal to historical record lows, then in order for the Earth to be "warmer than normal", much less record warm, you would also have to have large areas that were extraordinarily hot during the same period. I have seen

It was the British Empire back then; the Scots and the Welsh went out and conquered the world too. Remember the 24th Regiment singing Men of Harlech as they fought off the Zulus at Rorke's Drift in 1879.

You must have failed your stats 101 course in college. Let me educate you: any random sampling has limited accuracy. A random sampling of real temperatures made using low quality thermometers will probably have very limited accuracy. But as you take more samples, you will either find that the measurements are biased or that they are normally distributed. If they are biased, you should be able to model the bias and apply a correction to achieve a normal distribution. Once you have a normal distribution, you know that an increase in the number of samples allows you to say with a greater degree of confidence that the real value is within a certain range of the mean of those samples. So, even if you have shitty thermometers, you may be able to draw useful conclusions from them if you have thousands of measurements. This is the kind of stuff that the big boys talk about when they prepare journal articles for peer review. You know, p < 0.05 and all that.

Yes, to bring it back to a concrete example that most can understand: in baseball, your batting average is composed of measurements that are either a 1 or a 0, yet batting averages are commonly calculated to 3 decimal places. Even if a thermometer is only accurate to +/- 1 degree, it's still reasonable to calculate to several decimal places when you combine thousands of measurements.
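The averaging effect is easy to simulate; a hedged sketch, where the 15.0 C "true" value and the +/- 0.5-degree noise are invented for illustration:

```python
import random
import statistics

TRUE_MEAN = 15.0  # hypothetical "true" temperature, purely illustrative

def estimate(n, noise=0.5):
    """Average n readings from a thermometer whose individual readings
    are off by up to +/- `noise` degrees (uniform random error)."""
    readings = [TRUE_MEAN + random.uniform(-noise, noise) for _ in range(n)]
    return statistics.fmean(readings)

random.seed(42)
# One reading can miss by half a degree; ten thousand readings pin the
# mean down to a few thousandths (standard error ~ noise / sqrt(3n)):
print(abs(estimate(10) - TRUE_MEAN))
print(abs(estimate(10_000) - TRUE_MEAN))
```

Same principle as the batting average: individually coarse measurements, combined in bulk, yield an estimate far finer than any single measurement.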

All well and good, but we're not looking at the absolute temperature, we're looking at the change in temperature. A thermometer is quite likely to give you an inaccurate spot reading, but the error will be consistent across all the spot readings from that thermometer. I.e., there is much less room for error when looking at the variance over time, which is what we are interested in.
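A quick sketch of why a constant instrument offset drops out of the trend; the 0.8-degree bias and the 0.01 C/yr trend are invented illustration numbers, not data:

```python
# Illustrative only: a thermometer that always reads 0.8 degrees high
# still recovers a warming trend, because a constant offset cancels
# when readings are differenced over time.
BIAS = 0.8
true_temps = [14.0 + 0.01 * year for year in range(100)]  # 0.01 C/yr trend
readings = [t + BIAS for t in true_temps]

true_trend = (true_temps[-1] - true_temps[0]) / 99
measured_trend = (readings[-1] - readings[0]) / 99
print(round(true_trend, 4), round(measured_trend, 4))  # the two trends match
```

This is also why climate series are published as anomalies relative to a baseline rather than absolute temperatures: any fixed per-station offset subtracts out.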

You can demonstrate this quite easily as many others have already done, take the "best" 100 stations listed by the contrarian Ant

Well if we wanted to we could look at global temp history for a long period of time and have it be really accurate.

Of course looking at that we would find that the Earth is in a fairly moderate area between much cooler global temps and much higher global temps.

The earth has been in the not so distant past very uncomfortable for humans in both directions. The earth some day will get much more tropical again. The earth will again see Ice Ages. These are true statements. Nothing we can do in the next 50 years will give us the technological expertise needed to do much of anything about warming or cooling.

Nothing we can do in the next 50 years will give us the technological expertise needed to do much of anything about warming or cooling.

It has been known for a while now that CO2 has been the "thermostat of global climate" for at least as long as multi-cellular life has been around. In the past CO2 was a natural positive feedback with respect to the direction of change, thus amplifying the ice age changes that are driven by orbital cycles. This effect is still at work today and it's perfectly feasible to control the CO2 concentration to ward off the extremes of these changes. We have been doing this for a while now, but in the wrong direction: over the next 40 years we will add half a trillion tons of CO2 to the air, the same amount as we have put into the air over the last 250 years. The spike in CO2 due to man is not the highest in the geologic record, but it is almost certainly the most rapid change in levels (up 30% in 250 years and accelerating exponentially).

It's been clearly shown to all independent observers that we have already significantly altered climate with the first half trillion tons; we can easily observe the North Pole melting over the last few decades. So given all that, what makes you think the next half trillion tons won't have a cumulative effect on climate? We could, if we so desired, start scrubbing CO2 from the atmosphere today to restore the climate to the equilibrium our agriculture and civilizations have evolved to suit; the technology is there to do whatever the politics wills with CO2 levels over the next 50 years.
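For scale, the doubling claim above (first half-trillion tons over ~250 years, the second over ~40 years) implies a growth rate that is easy to work out; treating cumulative emissions as a clean exponential with a 40-year doubling time is my simplifying assumption, not something the comment asserts:

```python
import math

# If cumulative CO2 emissions double in ~40 years (the comment's figure),
# the implied exponential growth rate of the cumulative total is ln(2)/40.
doubling_time_years = 40.0
growth_rate = math.log(2) / doubling_time_years
print(f"{100 * growth_rate:.1f}% per year")  # ~1.7% per year
```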

It's called statistics, moron. The thermometers were surprisingly accurate, there were quite a lot of them actually, and by being smart about it, scientists can produce very good estimates of global temps back that far. Yes, the error bars are larger than recent estimates, but there's nothing controversial or fundamentally difficult about making estimates back that far.

You know, there's this funny theorem that in the absence of systematic errors, averaging measurements from multiple instruments (each of them systematically, but randomly-per-instrument overshooting or undershooting) and measurement times (each instrument having a random error here) keeps the mean centered on its true value while decreasing the measurement error.

Or not. Depends on the distribution of the errors. If they truly are random errors that happen to be distributed around the actual value, then yes. But this distribution of errors is not always true.

Also, remember that we are discussing discrete measurements made on unique devices with unknown accuracy/errors which are NOT duplicated, but are for unique locations which we may or may not accurately know, at varying intervals, under conditions which can have huge impact on measurements. You may be able to normalize away some of these variables in some cases, but when you do this for multiple variables, your data set does not improve in accuracy.

Say you had 50 recorded measurements for the same time and place taken by 50 devices/methods you might be able to claim better accuracy than just one, but only if you have a reasonable distribution of measured values. If say 25 of your devices just gave you random numbers (didn't measure anything) your data set will still be corrupted.

But the issue here is NOT accuracy but TRENDS. Given that we are NOT using the same devices for the last 200 years, there will be little you can do with the data in regards to trend information. Through the years we have changed how and where we measure temperatures. This means that the absolute error in each data point will remain because it's about how much things have changed. Problem is, there are things that have changed which have nothing to do with the data. Locations, equipment, and techniques have all changed over the last 200 years, many of these changes are invisible in the data, but can change the trends you see in it.

For instance, say you measured temperature data in the middle of a grassy field for 200 years. What happens when part of the field gets paved about 70 years ago? Now add in that the area around the field starts to see a lot of buildings, and about 50 years ago, light aircraft traffic (piston engines). Then in the late 60's that traffic switches over to jet turbines and a lot more of the field gets paved. Now, consider how many data points this might actually be in the USA data set, given that a high percentage of "observations" are made at airports these days, and tell me how much effect that has on your TREND or how you think you are going to normalize that out of the data?

Or not. Depends on the distribution of the errors. If they truly are random errors that happen to be distributed around the actual value, then yes. But this distribution of errors is not always true. Also, remember that we are discussing discrete measurements made on unique devices with unknown accuracy/errors which are NOT duplicated, but are for unique locations which we may or may not accurately know, at varying intervals, under conditions which can have huge impact on measurements. You may be able to normalize away some of these variables in some cases, but when you do this for multiple variables, your data set does not improve in accuracy.

Every thermometer makes random errors in individual measurements, and every thermometer can have a systematic error, but I'd really like to see a reason why this per-instrument systematic error should be biased in average. Calibrating processes for measurement instruments surely don't work this way.

But the issue here is NOT accuracy but TRENDS. Given that we are NOT using the same devices for the last 200 years, there will be little you can do with the data in regards to trend information.

I don't know about the rest of the world, but I'm pretty sure our local stations keep records of major instrument overhauls which can be taken into consideration. In addition, if we're looking for trends, as you say, the additive systematic error of temperature measurements is automatically cancelled and the importance of the multiplicative systematic error is diminished given that we're looking at small deltas and the measurement error is going to be similarly small, by virtue of being proportional to the delta, so (unless something is eluding me here) the trends shouldn't be obscured by either systematic additive error or systematic multiplicative error of any given measurement instrument.

For instance, Say you measured temperature data in the middle of a grassy field for 200 years. What happens when part of the field gets paved about 70 years ago?

Again, fortunately, we happen to have records of these things so that we can make informed decisions as to which measurements are candidates for exclusion due to human development introducing local environmental biases.

There's a station here that's been continuously recording the daily data meticulously since the 1780s. In the first half of 1800s, there were up to twenty daily measurements there (before the international custom of three measurements per day became established). Of course everything relevant gets recorded and logged. Don't underestimate the power of the Force^H^H^H^H^HOCD.

Just because you think something couldn't possibly be true doesn't mean it isn't.

Thermometer manufacture was precise enough to produce very high quality instruments by the late 1800s. The issue, as GP points out, isn't random errors popping up (thermometers were about as accurate as rulers could be in the late 1800s), but rather whether we know how the instruments were calibrated and how older scales might match up to instruments that were calibrated against modern international standards.

And, at least in the U.S., regular calibration standards had been agreed upon by the first couple of decades of the 1900s. Good instruments even decades before then should have very little random error -- the question is only whether anyone bothered to check new equipment calibrated to international standards against the old equipment (or sent the old equipment to be tested once such standards were developed).

So, then you have to ask yourself: people who are bothering to meticulously record scientific data continuously for decades on end -- and they're not going to even bother to check whether their old instruments line up properly with new calibration standards?

Sure, I'm positive there are plenty of places that don't have such records. But we have continuous logbooks going back for centuries in many places. The idea that people taking meticulous records wouldn't even bother to check new equipment against old just seems a little ridiculous... not saying it wouldn't happen, but we have plenty of data points where it did happen, enough to extrapolate estimates for the error distributions.

You're acting like temperature measurement in the late 1800s was people guessing random numbers or drawing them out of a hat. But that's not what it was like, and there were lots of places with VERY detailed records.

I'm not sure the issue is really temperature rather than heat-flux balance. For example, it seems that most of the new energy absorbed by the climate system nowadays goes into the oceans, which may not be immediately reflected in what we perceive as local (or even global) contemporary climate. Better ask your friendly neighborhood qualified climatologist for a more erudite response.

Scant coverage a century ago? Hell, it was scant coverage in the '70s. But that's supposed to be fully accurate too. Never mind that the number of remote monitoring stations has dropped through the floor. Hell, my city here in Ontario doesn't even have a monitoring station anymore; everything is "estimated" from nearly 40 km away in London, Ontario. At the airport, surrounded by asphalt. We had a monitoring station 15 years ago, but it's gone. My sister's town in Alberta? 52 km away, in the shadow of a mo

After surfacestations came out with their list, scientists decided to check it out. They compared the temperature trends for well-sited weather stations to the trends for poorly sited stations. They found that the poorly sited stations actually show a slightly lower temperature trend than the well-sited ones.

NOAA used the site ratings from surfacestations.org to construct two national time series. One was the full data set, using all weather stations; the other used only Class 1 and Class 2 weather stations, those rated good or best.
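A minimal sketch of that comparison method, using synthetic station records (the numbers and station counts here are made up; only the technique -- fit a trend per station, then compare averages across siting classes -- reflects what's described above):

```python
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1980, 2020)

def station_series(trend):
    # Synthetic anomaly record: a linear trend plus measurement noise.
    return trend * (years - years[0]) + rng.normal(0.0, 0.3, years.size)

# Hypothetical underlying trends (C/yr) for the two siting classes.
well_sited = [station_series(0.020) for _ in range(20)]    # Class 1/2
poorly_sited = [station_series(0.018) for _ in range(20)]  # Class 3-5

def mean_trend(records):
    # Slope of a least-squares linear fit for each station, averaged.
    return np.mean([np.polyfit(years, r, 1)[0] for r in records])

print(f"well sited:   {mean_trend(well_sited):.4f} C/yr")
print(f"poorly sited: {mean_trend(poorly_sited):.4f} C/yr")
```

The point of the exercise is that if poor siting were inflating the warming signal, the poorly sited group would show the *larger* average trend; the analyses referenced above found the opposite.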

There are certainly a lot more people living on the Earth today than in past millennia. But we also know how to sustain way more people on far less land than was necessary then. Remember, world hunger is not actually a food-production problem; it's a financial and political one. In fact, here in the US one of our more energy- and land-intensive crops is heavily subsidized to prevent overproduction, and we still turn a lot of it into ethanol pretty muc

Nope. There are still a number of "skeptics" claiming that the earth is not warming, and most scientists believe we can still avoid the worst of the warming yet to come (though some significant warming is now inevitable). Also, most economic studies of climate impact and mitigation (e.g. the Stern Review) have concluded that it will be much cheaper to mitigate CO2 emissions ASAP and avoid the costs of adaptation.

Depends which temperature proxy you look at, but on average, nope [wikimedia.org].

Of course there may have been warmer days in the "recent" past, but we have no records of that, so the article's claim stands. And your claim requires ignoring most of the various proxy reconstructions that have been done, so it doesn't hold up either.

In short, the first link looks like a genuine and well-done paper, but it doesn't say anything about global temperature. The second link is suspicious as hell, and is either deliberately written to push an agenda or a terrible attempt at science that would never pass peer review.

Your ad hominem was highly predictable. Try responding to the content instead of looking for nefarious motives. It's a meta-study of other papers. It's well referenced, and you can go read all the papers they cite yourself.

The point is, there are MANY studies (go find your own citations, if you don't like mine) that show that the MWP was, indeed, a global phenomenon. I won't even try to explain why that's not widely reported, nor the entire history of scrubbing the episode out of the IPCC reports.