AGU Fall Meeting

I’m leaving tomorrow for San Francisco and will be presenting at the 8 am Union session 11-B on Monday morning. It takes me a long time to prepare short presentations. When I look at them, I wonder why it took so long.

Al Gore is heading an AGU session on Thursday. If the convention center has wireless connections, I’ll try to post up some comments on sessions while I’m there but no promises.

“Mankind has had less effect on global warming than previously supposed, a United Nations report on climate change will claim next year.

The UN Intergovernmental Panel on Climate Change says there can be little doubt that humans are responsible for warming the planet, but the organisation has reduced its overall estimate of this effect by 25 per cent.

I just noticed it. But it’s actually a pretty tricky quote. They’re implying that just because the warmers are refining / backtracking a bit that skeptics should be satisfied rather than calling for more study before making major changes. In fact we’re claiming much more than that. The whole warmer program is in need of major auditing and overhaul. Who knows what the final results will be but we won’t be satisfied with a tweak in the doomsday predictions which are constantly presented.

From this quote it sure looks like they intend to keep pushing their scare tactics tho’.

Scientists insist that the lower estimates for sea levels and the human impact on global warming are simply a refinement due to better data on how climate works rather than a reduction in the risk posed by global warming.

Wasn’t the supposed catastrophic rise in sea level part of the risk? What counters the reduced sea level estimate that keeps the purported risk from being reduced?

Large amounts of heat have been absorbed by the oceans, masking the warming effect.

Prof Rick Battarbee, the director of the Environmental Change Research Centre at University College London, warned these masking effects had helped to delay global warming but would lead to larger changes in the future.

He said: “The oceans have been acting like giant storage heaters by trapping heat and carbon dioxide. They might be a bit of a time-bomb, as they have been masking the real effects of the carbon dioxide we have been releasing into the atmosphere.

“People are very worried about what will happen in 2030 to 2050, as we think that at that point the oceans will no longer be able to absorb the carbon dioxide being emitted. It will be a tipping point and that is why it is now critical to act to counter any acceleration that will occur when this happens.”

Prof Battarbee ticks the boxes: “time bomb”, “tipping point”, “very worried”, “critical”, “acceleration”. He also says the oceans are storing heat. This is his way of obscuring the divergence problem.

Looks like Steve McIntyre is moving the debate forward with discussion of a topical development: “The Impact of NRC Recommendations on Climate Reconstructions”

Michael Mann on the other hand remains a little more stuck in a rut with: “Proxy-Based Reconstructions of Past Hemispheric and Global Mean Surface Temperature Variations”

Meanwhile, Gavin Schmidt breaks ranks with the hockey team by assuming the debate is not over with “Science blogging: RealClimate.org and the Global Warming debate”

And I am sure we can count on the Honorable Al Gore in his presentation, “The Role of Science and the Media in Policy Making”, to present a little more of the wisdom we saw in his recent newspaper article, where he stated: “There is a reason why new scientific research is peer-reviewed and then published in journals such as Science, Nature, and the Geophysical Research Letters, rather than the newspapers. The process is designed to ensure that trained scientists review the framing of the questions that are asked, the research and methodologies used to pursue the answers offered and even, in some cases, to monitor the funding of the laboratories — all in order to ensure that errors and biases are detected and corrected before reaching the public.” (Sunday Telegraph, 19 Nov 2006, UK). But then, as he suggests himself, we shouldn’t believe all we read in the newspapers.

The most common reason for the funny question marks is the left- and right-slanted “smart” quotation marks which some word processors substitute for the standard vertical quotation marks ("). This is controlled by a preference; in Word it’s under “Autocorrect”.
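If anyone wants to clean up such text after the fact, here is a minimal Python sketch (the function name is mine, not from any particular tool) that maps the four common curly quote characters back to straight ASCII, which avoids the mojibake question marks when text crosses encodings:

```python
# Map "smart" (curly) quotation marks to plain ASCII equivalents.
SMART_QUOTES = {
    "\u201c": '"',  # left double quote
    "\u201d": '"',  # right double quote
    "\u2018": "'",  # left single quote
    "\u2019": "'",  # right single quote
}

def straighten_quotes(text: str) -> str:
    """Replace curly quotes with their straight ASCII counterparts."""
    for smart, plain in SMART_QUOTES.items():
        text = text.replace(smart, plain)
    return text

print(straighten_quotes("\u201cAutocorrect\u201d"))  # prints "Autocorrect" with straight quotes
```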

#20: John A, two guesses:
a) One of the co-authors is presenting the paper
b) The session chair: “Unfortunately, we are running out of time, and have time only for one question. Yes, the gentleman back there” (pointing in the completely opposite direction from Mr. McIntyre).

RE: #6 – Tempting … although with this flu I am getting over, I’d probably only be able to justify meeting up with y’all if you’d bring Algore, Mann and Gavin along, for some beers and spirited discussion. Now *that* would be amazing!😉

SteveM just finished (Mann, Nychka, others, already spoke; I’ll comment more later). Steve presented many of the graphs that we have seen here on CA, and raised a lot of troubling issues about the proxy records. There were a few “comments” — other speakers were allowed “questions” (not sure why North handled SteveM differently) — and Malcolm Hughes took the opportunity to offer some extremely rude words about “not knowing where to start”, how SteveM’s approach to Yamal is based on misunderstanding and it would earn him a C as a graduate student, and needing a whole book to respond to the “innuendo”. Michael Mann (predictably) stated that they’ve “moved on” and the results no longer rely on the 1998 paper (except, as SteveM showed, they all use the same proxies).

Wilson just spoke on the divergence problem — “has to be addressed” — and “MORE DATA ARE NEEDED”. He points out that mean temps may not be the right measure; maximums might be better. Mann made a comment (I missed it).

A comical interlude: Ritson brought overheads — the old-fashioned translucent sheets that one puts onto a projector — and the AV people can’t get it to work. I guess everyone has forgotten the old ways. All I’ve been able to glean so far is that Ritson has a typo in his title. Oops!

…Malcolm Hughes took the opportunity to offer some extremely rude words about “not knowing where to start”, how SteveM’s approach to Yamal is based on misunderstanding and it would earn him a C as a graduate student, and needing a whole book to respond to the “innuendo”.

This is standard Hockey Team obfuscation: how many times have you read on RealClimate something to the effect that “there are so many mistakes it’s difficult to know where to begin… blah blah blah”? Somehow the Hockey Team have got to “move on” from these tedious clichés and start answering criticisms with something substantive.

#28 Jean S,
There were probably around 350 people during the first part of the session, in which Steve spoke. That includes a fair number of people standing at the back and sides of the room. It was completely full.

Steve M – Very nice slide deck. Judging from some of the obnoxious responses (love that C grad student smear) all the right buttons were pushed. Hopefully the real geological scientists there were savvy to what was going on. Was Ken MacDonald there? Presenting?

Academia amazes me with its rudeness. When I returned to my old university to award thesis support grants, I looked over the audience of grad students and faculty and told the students that the biggest difference they would encounter as geologists in industry would be that even if someone disagreed with you, they would always be polite. I guess they need to hear that speech more often. Congratulations on your C grade. Perhaps you should do an investor’s audit for the esteemed professor.

Certainly a few slights back and forth can be expected when one is showing flaws in another’s work. In the end, though, Steve’s ultimate criticism lies with the team’s work, and the team’s ultimate criticism lies with Steve. It is sad that these people think such behavior is OK.

In graduate school a “C” is not a grade to be congratulated on. At least when I was in school, you couldn’t even count a C as credit toward your requirements if it was in your area of study. So in essence he was saying that if Steve were studying climatology, he’d flunk.

It will be fascinating to hear from the participants how it went. One of the things I am seeing is that Steve got to give a platform presentation at a big scientific meeting in the same session as Mann, Hughes, et al. From the sound of it, SM can no longer be dismissed out of hand; instead, the big boys are actually having to defend the stance they have taken. With a roomful of 350 scientists, there is only so far you can go with bluster. Someone will point out that the emperor isn’t wearing any clothes!

I also don’t understand Hughes’ reported comments. SM is showing data that has been published in three independent papers; if the argument is that one of these papers is seriously flawed, then Hughes should contact the authors, and they should organise a retraction. If they aren’t flawed, I don’t see why SM can’t compare them in the context of reconstructions.

It was a very interesting session. I’m at the poster session now, about 25 feet from SteveM and Michael Mann (both chatting but in different groups).

Wilson’s comments on divergence were clear; it presents a real and unresolved problem. There were also some interesting results presented related to sea-surface temperature; it seems there is pretty strong evidence for a MWP in tropical Pacific proxies.

Most of the scientists here seem to agree that there is a lot more uncertainty about AGW/climate than we usually hear about. No surprise there, I guess.

Hi, everyone. I should try to write a summary while it’s fresh in my mind, but it’s late and I’m tired and I had a nice dinner with good company. A couple of quick notes. There’s some sort of schmozzle with how Mann standardized the proxies in his RegEM method. In one article, they standardized on the calibration period and in another they standardized on the entire period. It seems to make a difference. Mann said that he had “moved on” to Truncated Total Least Squares.

Mann said that he had done calculations with and without tree rings and got the same result. It’s funny that doing calculations without bristlecones or North American tree rings in the 15th century is an egregious error if this is the case, but it’s the Team. (When I walked by the Hilton, there was a sign saying Team Entrance Only, so I guess that’s where they are staying.)

When I collected the examples, I didn’t look at who else was presenting. It was a little uncanny that several results that intrigued me sufficiently to post on the blog were in our session. I mentioned the underwater trees in the Sierra Nevada a few days ago. In the 2nd morning session, Biondi (a NAS panelist) presented the underwater trees, including one picture that I’d shown; Cuffey also referred to California mega-droughts. (Hughes was a co-author on the Biondi study.)

I posted up Alicia Newton’s interesting series in October and showed her series in the panel on ocean sediment series. She was on the 2nd morning panel and gave an excellent presentation. Mann made a know-it-all comment to her that coretops did not reflect recent warming. In a polite southern way, she stuffed him by saying that the comparison data was collected in 1990-94.

Julie Richey of USGS also gave an excellent presentation on Pigmy Basin, Gulf of Mexico. This is the series that Keigwin liked. I started a post on this (they have a poster on the web – I don’t have the url handy) so I was familiar with it. She was very professional – they estimated the MWP warmer than modern. They replicated the results in two different species, G ruber white and G ruber pink. I chatted with her at the poster session and complimented her. Their paper is going to be published in Geology early next year. It sounds like one reviewer was hugely antagonistic and the paper experienced scrutiny and delays uncharacteristic of Mg/Ca papers. Since the analyses had been meticulously done, the antagonism was POV. It was probably Bradley or someone like that.

I think that new data is what will resolve this topic and the two young women are on the right track. There’s zero point re-working the same data over and over.

It was pointed out to me that Hughes’ outburst met with stony silence. I wasn’t offended by the rudeness – I’m pretty used to it by now. That type of outburst usually says more about the outburster than the outburstee, so I doubt that Hughes’ outburst enhanced Team stature in third party eyes.

As to the Yamal comments to which Hughes took exception, by and large, I don’t editorialize very much and my comments tend to be pretty factual. I believe that to be true in this case as well. Briffa 1995; Briffa 2000 and the Polar Urals update are 3 different chronologies with different impact. All three have been used in at least one multiproxy reconstruction. I’m baffled as to what Hughes objected to, other than my existing.

I think Steve’s presentation is excellent (from a look at the slides). Dare I say, it appears clearer and more succinct than others I have seen. To get your message across in such complex arguments, you need a degree of simplicity.

The BBC has decided on its angle, leading their worldwide website with: “Arctic sea ice ‘faces rapid melt’”

The story reads: “The Arctic may be close to a tipping point that sees all-year-round ice disappear very rapidly in the next few decades, US scientists have warned. The latest data presented at the American Geophysical Union Fall Meeting suggests the ice is no longer showing a robust recovery from the summer melt. Last month, the sea that was frozen covered an area that was two million sq km less than the historical average. “That’s an area the size of Alaska,” said leading ice expert Mark Serreze.”

That their so-called ‘Science reporter’, Jonathan Amos, should show such outright bias in his selection of a particular side of the argument is unacceptable. I will be sending them a complaint.

#26. Aside from not getting his acetate to work, Ritson was incoherent. It was pretty embarrassing. He’s well into his 70s. He wore an old sweater with a hole in the sleeve. AGU works on military schedule but he wasted about 10 minutes trying to set up his acetate. It’s hard to tell what his presentation was about – although at one point he alluded darkly to me saying that “McIntyre should know better”. I don’t think that anyone in the audience could figure out what exactly I should know better.

For the record, here’s the comment I sent to the BBC, via the comments section of their news website:

I complain about your selective bias, which directly undermines your policy of impartiality and balance. You led your main news website this morning with a story about potential Arctic ice melting. This was based on one single presentation at a meeting containing 100s of presentations. To simply pick a story with a scary headline is not ‘science’. The AGU meeting contains scientific debate with presentations from across the spectrum of the ‘global warming’ debate. While many politicians and policy-makers are claiming there is no longer a ‘debate’, as a scientist myself, I can assure you there still is, and the AGU meeting is an example of that. It is not for the BBC to ‘choose’ the truth, but to report on the arguments in a balanced manner.

Re #52: Then as a scientist you ought to have been able to point to some other research contradicting the paper they covered. Absent that, I think I agree with the BBC that the summer Arctic sea ice disappearing by 2040 is pretty important. What would you suggest as having been more important? If you think they should be covering something else, you’d better be prepared to be specific.

I’d say the continued existence of the debate about climate science, and the reports by the presenters showing a warmer MWP, are more important than some climate modeler saying “THE WORLD IS GOING TO END! EVERYBODY PANIC”. The BBC failed to note that the Arctic has been ice-free several times in the Holocene, and the world didn’t end … which shows that neither you nor the BBC reporter have done your homework.

Re #54: I don’t know, Willis, is there even one glaciologist who agrees with Harvey’s conclusions? Given that I’ve never seen that argument made anywhere else I kind of suspect not. Did the ACIA include any reference to it? Also, any idea where Harvey got off to? He never did return as promised. One wonders if he was interpreting those proxies correctly.

Willis, considering for a moment the big picture, if the Arctic was indeed that much warmer than present for several thousand years, why don’t we have a record of substantial melt of Greenland? Very mysterious.

Regardless, please do send the text of that post off to the Beeb. Apparently Reuters (via the Sydney Morning Herald), the New York Times and the Independent need it too. Somehow I suspect that list will be much larger real soon.


Re: Steve Bloom (53 & 56). The fact that these different media outlets are repeating a very similar story suggests to me that they are working from the same press release. The BBC should perhaps consider whether it’s getting its money’s worth in sending their science reporter to San Francisco when he could regurgitate press releases from home (said with tongue in cheek).

You requested I submit references. My comment to the BBC was not against that particular paper, but against the picking of a single one-sided presentation out of many because it was good for a scary headline. Also, this is a discussion board, not a journal. We are all familiar with the same arguments and papers. The difference is our interpretation of them.

You ask what is of more importance than the findings of a computer model predicting loss of Arctic sea ice by 2040. First, this is nothing new: links adjacent to the article point to very similar stories from Sept 04 and Sept 05 http://news.bbc.co.uk/2/hi/science/nature/6171053.stm . In the meantime, the BBC has already found more critical matters in Iraq to lead on. Apart from that, there are the facts that millions die every year of malaria, malnutrition and poor drinking water, and that many thousands die from breathing in pollution. Our understanding of the scale of these problems is based on hard facts and data — not computer models or questionable statistics. We certainly have the capability to address these problems more successfully. It will be a shame if our attention and efforts are diverted to deal with “the biggest theory facing mankind”.

With regard to modeling, as an experienced modeler myself (of 3D groundwater flow and contaminant transport) I know how much model results are influenced by the preconceptions of the modeler. You can spend a long time carefully constructing your model, taking account of all relevant information and assumptions. You then run it. Sometimes it produces a result perhaps 100 times what you expected, or in the opposite direction. Then you realize you must have missed something critical and reassess it. You adjust it until it’s “fixed”. Other times, it produces a result somewhere near what you expected. Then you are happy. Yes, this is an over-simplified and somewhat cynical view of the process, but it is wrong to suggest modeling results represent a proof. With climate models, if they assume recent warming is caused by CO2 emissions, then it is inevitable they will predict an ongoing temperature rise, and in the case of the Arctic-ice model, ongoing melting.
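The point that the assumption drives the output can be made with a deliberately trivial sketch (the functional form and all numbers below are hypothetical, chosen only for illustration, not taken from any actual climate model): if a model hard-wires the response as proportional to log CO2, then a rising-CO2 scenario can only produce a rising forecast, whatever sensitivity the modeler dials in.

```python
import math

def forecast(co2_ppm: float, sensitivity: float, base_ppm: float = 280.0) -> float:
    """Anomaly (deg C) under an assumed log-CO2 response.
    The direction of the forecast is baked in by the assumed form,
    not discovered from any data."""
    return sensitivity * math.log(co2_ppm / base_ppm) / math.log(2.0)

# Whatever 'sensitivity' the modeler picks, rising CO2 => rising forecast.
for s in (1.5, 3.0, 4.5):
    assert forecast(560.0, s) > forecast(380.0, s) > 0.0
```

The assertions cannot fail for any positive sensitivity; that is exactly the point being made about preconceptions.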

What I hope we can agree on is that the scientific debate is not over.

Regarding the polar sea ice, it melts back starting in late July and then starts freezing back in September. Right now, it has frozen back to the coasts. The melting happens for only one month of the year, August, and it is frozen for the other 10 and a half months of the year.

Right now, the Arctic circle is in 6 months of darkness, as it has been every year since the planet formed 4.5 billion years ago. It is -35C at the north pole, 24 hours a day through the long night. There is no melting.

Other than a few unusual periods, the poles have always been frozen solid throughout the history of the planet. The BBC should tell the scientific story, not the hyped story.

UPDATE (AUG 16 2006):
1. A line of the post-processing code (postprocessing.m) was inadvertently truncated when posted. Delete line 16 and insert the following line in its place:

[nyears,nvars]=size(X); gridsnonorm=X.*(repmat(sd(1:1312),nyears,1));

2. We have identified a problem in the version of the code provided here, which leads to a sensitivity of results to the time interval used to center and standardize the data. A corrected version of the code which eliminates this sensitivity will be available at this site in the near future.
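To illustrate the kind of sensitivity the update describes (purely with synthetic data; this is not the RegEM code, just a toy of my own construction), consider a composite of two made-up proxies, each standardized either over the full record or over the calibration window only. The pre-calibration level of the composite depends on which window is used:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
calib = slice(150, 200)  # last 50 "years" serve as the calibration window

# Two synthetic proxies: one with early-period drift, one plain white noise.
p1 = np.concatenate([np.linspace(2.0, 0.0, 150), np.zeros(50)]) + rng.normal(0.0, 0.3, n)
p2 = rng.normal(0.0, 1.0, n)

def composite(window):
    """Average the proxies after centering/standardizing each over `window`."""
    z = lambda x: (x - x[window].mean()) / x[window].std()
    return (z(p1) + z(p2)) / 2.0

rec_full = composite(slice(0, n))  # standardize over the full period
rec_cal = composite(calib)         # standardize over the calibration period only

# The pre-calibration level (and hence reconstruction amplitude) differs:
print(rec_full[:150].mean(), rec_cal[:150].mean())
```

Because the drifting proxy has a much smaller standard deviation inside the calibration window than over the full record, standardizing on the calibration window inflates its early-period values relative to full-period standardization, which is the sort of interval sensitivity the posted correction refers to.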

Luckily, after seeing how the Team played with MBH code/data, I took a full backup of those directories last summer. Steve, if you don’t have a copy of those directories, please let me know. It is better that they are archived on more than one hard drive. I think Mark T. might also have the contents of (at least) the jclim2003a directory!?

RE #52, #53 and the Arctic Ice. Melting ice (if it happens) will not kill more people than disease and wars. If the BBC wants to do an unbiased report, they can do an article on the tree fossils and petrified trees at Eureka. Then we would know how warm it was. The polar bears will have to adapt as they have done before.

Steve has a copy of everything in that directory as well. There’s absolutely no compulsion upon Rutherford or anyone else to produce any code, still less allow anyone to replicate their results. [snip]

Having had a night to think about it, and in talking to people after the session, it is clear that SteveM is enormously respected. Everyone — including some of the RC folk — recognizes the rigor and care of SteveM’s work. I am not sure they all like it — I can imagine it is not pleasant to have one’s technical errors exposed with such clarity — but, as scientists, people here at the AGU meeting have taken notice.

I don’t know where it will lead. This year’s AGU Meeting includes lots of AGW-oriented sessions and events, including the Honorable Al Gore on Thursday. Yet, despite all the hoopla, scientists — i.e. the ones we’ve talked with — are this year expressing a healthy skepticism about all things climate (in one-on-one conversations, anyway).

Here is a little more detail on the “schmozzle” from my notes, which are a little cryptic: I’m not a great note taker, you get swamped with the barrage of info, and my turn to speak was coming up. In Mann’s presentation, he acknowledged Lee and Zwiers as having notified him of an error in how they did their ridge estimate. According to Mann, standardization over the calibration period performed much worse than standardization over the full period. He said that they had moved on and that Truncated Total Least Squares solved the problem.

The next speaker, Smerdon, also discussed the matter and I couldn’t tell whether he was talking about the same issue or a slightly different issue. He said that Rutherford 05 standardized on the instrumental period, while Mann et al 2005 standardized on the full period, which he said implied an unrealistic level of knowledge about the proxy. I got the impression that the impact was in the amplitude.

At the end of the day, I think that all of these methods will still end up merely calculating a weighting vector. In a network with (say) a few flippable random walks and mainly white/low-order red noise — which is a good framework for thinking about these things — there’s no “right” method. My hunch is that some of the methods will give more weight to the random walks and others more weight to the white noise.
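That hunch can be tested in miniature. The sketch below (synthetic data only; this is a generic correlation-weighting scheme of my own, not any published reconstruction method) scores a network of random walks and white-noise series against a trending target over a calibration window. The random walks, with their spurious trends, reliably draw the larger calibration weights:

```python
import numpy as np

def mean_abs_corr(trials=200, seed=0):
    """Average |calibration-period correlation| for random-walk vs white-noise
    'proxies' against a trending synthetic 'temperature' target."""
    rng = np.random.default_rng(seed)
    n, n_walks, n_white, cal = 200, 5, 20, 50
    walk_tot = white_tot = 0.0
    for _ in range(trials):
        walks = np.cumsum(rng.normal(size=(n, n_walks)), axis=0)  # random walks
        white = rng.normal(size=(n, n_white))                     # white noise
        target = np.linspace(0.0, 1.0, n) + rng.normal(0.0, 0.2, n)
        X = np.hstack([walks, white])[-cal:]   # calibration window only
        y = target[-cal:]
        Xc = X - X.mean(axis=0)
        yc = y - y.mean()
        corr = Xc.T @ yc / (np.linalg.norm(Xc, axis=0) * np.linalg.norm(yc))
        walk_tot += np.abs(corr[:n_walks]).mean()
        white_tot += np.abs(corr[n_walks:]).mean()
    return walk_tot / trials, white_tot / trials

walk_w, white_w = mean_abs_corr()
print(walk_w, white_w)  # the random walks draw substantially larger weights
```

Any method that weights proxies by calibration-period fit will therefore tend to load on whichever random walks happen to drift with the target, which is the point of the hunch above.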

Also on Hughes – the only “fact” that he mentioned in his outburst was that MXD was used for dating. I’m not sure what that had to do with the price of eggs or how that showed that 3 different and inconsistent versions of Polar Urals/Yamal were in use.

I have at least some of the code on Rutherford’s site. I’ll be down in the man-cave tonight working on my research proposal so I’ll have a little time to check around to see what I have. What I do remember is that I had to make a few mods here and there for a) readability (it was… ugh) and b) test/debug outputs.
Mark

The immense size of this conference is indicated by the fact that David Lea was presenting Mg/Ca cores at a session exactly overlapping ours and I missed this.

Sejrup presented an interesting high-resolution core from the Nordic Seas, P1003 MC/SC, with up to 25 m in 7400 years constrained by 80 C14 dates, basal date 7400 C14 years, yielding close to decadal resolution. They studied dO18 in N pachyderma, and also constrained sedimentation rate by tephra from 1947 dated to Hekla and at 15 cm from 1918 dated to Katla. A reservoir of 605 years was used. He argued correlations with the South Pole Be10 series (which were closer than to the C14 solar proxy — he said that the difference was a difference in the solar proxies). Other high-res cores mentioned: MC11; GSC 13.

Richey’s presentation was similar to the online poster. 4.4 Mg/Ca was a “robust” coretop number. 43 cm/kyr, about 12 yr/sample. 400 year reservoir. MWP nearly a degree warmer than coretop; 1000-1400 as warm as the late 20th century; LIA was 2 degrees cooler than modern. Replicated in G ruber white and G ruber pink. Pointed out salinity variations.

I posted earlier on Newton et al and her presentation was similar. She said that SST during the MWP was 0.5-1 deg warmer than at present; the LIA was 0.3-0.6 deg cooler. She said that the LGM/Holocene difference was 3.5-4 deg, so that the MWP-LIA fluctuation was a measurable percentage of the LGM-Holocene fluctuation. Reservoir correction was 418 (or maybe 478) years – my figure isn’t clear. The coretop was about 1850-1890, but her comparandum was actual 1990-1994 foraminifera values. I liked her discussion of ITCZ north-south movements as a common explanatory factor in various tropical proxies.

Cuffey said that his presentation was a comment on other people’s work. He prefaced it by saying that there was no dispute about AGW, but that the “anomalous warmth” argument was not a strong one since we don’t have millennial-scale information. He showed a slide with the California medieval mega-drought, one of 3 presentations to mention this (me, Biondi). He spoke directly to Mann, saying that until the divergence problem is resolved, the “rest of us will be skeptical” of tree ring based information. It sounds like there’s been a dispute both within the NAS panel and within IPCC WG1 as to how much weight to put on tree ring info. He said that 1/2 the warming from increased CO2 would happen in 50 years, so that we have not seen most of the impact of in-inventory CO2 increases. He said that the MWP was demonstrably warmer in Greenland, but at the nearby Agassiz ice cap, Fisher’s melt series shows an uptick in 20th century melt but not in the MWP, though it was profoundly warm in the Holocene Optimum. He showed the ice man of the Alps, mentioning Hormes’ diagram of glacier retreats (discussed here). He showed a graphic from Polissar et al from Venezuela (also reviewed at CA in June), but didn’t mention that this showed non-existence of the Venezuelan glaciers in the MWP. He said that Vimeux et al had showed that Andean isotopes reflected Amazon runoff. He discussed Antarctic ice shelf disintegration, which is a new hot button. He said that there was a Medieval Cool Period in the Antarctic – this is based on an unpublished Clow borehole result. The “topology” of the graph looks like the Dahl-Jensen curve with compressed dating; it would be interesting to see how much weight can be placed on this. I asked Hugo Beltrami about ice boreholes; he said that because glaciers move, this creates additional problems for ice boreholes. I wonder whether a mis-estimate of glacier flow rates could compress the scaling.
(Though I remain unconvinced that boreholes are measuring anything meaningful; the proponents are very nice people and very earnest.)

Biondi’s presentation was pretty similar to his online version. He had some nice visual effects in his PPT, inserting videos.

Gajewski gave an interesting presentation on high-resolution lake sediments, extracting pollen diagrams. Data is at lpc.uottawa.ca. He argued that higher resolution results could be extracted from lake sediments than people thought; it was just a matter of more sampling. He presented results from the PC2. I suspect that he didn’t quite realize that this particular audience had heard more about principal components than they wanted to.

RE: #50 – What a load of rubbish. The anomaly has actually incurred multiple zero crossings over the past 3 years. Most recently, it was at zero in September. Winds in the NE Atlantic and Barents Sea have since compressed ice near the ice edge from just NE of Iceland over to near Murmansk. Meanwhile, the advancement of the ice edge in the Bering Sea, which had been rapid from August through early October, slowed to a crawl due to a persistent ridge over the Kamchatka and Chukchi Peninsulas. According to the Anchorage NWS Ice Desk, that freeze ought to pick up rapidly now that the ridge has broken down. Therefore, the so-called unprecedented anomaly quoted by these alarmists is nothing more than a temporary, weeks-long effect of synoptics. Amazing (but not really) how so-called scientists play upon the general public’s complete ignorance of sea ice dynamics by making statements based on observations that only ring true for a few weeks, later to be replaced by completely different conditions.

RE: #59 – Re: the obsession with minor quality-of-life issues in Northern Hemisphere industrialized countries while ignoring the plight of the poor in other parts of the world.

If I may be so bold, there are actually people in the Western industrial countries who have the following agenda:
1) Acceleration of the already amazing fall off in reproduction by the native born in industrialized countries
2) Massive ongoing die off, reduction in average life span and fertility among those resident in the developing world
3) Widespread adoption of “right to die” and eventually, “directed euthanasia” of those deemed “inconvenient.”

Margaret Sanger’s imprint is obvious. The eventual goal is about 100 Million world population, with massive areas of untamed wilderness. A wealthy, mostly white, group would “manage” this New Age world.

RE: #70 – A couple of ice updates. The winds have relaxed in the NE Atlantic / Barents and the freeze there has taken over, the ice is spreading out from the compressed edge. The Bering Strait is nearly frozen across, once that occurs, there will be dramatic acceleration of the ice edge southward. Also, the Cook Inlet started to ice up very early this year – I don’t know if that is counted in the area calculation, but it is an indicator of just how cold the Alaskan landmass has been so far this fall. I cannot rule out another zero crossing for the anomaly, possibly as soon as a couple of weeks from now. In any case, the anomaly is already less than negative 1M Km^2 and moving toward zero rapidly as I write these words.

Willis, considering for a moment the big picture, if the Arctic was indeed that much warmer than present for several thousand years, why don’t we have a record of substantial melt of Greenland? Very mysterious.

Lowest recorded temperature: -70°C (1953, station Northice). Mean annual temperature: between -20°C and -30°C. So the Greenland icecap is without any doubt the coldest place in the Northern Hemisphere, colder than the North Pole.

RE: #74 – Interestingly, I had reported that the current anomaly had a magnitude of between zero and minus 1M Km^2. In fact, it is only negative half a million right now, and in a near-vertical rise toward zero. The innate variation in ice area during the cold part of the year well exceeds +/- 1M Km^2 – safely, call it +/- 2M Km^2. This is nothing new – Inuit and Vikings have known this for thousands of years. Amundsen knew it from his Viking heritage, gambled, and barely made it through the NW passage, taking 2 years, arriving in San Francisco battered but victorious. That was 100 years ago.

RE: #75 – A relative of mine used to fly to Thule back when the DEW line was being built. That is not even on the ice – the ice front is about 1 mile inland. Still, there is no summer there at least in terms of summer weather we would recognize. It can snow any day of the year. The harbor can ice up at any time. Greenland is a very, very cold place.

RE: #81 – The joys of the Gulf of Alaska Low. When that sucker is sitting at just the right point, the jet steers around it and the cold fronts are lined up from here to the Chukchi Sea. There is some talk of another possible low-elevation snow event later this week.

No way… I LIVE in northern Minnesota (OK, central Minnesota)…and we haven’t had a decent snow in years. Last night, the forecast was for snow. Today, it’s for drizzle.

I’m a bit miffed about this… false advertising if you ask me. I moved here expecting a 3′ (1 m) dump at least once a year, closing everything down (I love driving in that stuff). But, nothing… it’s been very lean for snow for the last 7 years.

Re #75: Ferdinand, consider the surface melt extent trend on Greenland with current temps. On the face of it, it is very difficult to imagine thousands of years of a summer sea ice-free Arctic leaving Greenland relatively unperturbed.

#53 — “I think I agree with the BBC that the summer Arctic sea ice disappearing by 2040 is pretty important. What would you suggest as having been more important?”

Summer ice disappearing by 2040 is a GCM projection. Much more important would be propagating the parameter errors through that particular GCM to see the limits of physical accuracy, as opposed to the statistical precision, of the projection.

RE: #87 – As a die hard foul weather skier, I also enjoy it. I see no end to my hobby. Call me naive and overly optimistic, but I truly see no end to it. In fact, I’d probably be within my rights to expect more of it. We live in an extraordinarily benign and warm time, climate wise. It cannot hold.

RE: #88 – The other thing that alarmist Malthusians never want to discuss is the vast impact of wind on ice extent. The wind alone can cause an areal reduction in the NH total of over 1M Km^2 in one week’s time. A strong and persistent wind blowing against the ice edge can move the edge miles a day. Pressure ridges thrust up in the midst of the ice mass. If anything, this “shrinkage” thickens the ice, making it more impervious to melting.

Also, circumpolar winds have major impacts. These cause the overall ice mass to slowly circulate around the pole. The mass tends to be not circular but ellipsoid. Meanwhile, the Arctic has innately warmer and colder waters. The areas just east of the Prime Meridian and on either side of the International Date Line tend to be warmer, especially during the summer. As the ellipsoid slowly rotates, its long axis will roughly align with two meridians. During a year when this axis is aligned with 0 and 180, the summer melt-off is of course more impactful. A year later, the rotation has brought the long axis into alignment with Canada and Siberia – the Canada-pointing lobe will not melt off very much. Once that lobe reaches 180 or 0, it will melt back a lot more. This rotation is very slow. Juxtapose on top of all this the AMO.

Thesis: The expected maximum variation in ice area versus the millennial mean value, for any specific day of the year, is +/- 2M Km^2.

Ok – I get it now. So, as an avid corn snow / glacier skier and glissader, my first “dumb” question is: what is the operational definition of “melt”?

This is no joke. On the same area of snow / ice, on any given day, I might experience crusty ice, hard ice, packed powder, corn snow and slush. The air temperature might be 20 deg F but the wind still and the sun out, hence “spring conditions” and all attendant “meltiness” – at the other extreme, I might be out there on Memorial Day with it 39 deg F, but with clouds and a north wind, in a shady area, and things are as firm as midwinter. How can this “melt” even be measured? Do the people who wrote this paper even have direct experience with the characteristics of the snow/ice–air interface? I suspect not.

Steve B., consider that the inland ice is getting thicker each year, but indeed that the edges are melting due to higher recent (past year 2000) summer temperatures.
But current Greenland summer temperatures are not higher than in the 1930-1940’s, before GHGs had an appreciable increase. See here.
Even edge ice melt was probably faster then. There are of course no satellite measurements from that time, but the breakup point of the largest Greenland glacier was measured (see here). The retreat of the breakup point was faster (with a measured 75 m decrease in the height of the glacier in 5 years’ time!) in the period 1929-1953 (24 years) than in the period 1953-2003 (50 years)…

From #59: “With regard to modeling, as an experienced modeler myself (of 3D groundwater flow and contaminant transport) I know how much model results are influenced by the preconceptions of the modeler. You can spend a long time carefully constructing your model, taking account of all relevant information and assumptions. You then run it. Sometimes it produces a result perhaps 100 times what you expected, or in the opposite direction. Then you realize you must have missed something critical and reassess it. You adjust it until it’s ‘fixed’.”

I have had much the same experience with geologic models for thermal maturity of petroleum source rocks. Any desired answer can be obtained. I only use the models to test input data. If one can get the preconceived answer without resorting to unlikely parameters and input, the concept has merit.

I would love to see Steve aim his mathematical abilities at the models.

Irrespective of what the media report, the real import of this meeting is the opportunity for so many scientists to see Steve’s presentation for themselves, to see the work of others that confirms what he has been saying, and then to witness first the efforts of the Team to circle the wagons and then the Gore hype. Most scientists are deeply offended by deceit, imprecision and deep intransigence, especially when new, improved data become available that cause (or should cause) reflection and re-assessment. The arrogance of the Team, and the irresistible force within Gore that results in him over-stating and hyping his remarks, will be in stark contrast with Steve’s measured presentation and demeanour. The fallout from this conference could be quite significant for the projection of a scientific “consensus” on AGW and its impacts, versus the continued need to improve the science of climate change research and its utility as a policy input.

Gee, Dano, after I hadn’t seen you around here for quite a while, I figured you were thinking up something new to say … welcome back, anyhow. Perhaps you might direct your question to the folks doing coring and fieldwork, ask them why they’re not extending the various series up to the present, I think you might have a misperception about who’s in the audience here, by and large, we’re auditors, not corers and borers …

Re #90: You’re completely misreading those graphics, Hans. Hint: What do the light and dark red colors signify? Further hint: What does the melt season end date used for the graph signify, and what was happening on that date in 2005?

Re #93: Yeah, Konrad’s probably never even been to Greenland. *snork* If you would read the information at the link provided, Steve S., some of your misimpressions might be corrected.

Re #94: Ferdinand, any relatively short-term warming isn’t going to melt Greenland. Whatever temps there were in the ’30s certainly didn’t melt the summer ice, nor has the present warming (yet). The point is that a warming sufficient to melt the summer ice and keep it melted for centuries or millennia would have to take a bite out of Greenland large enough to leave a record. That didn’t happen, therefore neither did the devoutly wished-for HTM ice-free summer Arctic.

A nice example of a special case of “argumentum ad ignorantiam” – the fallacy of putting the burden of proof on the person who denies or questions the assertion being made:

Don’t address Steve M’s criticisms of the various multi-proxy reconstructions; instead, attempt to discredit him by shifting the burden of proof onto him to obtain his own core samples and do his own field work, rather than audit the work of others.

#99, Yeah, that’s pretty much all he’s got. Broken fricking record. He just can’t bring himself to actually address the concerns Steve has raised on the Climate Audit blog.

It would be pretty amusing to see him try to hold a real conversation with some of the folks here though. Like shooting fish in a barrel.

Oohh nooo! Is it Dano’s turn in the cycle again? Why can I never be allowed to read in peace? Dano and Steve B. why not go to Digg and do your posting? There are many of your ilk, I mean, like minded thinkers on that site. you’ll like it! Try it!
Cheers!….theoldhogger

Re # 104.
Talking about a site for like-minded thinkers: what is this site?

Re #102.
I read Dano’s comments differently. To advance climate science it is necessary to have people that do the actual work (field work, modelling, etc.). If there were only people to criticise/audit the work of others, we wouldn’t get very far.

Bloom will spin anything into a frenzy of gloom! [And his purpose was to divert from the AGU discussion.] And he’s alerted his buddy Dano to Climate Audit’s positive strides these days toward real scientific truth – so they’ve banded together for an attack wave in the name of their personal politics. Sheesh. Don’t bother trying to reason with them; it’s impossible.

I’ve looked at the summary for this report.
To me it looks like these people are looking at a specific geologic region like never before [because of advancement in technology and transportation in the last 60 yrs or so]- and these scientists are just barely if at all beginning to understand it. But heck what do I know?

From the Summary:

There are indications that some components of the physical system may be recovering and returning to the recent climatological norms observed from 1950 to 1980. For instance, the pattern of near-surface temperature anomalies for 2000–2005 has been distinctly different from the patterns that characterized the second half of the twentieth century, exhibiting positive (warm) anomalies over the entire Arctic region. Observations from the early spring of 2006 show a pattern more consistent with the two patterns that dominated the twentieth century, with well-defined regions of warm and cool anomalies.

Ocean salinity and temperature profiles taken at the North Pole and the Beaufort Gyre both indicate that since 2000 the dramatic shifts observed in the 1990s have relaxed toward the pre-1990 climatology. On the land, permafrost temperatures continued to rise within most of the permafrost-affected areas, but at a noticeably slower rate than in the 1990s. Changes in the active layer thickness (the relatively thin layer of ground between the surface and permafrost that undergoes seasonal freezing and thawing) are inconsistent. While some of the monitored sites show a slightly increasing trend in the thickness of the active layer, most do not.

There also appears to be a destabilization of several known relationships between climate indices and Arctic physical system characteristics. For example, during the period of satellite observations, starting in 1978, a strong correlation between the Arctic Oscillation index and sea ice conditions had been observed. A positive AO, characterized by a cyclonic atmospheric circulation regime, creates conditions that favor a relatively low sea ice extent. This relationship was clearly evident during the strong positive AO pattern that persisted from 1989 to 1995. Since then, the annual averaged AO index has been exhibiting more neutral conditions, which should support a reversal or, at least, a deceleration in the overall rate of reduction in the extent of the ice cover. Instead, 2002–2005 has been characterized by an unprecedented series of extreme ice extent minima.

The observations highlighted in the report and the mixed tendencies they reveal further illustrate the sensitivity and complexity of the Arctic physical environment. They also support recommendations to maintain and expand efforts to establish a coordinated Arctic observation network, consistently documented by diverse, international activities (e.g., ACIA, 2004, 2005; SEARCH, 2001, 2005; DAMOCLES, 2005). Long-term monitoring of key parameters, coupled with detailed studies of specific processes, will improve the understanding of this region and enable the development of more accurate models and predictions of its future state. The incentive for supporting and achieving these advancements is high, given the relevance of the physical conditions to other key elements of the Arctic environment and global climate system.

#97, Steve. You’re too modest. The HS is one of the three icons of AGW. Once it goes, the models and the surface based temperature measurements become open to serious public doubt and intense questioning.

If you are going to hear Al Gore, when he claims consensus, could you ask him if there is any controlling scientific authority?

Re #112: Paul, was that just a rhetorical point or do you really not understand the context for all of this? I would suggest spending some time with the abstracts over at the AGU fall meeting site to get a sense of how overwhelming the scientific case for AGW has become.

A nice example of a special case of “argumentum ad ignorantiam” – the fallacy of putting the burden of proof on the person who denies or questions the assertion being made.

Yes, it would not have a much better impact to show folks what you’ve been saying.

Who in their right mind would want to waste time collecting their own data to show others how it should be done? Providing some sort of proof to buttress what one has been asserting – pshaw! Fie on ’t.

Think of how little an impact that would have, backing an assertion by demonstrating the right way to do things.

Sounds like you’re advocating, “Bring the Proxies Up to Date.” Oh wait, that was what Steve M called for long ago. I’m glad you agree with him. Now if only those sticklers with misidentified and/or semi-secret proxy locations would come clean…

Yeah, because the boats are smaller and faster, and more easily steered. Wikipedia says the first attempts were “after the Little Ice Age” – and it did get mapped out in spurts “way back when”, otherwise these modern sailors would get lost. lol And from those links, it is still a very lucky thing to make it through in one season.

Capt. Wojciech Jacobson, one of the first Poles, who in the years 1985-1988 successively sailed the Northwest Passage together with Ludomir Maczka in the expedition organized by known explorer Janusz Kurbiel.

hmmm…while quite an adventure, this doesn’t really seem all that unusual.

Who in their right mind would want to waste time collecting their own data to show others how it should be done? Providing some sort of proof to buttress what one has been asserting – pshaw! Fie on ’t.

Think of how little an impact that would have, backing an assertion by demonstrating the right way to do things.

I assume you do know that Mann and Juckes do not collect their own data? This whole debate is not about data acquisition; it’s about data processing. All we need is an R script.

re 125 – right. In 1985–1988, in short summer spurts over 3 years, the first modern adventure yachters managed that feat. It is now being done reasonably regularly in a single season, without icebreaker assistance. Contrary to what Jeff Weffer said in 108, one does NOT necessarily require “the biggest icebreaker that money can buy” to do so.

“And you would to time your trip perfectly so you hit the straits in the second week of August and make it out by the first week of September. Because it is frozen solid for the rest of the year.”

That is still true according to your links. However, Jeff Weffer might be making a prediction about needing the icebreaker; if that’s the snit you are on, he might be wrong or he might be right – we’ll have to wait for next summer. You can’t prove it with your links.

rocks, weffer clearly said such a trip would require “the biggest icebreaker money can buy” AND that it would require perfect timing as well.

He is wrong.

It still isn’t easy – it seems that it is becoming easier over the last couple decades – but private yachts are making that passage unassisted. No icebreaker required. Timing and some luck IS required, but the people attempting it are pulling it off with pretty fair regularity.

Hell, you can book commercial passage on a cruise into the Canadian Archipelago part of the NW passage.

BTW, a minor correction: Willi DeRoos in 1977 was the first modern adventure yachtsman to manage the NW passage. Note the port of call on the NW coast of Greenland.

Yeah, Lee, fine, but you want to get AGW in there as the reason somewhere – thus the overzealous posting about one mistake/prediction he made. You made your point about icebreakers, but the AGW part isn’t settled, and the NW passage is more easily sailed because of technology. The navigation is hard – besides the threat of icy waters and the bergs, fog and harsh weather are a problem too, no matter what the season. The Titanic was a pleasant ride too, up until that one point.

rocks, the NW passage is passable at all because there isn’t as much ice there. Sure, technology helps – but technology isn’t making the ice move out of the way, and weffer’s claim was about the ice and a need for a large icebreaker to navigate that passage. I specifically said in my first post that it was hard – now you point out to me that it isn’t easy. Well, duh.

I said nothing about the reason for lack of ice – you first raised the AGW issue in this subthread – I simply pointed out that a claim that an icebreaker is necessary to navigate that passage was wrong. You barreled in with this kind of knee-jerk, absurd and largely off-topic disputation, ending with “well, yeah, you might be right, but your motivations are wrong.” Gee, I’m crushed.

uhhh, WTF? “there isn’t as much ice there” is exaggeration?
And yes, Canada treats those Canadian territorial waters as Canadian territorial waters. What possible relevance does that have to the point?

The fact the boats get through without an icebreaker means, contrary to what weffer claimed, that one does not need an icebreaker to get through there.

That is what I said, and people seem to be bailing onto an absurd attack on me for pointing this out.

Now it also appears that it is getting easier to do – people are commonly making it in a single season, for example, rather than spending several winters iced in on the way around. Reduced ice is likely at least part of the reason for this, given that we know that the ice season is shortening. But there are also technology enhancements making it easier, so it isn’t simple. None of this, however, is relevant to my initial, simple, and still central point: contrary to what weffer said, it is possible now to sail a summer transit of the NW passage without an icebreaker.

I’m quite astonished that this simple point drew so much heat in return – is it that threatening an observation?

DaveB, weffer made a statement that I knew was not correct, and I responded to it – I knew of the adventure sailing expeditions, they are pretty cool, they make the point that one can get through without an icebreaker, so I pointed them out. Are y’all really that desperate to assign hidden motivations to any statement by anyone who ever challenges anyone here – even when all I did was to clarify an incorrect statement?

Now it also appears that it is getting easier to do – people are commonly making it in a single season, for example, rather than spending several winters iced in on the way around.

An alternative explanation is that we now know so much about the NW passage that was not known 100 (or more) years ago that people know how to get through in one season now. In the past, they were pretty much sailing in the dark.

You might also read the first four and last three paragraphs of that article – your link is wrong, BTW – where they talk about not being able to do “Ice Leave” because they couldn’t find sufficient ice. As long as we’re engaging in cherry picking from popular articles.

Sorry, you said “there isn’t as much ice there”. Now look at the picture.
And here is the correct link: here

I know what the article says, but I don’t have an agenda, and I do not think AGW is causing the loss of ice. When I read it I am not alarmed, Lee, so I am done. “Isn’t much ice there” – not what I see in those pictures, and you are exaggerating IMHO.

I will point out again, however, that we have incontrovertible evidence that there has been, in at least several of the last 30 years or so, a NWP passageway the entire way through the ice, clear enough that small yachts without helicopter support can manage to transit. Not having ever done ice navigation in a small yacht, I will decline to guess how that compares to your ‘not much ice.’

Earle, I will take a stab at picking the top five AGW arguments. First, I would like to say that I am myself very skeptical of AGW and am posting only to see other people’s responses to the lists.
(i.e., I will play the devil’s advocate temporarily)

What I would call the standard persons list:
1) All climate scientists agree that the earth is anthropogenically warming.
2) Recent extreme weather events indicate that something is going wrong.
3) All computer models indicate that an increase in CO2 = an increase in temperature.
4) All current information indicates that the earth has not been this hot for over 1000 yrs.
5) Leave it to the experts.

My list:
1) The lowered ratio of isotopes (c13/c12) indicates for certain that we have changed the composition of CO2 in the atmosphere from burning fossil fuels which is now at a higher concentration than it should be during the natural cycle.
2) While CO2 does not trigger the temperature rise, it definitely acts as a feedback mechanism: the temperature increase heats the ocean, causing more CO2 to be released. Since we have altered this balance, some unpredictable consequences may arise.
3) While the computer models and climate science are still in their infancy, they mostly seem to predict (excluding things like volcanoes etc.) that the earth will warm ~3.5 (?) degrees by 2100.
4) Humans have altered the earth quite dramatically, by deforestation, urbanism and industrialisation; while it is impossible to point the finger entirely at us, it is also difficult not to take some of the responsibility.

Actually I can only think of 4. Sorry, I have to go teach ESL to Chinese students now.

re #130: “Hell, you can book commercial passage on a cruise into the Canadian Archipelago part of the NW passage.”

Being the skeptic/denier type and fairly frequent reader of this blog, I am concerned that there is at least a chance that the globe might cool down over the next twenty years. That could mean no more commercial Cruises through the NW passage. So I googled “Canadian Archipelago Cruise” to see what I could book.

re #130, again: “rocks, weffer clearly said such a trip would require “the biggest icebreaker money can buy” AND that it would require perfect timing as well.”

Turns out the only thing I can find is $15,000 rides on something that may not be the “biggest icebreaker money can buy”, but is definitely an icebreaker, and indeed appears to require perfect timing as well (August).

Lee, if there is something I missed in the NW passage cruise market, please provide a link. I’m guessing I may need to get the cruise done this year or next. I don’t yet trust the gloom-and-doom proxies and GCMs.

And I wanted to add: if you looked at the satellite picture of July 25th, which to us is the height of the summer, the Beaufort Sea was completely frozen over.

Over the next two weeks, the ice finally melted in the Beaufort. In August, it was somewhat ice-free. By mid-September, it had started freezing back and by late-September, the Beaufort satellite picture would have looked much the same as the July 25th one.

But when the news reports were made about the sea ice in 2006, did you hear that story? Did the NOAA radarsat animations of the sea ice show the Beaufort was completely frozen over into August?

No they didn’t.

These visible satellite images give you the TRUE picture. They are actual pictures after all. They have not been modified by a computer software program written by someone in the NOAA who you cannot trust because they have been “born-again” into the AGW religion.

Interestingly, the visible satellite images (ie true actual pictures) cannot be taken by late September because the northern regions of the Beaufort Sea are already in 24 hours of darkness and visible (actual) pictures don’t work. Given that the northern Beaufort is not getting any sunlight at all EVERY DAY by late-September, it is very cold there to say the least.

So thanks to the born-again AGW’ers, the general public has been fed a complete pack of lies about the polar pack ice.

No one is denying that it is very cold up there. There is, however, a reasonably reliable clearing of ice sufficient to allow yachts to transit, with some care and timing, and with a reasonable expectation of making it.

Jeff, anybody who follows this issue is well aware of the melt pattern and timing for the year. You say the Beaufort Sea was wholly frozen over well into August, but I seem to recall a record-setting polynya just to the north of the area covered by the photo you cited. Tell us about that, please. Also, it’s interesting that you would have such faith in government satellite photos given that they can be faked as easily as anything else.

Steve Bloom. It also risks casual readers thinking you actually agree with Jeff.

I’m not asking anyone to believe me.

I am asking you to believe your own eyes.

Steve B. but I seem to recall a record-setting polynya just to the north of the area covered by the photo you cited.

The NOAA heavily-modified computer-generated radarsat pictures did show a large ice-free area north of this Beaufort Sea region at this time. Perhaps warmer water to the north pushed the ice south and kept the Beaufort frozen well into August.

But that just raises three issues:

First: the polar ice pack dynamics are so complex that it is difficult to say what is going on and, given those dynamics, you are taking big risks assuming the Beaufort will be ice-free at the end of July or the NW passage will be ice-free when you sail your ship through it in August.

Second: the same NOAA radarsat animations of the polynya to the north, did NOT show the Beaufort Sea was completely frozen over on July 25, 2006. In this case, I quit believing the polar sea ice animations of the NOAA.

Third: if the polynya was a freak occurrence, then you wouldn’t expect to see the Beaufort frozen over on July 25th every year – 2006 would have been just an accident.

However, here is a satellite picture of both 2005 and 2006 on the same July 25th.

Yup, you guessed it. Frozen over in both years – both years which showed record melting according to the NOAA.

Re #162: Jeff, there’s lots of clear water in the 2005 photo, as the caption on the linked page states. The caption also makes clear why the shore was not clear in the 2006 picture even though sea ice was otherwise at near-record low levels. You need to try to pay more attention to the details. Sometimes they’re important.

I just watched the CNET videos from the AGU conference on the GRACE program. Watkins describes on this video how they precisely measure the distance between the two GRACE satellites to 1 micron. Watkins states that this measurement is performed with K and Ka band radar.

The wavelength of K and Ka band radar varies between 7.5 and 16.7 mm. Typically, minimum spatial resolution is roughly one wavelength. One micron is between one 7,500th and one 16,700th of a wavelength of K and Ka band radar.

If you look at time resolution, electromagnetic radiation travels one micron in 3.34 femtoseconds. One wavelength of K and Ka band radar requires between 25,000 and 55,500 femtoseconds, depending on the frequency used.

Perhaps someone could explain how it is possible to measure distance to an accuracy that is 4 orders of magnitude smaller than the wavelength of the measurement signal.
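The arithmetic in the comment above is easy to check directly. A quick sketch, using only the band-edge wavelengths quoted (7.5 and 16.7 mm), nothing GRACE-specific:

```python
# Check of the figures quoted above: what fraction of a K/Ka-band
# wavelength 1 micron represents, and the light travel times involved.
C = 299_792_458.0  # speed of light, m/s

for wavelength_mm in (7.5, 16.7):          # quoted K/Ka band-edge wavelengths
    wl_m = wavelength_mm * 1e-3
    wavelengths_per_micron = wl_m / 1e-6   # e.g. 7.5 mm -> 7,500
    travel_fs = wl_m / C * 1e15            # ~25,000 and ~55,700 fs
    print(f"{wavelength_mm} mm: 1 micron = 1/{wavelengths_per_micron:,.0f} "
          f"wavelength; one wavelength takes {travel_fs:,.0f} fs")

print(f"light travels 1 micron in {1e-6 / C * 1e15:.2f} fs")  # ~3.34 fs
```

This reproduces the numbers in the comment: the micron target is roughly four orders of magnitude below the wavelength, which is exactly the puzzle being posed.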

RE: #164 – the answer is, those promoting radar as some sort of holy grail are telling lies. The other thing about radar is that measuring a hard, reasonably stable surface is one thing, but measuring a liquid or ice surface that is, in the case of liquid, ripply and wavy, and, in the case of sea ice, irregular, has got to inject further measurement error. Of course, those touting radar are not speaking to you and me; they are speaking to the non-technical masses, who hear “radar” and assume it’s a whiz-bang, perfect method. A scientist said it, therefore it must be true!😉

Re 164: Perhaps someone could explain how it is possible to measure distance to an accuracy that is 4 orders of magnitude smaller than the wavelength of the measurement signal.

Standard radar measures range based on the round-trip travel time of an RF pulse. In GRACE (or more precisely HAIRS – the High Accuracy Intersatellite Ranging System), 24 and 32 GHz signals are transmitted between the satellites. After the receiver, the “… linear combination of the sum of the phase measurements at each frequency gives an ionosphere-corrected measurement of the range change between the satellites.” Details here.
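The ionosphere correction mentioned there works because the ionospheric delay scales as 1/f², so a linear combination of ranges at two frequencies cancels it to first order. A generic sketch of that standard dual-frequency trick (my own illustration, not the actual HAIRS processing; the 200 km separation and the delay constant are made-up numbers):

```python
# Generic dual-frequency ionosphere correction: the dispersive delay
# scales as 1/f^2, so combining ranges measured at two carrier
# frequencies cancels it to first order.
def ionosphere_free_range(rho1, rho2, f1, f2):
    """Ionosphere-free combination of two ranges rho1, rho2 (metres)
    measured at carrier frequencies f1, f2 (Hz)."""
    return (f1**2 * rho1 - f2**2 * rho2) / (f1**2 - f2**2)

F1, F2 = 24e9, 32e9            # the two GRACE carrier frequencies
true_range = 200e3             # hypothetical intersatellite range, m
delay_const = 4e18             # made-up dispersive term: delay = const / f^2

rho1 = true_range + delay_const / F1**2   # range seen at 24 GHz (~7 mm long)
rho2 = true_range + delay_const / F2**2   # range seen at 32 GHz (~4 mm long)
corrected = ionosphere_free_range(rho1, rho2, F1, F2)
print(abs(corrected - true_range))        # ~0: first-order delay removed
```

Each single-frequency range is off by millimetres in this toy example, but the combination recovers the true range to well below a micron.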

Don’t be too quick now. It’s entirely possible to be much more accurate than a wavelength or half wavelength. It requires using sensitive interference techniques, and I don’t think it’d be much help in getting rid of the problems of measuring the exact height of a satellite from the ocean surface, but it can be done.

For actual surface height measurements you have to average a great many measurements, and that’s tough to do if there are any systematic biases in the measurements.
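To make the interference point concrete (illustrative numbers of my own, not GRACE specifications): carrier-phase tracking turns a phase measurement into a range change, so the achievable range resolution is set by how finely the phase can be tracked, not by the wavelength itself.

```python
# Range change from carrier phase: delta_r = wavelength * delta_phi,
# with delta_phi expressed in cycles. Tracking phase to a small fraction
# of a cycle therefore resolves far less than one wavelength.
wavelength_m = 9.4e-3        # ~32 GHz carrier, roughly 9.4 mm
phase_res_cycles = 1e-4      # assume phase tracked to 1/10,000 of a cycle
range_res_m = wavelength_m * phase_res_cycles
print(range_res_m)           # ~9.4e-7 m, i.e. on the order of a micron
```

Under that (assumed) phase resolution, micron-level sensitivity to range *change* is plausible even though the wavelength is millimetres, which is consistent with the point about range change rather than absolute range.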

RE: Sea Ice – for unbiased info on Sea Ice conditions in the Chukchi, Bering and Beaufort Seas, the Anchorage NWS has an Ice Desk link on their main page that is outstanding. I looked at it nearly daily over the past year. What I found was that even the less AGW biased Cryosphere Today (satellite / computer rendered) images seemed to underreport actual coverage.

On the specific topic of NW Passage conditions, this past summer one would have had quite a challenge getting through. The issue was the area north of Yukon and NWT, where the ice persists the longest in most years. While the Bering Sea opens up and you will get ice-free area expanding out from there, what typically ends up happening is that the ice edge ends up in a wide curve from just east of the US – Canada border to a point NNE of there, then slowly up and over toward the Eastern Hemisphere. In 2005, this wide “bay” in the ice extended a bit further east and north than it did this year. In the odd conditions of 2004, there was indeed for a short while ice-free area all the way over to the Canadian Archipelago, but that is a once-in-many-years event. Far more typical would be years like 2006 – in fact, a very typical year in all respects, comparing in terms of extent and overall shape of the ice mass with a number of previous years. I urge a visit to the NWS Anchorage page.

Current relative clock accuracy for GRACE is below the 100 ps mission requirement, as supported by the clock overlap statistics with typical values of 5-10 ps. Related accuracy of orbital positions is at the 2-3 cm level and is supported by independent measurements of position accuracy using K-band range and SLR. The median K-band range minus the GPS-determined range is 1.8 cm and is sampled every 5 seconds over the entire data set. High-elevation SLR range minus GPS-determined range is at the 2.5 cm level for GRACE A and the 3.5 cm level for GRACE B.

Yes, current GPS technology has improved accuracy significantly. Carrier-Phase Enhancement (CPGPS) provides accuracy of 20 to 30 cm, which equates to a timing error of 0.7 to 1 nanosecond. Relative Kinematic Positioning further improves accuracy to 10 cm, a timing error of 0.33 nanoseconds. My Garmin iQUE has DGPS and WAAS. It provided 1 to 3 m accuracy (a timing error of 3.3 to 10 nanoseconds).

These enhancements do not bring GPS to the accuracy claimed by Watkins of 1 micron, or a timing error of 3.34 femtoseconds.

Re: 172,

Paul,

The GRACE clocks are very accurate at 5 – 10 picoseconds; however, this is more than 3 orders of magnitude less precise than would be needed to locate the satellites to 1 micron accuracy.

I don’t think they claim to know the absolute position of the satellites within 1 micron, only the distance between the satellites (which is measured with the microwave ranging system). I believe the ranging system can probably see a 1 micron difference in distance, but I have my doubts as to the absolute accuracy of the distance measured.

#173, Brooks, you’re correct, the 5-10 picosecond accuracy only works out to 1.5 – 3 mm. Even before I posted above, I couldn’t figure out the micron claim either. It’s probably a misprint. It would require very careful interferometry or a much better clock of the kind that can’t be flown plus a large number of corrections for all sorts of tiny effects.

This discussion neither proves nor refutes Steve S.’s assertion. Steve may still have firsthand evidence supporting his conclusion as to promoters of radar.

Now if you’d like to add something to the discussion about how accurate and precise inter-satellite ranging can be then please chime in. Otherwise you’re doing nothing but reinforcing the image you’ve created for yourself on this forum.

Sadlov is criticising this as if it were radar – he clearly does not know what GRACE is, he conflates it with surface radar, and he takes a claim from this project, which is not surface-sensing radar, as evidence that the radar scientists are lying.

BTW, GRACE is not GPS either.

Subsequent posts show that one can measure distances to smaller-than-wavelength precision using interferometry, and when I point this out to Sadlov and wonder whether he will retract the charge of lying, you come barrelling in defending the clearly absurd argument Sadlov made, from what clearly must be ignorance SINCE THIS IS NOT A SURFACE-SENSING RADAR EXPERIMENT, and you talk about my image here?

Y’all are blasting criticisms at experiments WHEN YOU DON’T KNOW WHAT THE FRICKIN EXPERIMENT IS!!!!!!

The link that I had in my post (164) is a video of Michael Watkins, a GRACE Project Scientist, at the AGU meeting. He clearly states that the two GRACE satellites use K and Ka band microwave transmissions to determine their distance from each other to an accuracy “of about one micron.” He then went into detail describing what a micron was. This was not a slip of his tongue.

My question was “how is this possible?” There are two related issues here.

One is how one measures 1 micrometer with microwaves having wavelengths of 7.5 to 16.7 millimeters. K band’s wavelength runs from about 11 to 16.7 mm and Ka band’s from about 7.5 to 11 mm. There are about 4 orders of magnitude between one micron and the shortest K- or Ka-band wavelength. Even a very sophisticated interference algorithm would have a tough time resolving 1/10,000 of a wavelength.

The other issue is how does one measure the very small time difference that it takes light to traverse one micron. This time is 3.34 femtoseconds or 0.00334 picoseconds. The GRACE clocks are accurate to 5 – 10 ps, so how can they measure a time duration that requires 3 orders of magnitude more accuracy? Remember that you need a clock that is more accurate than 3.34 fs, because you need a signal to noise ratio that is at least 2:1 or 3:1. Therefore, you need a clock accurate to between 1 and 2 fs.
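The timing arithmetic above is easy to verify; the only inputs are the speed of light and the figures quoted in the thread:

```python
# Light travel time over 1 micron vs. the quoted GRACE clock accuracy.
c = 299_792_458.0            # speed of light, m/s

t_micron = 1e-6 / c          # time for light to cross 1 micron
print(f"{t_micron * 1e15:.2f} fs")   # ~3.34 fs, as stated above

# Distance equivalent of the 5-10 ps clock accuracy
for t_ps in (5, 10):
    d_mm = t_ps * 1e-12 * c * 1e3
    print(f"{t_ps} ps -> {d_mm:.1f} mm")   # ~1.5 mm and ~3.0 mm
```

This confirms both numbers used in the discussion: 3.34 fs for one micron, and 1.5 – 3 mm as the distance equivalent of the 5 – 10 ps clocks.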

If you are going to measure something with a duration of a few femtoseconds, you need detectors that operate that fast and a processor which can accept data at a very high rate.

As the satellites orbit the Earth, gravity field variations cause minute changes in the distance between the two. These changes are measured with unprecedented accuracy by the instruments aboard GRACE, leading to a more precise rendering of the gravitational field than has ever been possible to date.

Therefore, the accuracy of the gravity maps is directly related to the measurement accuracy of the distance between the two satellites. I found references to positional accuracy in the range of a few centimeters, but I cannot determine how they measure one micron accuracy in the distance between two satellites which are separated by 220 km.

Brooks – they determine position by first using GPS to get the rough position, and then using radio interferometry to refine that position.

I also looked for an explanation of the precise interferometry procedure, and could not find it. But I know the procedure is working at least at a reasonable level – those videos show the gravimetric effect of monsoon rainfalls, for example. And while acknowledging that it would be good to see exactly what the procedure is, I also note that anyone proposing an expensive satellite project dependent on that level of precision would have spent time during funding and planning justifying that it was possible.

I certainly don’t jump, as Sadlov did, directly to using that as an excuse to accuse some other scientists using a different methodology of being liars.

Are you trying a reductio ad absurdum proof of your trollishness? You got support for your position from people, including myself, who normally disagree with you. Does this satisfy you? No! You have to snarkishly jump on Steve Sadlov, asking him to bow and scrape before you. Meanwhile, you’re jumping on Steve M just for presenting his impressions about how Al Gore’s speech went. You might want to consider how this conjunction of situations makes you look. As rocks might say, “watch out for that wayward amb.”

the accuracy of the gravity maps is directly related to the measurement accuracy of the distance between the two satellites. I found references to positional accuracy in the range of a few centimeters, but I cannot determine how they measure one micron accuracy in the distance between two satellites which are separated by 220 km.

Answer is: you can’t, if the accuracy sought is much smaller than the wavelength of the EM waves. Since these microwaves have wavelengths in the centimeter range, centimeter accuracy is all you’d get – certainly not micron accuracy.

You can’t resolve two time separated signals sub-wavelength (since that would require a bandwidth greater than the operating frequency, which is difficult to achieve). But you can measure a single signal source sub-wavelength, and you don’t even need interferometry – you just measure the phase of the signal.

I’m a little doubtful about the 1 micron claim though. It is extraordinarily difficult to achieve this in an RF system. I would expect the phase shift associated with any change in temperature of any of the RF components would induce path length changes of greater than 1 micron, so short of having temperature compensation on every single component (and I’m not convinced by that), I suspect the 1 micron is the number of decimal places in their measurement, rather than the accuracy.

A bandwidth greater than the operating frequency is more than just difficult…

Also, another important point is that using phase gives an accurate measurement, but with great ambiguity. They may be able to measure fractions of a wavelength, but will not be able to resolve multiples of a wavelength (unless there is some encoding in the waveform, which there does not appear to be). Presumably the “coarse” ambiguity is resolved separately, and then fine variations are tracked at the measurement update rate.
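The fraction-vs-multiple ambiguity is easy to demonstrate numerically. A toy model, assuming the ~9.4 mm wavelength of the 32 GHz carrier quoted elsewhere in the thread; a real system tracks phase continuously rather than taking isolated snapshots:

```python
import math

WAVELENGTH = 9.4e-3   # ~32 GHz Ka-band carrier, m

def measured_phase(distance_m):
    """Phase of the received carrier, radians in [0, 2*pi)."""
    return (2 * math.pi * distance_m / WAVELENGTH) % (2 * math.pi)

d1 = 220e3                     # nominal 220 km separation
d2 = 220e3 + 3 * WAVELENGTH    # three whole wavelengths farther
d3 = 220e3 + 1e-6              # one micron farther

# Whole-wavelength offsets are invisible to a phase measurement...
assert math.isclose(measured_phase(d1), measured_phase(d2), abs_tol=1e-6)

# ...but a 1 micron change shifts the phase by a small, fixed amount
delta = measured_phase(d3) - measured_phase(d1)
print(math.degrees(delta))   # ~0.038 degrees for 1 um at 9.4 mm
```

The 0.038-degree shift per micron is what the fine tracking has to resolve; the integer number of wavelengths has to come from somewhere else, exactly as the comment says.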

You can’t resolve two time separated signals sub-wavelength (since that would require a bandwidth greater than the operating frequency, which is difficult to achieve). But you can measure a single signal source sub-wavelength, and you don’t even need interferometry – you just measure the phase of the signal.

You can measure it sub-wavelength if you have long enough to make repeated measurements without either satellite moving, but since both are in continuous motion in a non-uniform gravitational field (and, as you note, temperature change is a source of error), there’s no chance of measuring sub-wavelength.

Brooks, Sadlov used this to claim that the radar people are lying, and then went into the difficulties inherent in radar mapping of sea-surface elevation. This is not a radar mapping experiment – therefore he didn’t know what he was criticizing.

You argued that GPS can’t get this level of accuracy. GPS is not what is claimed to give them this level of accuracy; therefore you did not (at least then) know what you were arguing about.

Dardinger, I could give a s**t less about Sadlov. I do care when anyone hurls unfounded accusations that others are lying. I am calling Sadlov on what he said. I’m not asking him to bow and scrape, I’m asking him to be honest enough to withdraw an unfounded bald accusation that the ‘radar scientists’ are lying outright, when the claim in question wasn’t even from those scientists, and the only evidence that the people who did make the claim can’t live up to what they are claiming is that y’all can’t figure out how they are doing it.
I see a couple of people here saying that yes, ’tis plausible perhaps – but others still saying no way – and Sadlov’s intellectually abhorrent bald unsupported claim of lying is still in play.

I agree that you can measure sub-wavelength, that’s done all the time. The question is how far “sub” you can go. They are saying that they can measure to about a ten-thousandth of a wavelength, which seems quite problematic.

However, from the changes shown in the video due to the rain falling on the earth, it does seem that they’re measuring something … I did have a question about that, though. Since the water they’re measuring is not created or destroyed, where is it when it’s not showing up on their gravity maps?

A curiosity about their animation of the results is that there is no signal of the megatonnes of snow and ice that shifts from pole to pole every year. In fact, there’s very little signal of snow and ice at all, which seems very strange – liquid water flows and evaporates away, while solid water hangs around for months. Why is there no high latitude signal in their results?

Also, there is no signal of the wind-driven seasonal change in the weight of the ocean along the earth’s coastlines. This change is large enough to drive volcanoes, so it’s not a negligible effect and should be visible by GRACE.

However, they may have filtered out those signals to emphasize the annual rainfall changes.

Me, as befits my skeptical nature, I’m withholding judgement on whether they can measure to a 1 micron accuracy until further facts come to light.

Lee, Lee, Lee! You don’t care about him, but you do care about him. If someone you care nothing about says something you think absurd, the wise man ignores what that person says. He does not waste the time of people he might care something about by involving them in a worthless discussion.

To put it more plainly, if you keep this up I’ll put you in the category of those I care nothing about.

The video clip described the use of K and Ka band microwaves to measure the distance between the GRACE satellites. My question was how does one measure a distance that is one ten thousandth of the wavelength of the EM used?

Whether satellite location is determined by GPS or some other means, it is a matter of accurate timing. I pointed out that measurement of one micron requires femtosecond accuracy. How is this possible with clocks which are only accurate to 5 to 10 picoseconds?

I don’t think the clocks are used for the fine positioning. The key to this would be the phase stability of the frequency source (what they call the “Ultra-Stable Oscillator”). This would usually be described as a ratio. Of course, there are then a whole load of practical issues to consider (e.g. of the sort discussed in 187-189).

“You can think of them as two automobile-sized objects, one of them in Los Angeles and the other in San Diego, and they’re measuring the distance between each other to about the size of a red blood cell,” said Dr Michael Watkins, the Grace project scientist at Nasa’s Jet Propulsion Laboratory.

Now he is saying that the accuracy is closer to 10 microns. Red blood cells are about 8.5 microns in diameter. Link.

OK, GPS uses satellites to provide location information on the earth by, wait for it, measuring the range from the satellite to the receiver. The fundamental question posed by Brooks in #164 was:

Perhaps someone could explain how it is possible to measure distance to an accuracy that is 4 orders of magnitude smaller than the wavelength of the measurement signal.

Whether the frequencies used are radio or microwave, the basic ranging methodology of radar is limited to, at best, around 1/4 of the wavelength.

So, simply measuring the travel time isn’t going to get you three orders of magnitude improvement in precision. The carrier-phase method greatly improves the travel-time resolution. That was the point I made earlier. Whether it is used for ranging between satellite and ground observer or ranging between satellite and satellite is irrelevant. And here’s some news: precise inversion of gravimetry measurements requires precise altitude information. Hence the discussion about location (GPS, intersatellite ranging, etc.) with respect to the gravimetric observations being performed by the GRACE satellites.

Here’s a suggestion Lee. When you find yourself resorting to ALL CAPS or use of asterisk-censored swear words, it is time to reconsider why on earth you are even posting. You are then either way in over your head or obviously banging said head against a brick wall. Guess what, no one else sees the world through your filter. When you recognize the significance of that I believe your contributions to CA will be much more greatly valued.

You either need a clock with femtosecond accuracy, or a very stable frequency source. With a very stable frequency source, I can mix in the analogue domain and then down-convert, so I don’t need fast digital or clock components.

As an example, imagine I have a perfect 32 GHz reference, and I’m measuring a 32 GHz signal. By mixing these two signals, I will generate a number of mixer products, one of which will be at DC. If I use a low-pass filter to reject the high-frequency components, I then just need to measure the DC level to determine the relative phase of the signals. If the received signal is 90 degrees out of phase, for example, the DC level will be zero.

If I move my unit by 0.001 of a wavelength, so that the received signal takes slightly longer to arrive, the phase of the incoming signal will change, and the mixer products will change with it. (32 GHz has a wavelength of 9.4 mm, so 0.001 of a wavelength is around 10 microns.) The incoming signal will see a phase change of 0.36 of a degree, which will change the DC level by sin(0.36 deg) = 0.6% of the incoming signal.

(They actually down-convert to something other than DC, since DC can be problematic, but this is more of a practical implementation issue than a theoretical issue)
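The idealized DC-mixing scheme described above can be simulated directly. This is a toy model with perfect, noiseless carriers, using plain averaging as the low-pass filter; the sample rate and sample count are arbitrary choices, not anything from the GRACE design:

```python
import math

F = 32e9                     # carrier frequency, Hz
FS = 512e9                   # simulation sample rate (16 samples/cycle)
N = 1 << 16                  # samples to average (whole number of cycles)

def dc_after_mixing(phase_rad):
    """Mix a received carrier with a reference and low-pass by averaging."""
    acc = 0.0
    for n in range(N):
        t = n / FS
        ref = math.cos(2 * math.pi * F * t)              # local reference
        sig = math.cos(2 * math.pi * F * t + phase_rad)  # received signal
        acc += ref * sig
    return acc / N          # averaging kills the 2F term, leaving 0.5*cos(phase)

# A path-length change of 0.001 wavelength -> 0.36 degrees of extra phase
dc_0 = dc_after_mixing(math.radians(90.0))    # quadrature: DC level ~ 0
dc_1 = dc_after_mixing(math.radians(90.36))
print(dc_0, dc_1 - dc_0)   # shift ~ -0.5*sin(0.36 deg) ~ -0.003
```

The mixer output is 0.5·[cos(φ) + cos(2ωt + φ)]; averaging over whole cycles removes the second term, so the residual DC level tracks the phase, which is the sub-wavelength readout the comment describes.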

There are a number of potential problems with this scheme. The first is that the frequency sources will not be exact sources, and their relative phase is unknown. The second is that the frequency sources will drift with time. These are resolved by making the measurement both ways and summing in a way to cancel the errors, and making the measurements at the same time by using the on board clocks. No cancellation process is perfect, and without better specifications (which I haven’t stumbled across yet), it is difficult to judge exactly what rejection this cancellation will achieve.
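The both-ways cancellation can be shown in a toy algebraic model of dual one-way ranging. The key assumption, made explicit here, is that each oscillator's unknown phase offset enters the two one-way measurements with opposite signs; real GRACE processing also has to handle frequency offsets, drift, and light-time corrections:

```python
import math
import random

WAVELENGTH = 9.4e-3          # ~32 GHz carrier, m
d = 220e3 + 5e-6             # true range: 220 km plus 5 microns (example)

random.seed(1)
theta_a = random.uniform(0, 2 * math.pi)   # unknown oscillator phase, sat A
theta_b = random.uniform(0, 2 * math.pi)   # unknown oscillator phase, sat B

# One-way phase observed at each end: each satellite compares the other's
# carrier to its own, so the clock offsets enter with opposite signs.
phi_at_a = 2 * math.pi * d / WAVELENGTH + (theta_b - theta_a)
phi_at_b = 2 * math.pi * d / WAVELENGTH + (theta_a - theta_b)

# Averaging the two one-way phases cancels the oscillator offsets exactly
dowr = (phi_at_a + phi_at_b) / 2
recovered = dowr * WAVELENGTH / (2 * math.pi)
print(abs(recovered - d))    # residual at float-precision level only
```

In this idealized model the cancellation is exact; in practice, as noted above, it is only as good as the symmetry of the two paths, which is why the uncancelled effects (temperature-driven path-length changes inside each satellite) are the interesting error sources.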

As mentioned above, whilst drift in the oscillator is accounted for, I’m not clear that other effects (such as path-length changes due to temperature) are. These paths effectively become part of the distance between the two satellites, so I don’t think they will cancel. Unless these paths are exceptionally short, they will change with temperature.

I would also question exactly what point on the satellites is being measured. Presumably they would claim (effectively) the centre of the antenna. I’m not sure how relative orientation variability would feed into this (since the antenna may not be at the centre of gravity, or at the centre of rotation of the satellite).

If you have the two satellites on the same orbit, they are in theory “locked” together and their relative position is near constant. What tricks could one perform by making some assumptions about the ‘fixed’ distance? Does anything come to mind?

Earle,
The GRACE program measures gravity based on the relative movement of the two satellites with respect to each other. The accuracy of the distance measurement is the key to the entire gravity measurement program. Any “tricks” would be unlikely to reduce the error.

What Spence_UK described is a way of obtaining good accuracy for the measurement between two satellites with the right equipment. This will not work for the GPS satellites since they are not set up for such precision ranging. The best GPS, as I said above, is good to a few centimeters. This is well within the capability of the GRACE satellite clocks.

The issue that I asked about is the feasibility of accurately measuring the distance between two satellites to within a few microns as Watkins said at the AGU meeting last week. He said one micron on the video clip and implied roughly 10 microns in the Beeb quote.

– The KBR temperature “has to be controlled to 0.2K” to reduce measurement errors. If that means they maintain the temperature of the KBR at some temperature ± 0.2K, that’s pretty tight.


In combination with the sub-mm intersatellite distance observed by the k-band ranging system (KBR) and the accurate satellite position measured by the onboard GPS receiver, the Earth’s gravity field can be deduced with unprecedented accuracy.

I suppose this answers the question about the use of microwaves for distance measurements.

– The pointing requirements of the antenna/spacecraft are less than 1 mrad.

– The KBR “measures the dual one-way range change between both satellites with a precision of about 1 µm per second.” Not sure of the implications of this, but it would certainly make it easier to do, specifying it as an allowable rate of change of the measured distance.

I would like to see the error analysis for the entire system. There are a number of variables which we have discussed in this thread. Each variable has errors associated with it. Distilling all the errors down to 1 micron per second is a whole lot different than one micron. If they use interferometry to measure to, say, ~1/100th of a wavelength, it will take about 2 minutes for a change in speed of 1 micron/second to be measurable.

During this time, the satellite will have moved some 800 km, which is likely why the areas of extra gravity are so blobby and ill-defined in the video. You’re trading accuracy for horizontal resolution. Seems theoretically possible, although you’re talking about a measurement that is accurate to about one part in 10^12, which is hard to do even on the earth, much less in space. However, it’s hard to argue with results. The GRACE satellite is able to pick out the mountains, continents, and mid-oceanic ridges with good accuracy, so they gotta be doing something right …
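The back-of-envelope numbers in the comments above can be checked directly. The 1/100-wavelength phase resolution and the ~7.5 km/s orbital speed are assumptions for illustration, not GRACE specifications:

```python
# Integration time needed to see a 1 micron/s range-rate change, and the
# ground distance the satellite covers in that time.
wavelength = 9.4e-3          # Ka-band carrier, m
phase_resolution = 1 / 100   # assume ~1/100 of a wavelength is resolvable
range_rate = 1e-6            # 1 micron per second, m/s
orbital_speed = 7.5e3        # typical low-Earth-orbit speed, m/s

detectable_change = wavelength * phase_resolution    # ~94 microns
t_needed = detectable_change / range_rate            # time to accumulate it
ground_track = orbital_speed * t_needed              # distance flown meanwhile

print(f"{t_needed:.0f} s, {ground_track / 1e3:.0f} km")  # roughly 1.5 min, ~700 km
```

With these assumptions the answer comes out near the "about 2 minutes" and "some 800 km" figures quoted in the thread, which is consistent with the blobby horizontal resolution of the gravity maps.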

Seems theoretically possible, although you’re talking about a measurement that is accurate to about one part in 10^12

Most of this ties in with what I would expect. I found a spec sheet on the USO (ultra stable oscillator) and it is quoted as being stable “on the 1 second to 100 second scale” to approx. 10^-13. This ties in with the estimates. Most of the concerns I highlighted (temperature drift etc.) have clearly been considered in the link provided above by BKC.

This doesn’t surprise me; RF comms, radar and ranging are a “hard” science. Although parts of RF (especially around microwave frequencies and higher) are still a bit of a black art, it is quite possible to do real tests and measurements (unlike in climatology), so the analyses tend to be reliable.

What is likely to be less reliable is the stretching of interpretations based on near-the-noise-limit measurements from the satellite.

Perusing Google links turned up a version of the GRACE Newsletter. Regarding the accuracy of the KBR ranging:

Figure 7 demonstrates the quality of the range data from the KBR system. The dual-frequency (K and Ka band) ranging data from the two satellites are linearly combined to produce a quantity that is free of the time-of-flight or the range changes between the two sensors. Thus this residual quantity reflects the noise in the tracking system, or phase changes that are not desired range-change signals; and these residuals are shown to have a standard deviation of a few microns.

leaving behind rev. #2 of the Bürger and Cubasch verification paper at CPD:

Consider, for example, their use of truncated “Total Least Squares” (TLS). This is not an optimal approach to regularizing TLS. Under the assumption of homogeneous relative errors in the standardized data as discussed above, an optimal regularization of TLS leads directly to ridge regression, and not truncated TLS, which is indeed the reason Schneider(2001) employed ridge regression in the RegEM algorithm.

As mentioned in Comment 52, I wrote a complaint to the BBC news website regarding their selective reporting of the meeting. I’m not too convinced by the ‘science’ journalist’s comments, but at least he made the effort to reply:

“Peter,

It is indeed a huge meeting with very many presentations – oral and poster; some of which have been peer reviewed and some which have not. As a guide on deciding what to cover, I prefer to go with material that has been seen and approved by the widest number of experts in their field. That is to say, it is not one AGU poster by one researcher, but material that is representative of a wider input of expertise.

The material to which you refer was presented to journalists by a group that represents the largest collection of Arctic researchers in the US – it is called ARCUS. http://www.arcus.org/ Further, it reflects part of the new NOAA synthesis document “State of the Arctic 2006”: http://www.arctic.noaa.gov/soa2006/

If there was at the AGU an opposing view – built upon an equally large body of peer-reviewed work put together by a separate but equally prestigious set of individuals – I’m afraid I missed it.”

“I complain about your selective bias, which directly undermines your policy of impartiality and balance. You led your main news website this morning with a story about potential Arctic ice melting. This was based on one single presentation at a meeting containing 100s of presentations. To simply pick a story with a scary headline is not ‘science’. The AGU meeting contains scientific debate, with presentations from across the spectrum of the ‘global warming’ debate. While many politicians and policy-makers are claiming there is no longer a ‘debate’, as a scientist myself, I can assure you there still is, and the AGU meeting is an example of that. It is not for the BBC to ‘choose’ the truth, but to report on the arguments in a balanced manner.”