NASA Gavin Schmidt Joins Call for Climate Data Transparency

h/t Dr. Willie Soon – NASA GISS head Gavin Schmidt has voiced his support for EPA director Scott Pruitt’s call for more climate research transparency, though Schmidt is concerned that providing enough data and method to ensure reproducibility will distract scientists from research.

Climate scientists call for more transparency in their field

Scott Waldman, E&E News reporter
Published: Thursday, May 10, 2018

…

Making data available is part of publishing in the modern era, and there need to be better methods for verifying that the results of a study are statistically valid, said Rich Loft, director of the technology development division at the National Center for Atmospheric Research.

“In the age of big data, journal publications which would have been suitable a hundred years ago [are] not suitable anymore because it’s not actually enough information to reproduce the thing, so somehow we have to extend our definition of peer-reviewed into these analyses,” he said.

One of the challenges faced by researchers trying to make their work more transparent is the complexity of dealing with a vast amount of data, said Gavin Schmidt, director of the Goddard Institute for Space Studies at NASA. In addition to storing the data, researchers must make the coding used to synthesize it available, he said. In the science community, reproducibility often consumes a lot of time that doesn’t always have a clear benefit to the individual researcher, other than altruism, he said.

“Reproducibility is not free, it has a cost to the community because if you’re always spending time reproducing scenarios, experiments that other people have suggested are interesting, then you’re not exporting something that you thought was interesting,” he said. “So there is a cost to the community, but the benefit is of course understanding how robust particular results are.”

LOL, you mean this comes from the same Gavin who would not sit on the same TV set with Dr. Spencer at the same time? Given his past actions, he is obviously concerned about his longevity in his current position. Somebody cue the videos!

He could possibly have heard that science that’s not reproducible is not really science. But what he says about the cost is somewhat incoherent.

“In the science community, reproducibility often consumes a lot of time that doesn’t always have a clear benefit to the individual researcher, other than altruism, [Dr. Gavin Schmidt] said.”

Reproducibility means the results scientists claimed are actually valid and will reproduce when tested. We are not talking about benefit to the researcher; we are talking about whether the science holds.

To allow reproducibility, scientists must document their set-up and publish that documentation. Knowing whether their work is ‘robust’ is a must, not something that merely ‘comes with a cost’. Be aware that the administration that pays your salary wants to know what that ‘cost’ you’re talking about actually bought. If you’re making shit that doesn’t reproduce, it might come as a surprise to you that the administration doesn’t want to cover the costs of that. In fact, if a large body of non-science in a NASA office is found non-reproducible, it rather suggests the people responsible were attempting fraud, and an investigation should be started. Not that it is easy, because many of those people have good friends in the administration ready to stop any attempt to enforce transparency.

Hugs … what struck me in that sentence was “benefit to the researcher” … and then he goes on to say the researcher spends all his time reproducing experiments.

That’s just warped. FIRST, the data and methods are provided so that OTHER researchers can reproduce and validate your work, not the original researcher … and SECOND, validation by other scientists certainly IS a benefit to the researcher, both original and replicating.

Oh, like what happened to Spencer and Christy. When they did not follow the Global Warming gospel, they were out; I wonder how much Schmidt and Hansen had to do with that. Oh, by the way, the last EPA administrator said openly that a Climate Change Denier had no place in the government. Funny how leftists always kill dissent and dissenters, both figuratively and literally.

Definitely. If the government is funding a study, it should be available online, with all the supporting data also available. If the study is used as support for a policy, transparency requirements should be rigorous.

I think it will come as a shock to many of the younger researchers that public funding means they really, really work for the public, not their private selves for their private gain in career and fortune. If the public funds an experiment then we want to watch. If I am watching and understand how the experiment works, I am free to interpret the results as to meaning and import.

We don’t spend much time on the philosophy of science but there is a clear difference between the result of an experiment and the interpretation of that result. Interpretation involves the exercise of ‘authority’ in a great many cases.

When fundamental errors are included in the interpretation, such as the conceptual error in the feedback calculation pointed out by Monckton, or the comparison of the temperature of a planet with a GHG atmosphere to one with no atmosphere at all instead of to one with a non-GHG atmosphere, the correct answer to questions about the impact of CO2 enhancement will evade them.

Show us the inputs and data and the formulas. We understand what a first order approximation is. And a second order. Let us check the work and validate it.

I think for government-funded studies, where politicians’ fingers are so close to the pie, a rule imposed by the director is completely insufficient. The next director could eliminate it or handicap it.
It should be enshrined in the foundational charter or legislation. Much more difficult to change. Our democracies need to begin to put some boundaries on the politicians to keep them from co-opting the apparatus of government for their own corrupt ends.

As a former public employee (intern to city manager), anything I touched (files, work product, emails, text messages, computer files), with few exceptions (personnel records, ongoing real estate negotiations, litigation), was subject to Public Records Act requests by anyone, at any time, for any reason. Don’t work for the government or take its money if you do not want to be subject to transparency. Especially if you intend to promote trillions of dollars in policy that impacts hundreds of millions of citizens.

Writing observer: the cost of not showing the methods and data should be ‘no publication’.

When the cost of not following the norms of evidence is ‘no one believes you’, that cost will quickly be absorbed and the necessaries handed over.

At present there is quite a bit of ‘believe me, I am a scientist’. I hold that a scientist knows they will not be believed unless the evidence is physically produced in the court of public opinion.

Suppose a dinosaur expert produced a paper claiming to have found a fossilized humanoid body 86 million years old, overthrowing a number of hitherto accepted hypotheses about the evolution of hominids. And when asked to see the fossil, he replied, “Why should I show you? You’d just try to find something wrong with my interpretation. Only my friends can see it.”

Such people should be guided to the Story Corner at the local library.

Oh Eric, you are being too kind to Gavin Schmidt and his ilk, whom you are kindly referring to as “the Climate Community”. The pathetic excuses he is giving for failing to provide the data needed for reproducibility are revealing. The purpose of preventing reproducibility was to hide the fact that “climate scientists” have no evidence of human-caused warming and, in fact, don’t even remotely know what causes the climate to change.

The responses from the AAAS, the AGU, and the five journal editors (Nature, Science, PNAS, Cell, PLoS) coming out against the EPA’s adoption of data transparency rules were a behind-the-scenes, coordinated effort.
IOW, just a few deep-pocketed puppet masters tugging the strings of those puppet editors and association presidents to make them dance and sing a little jig, because the death of EPA secret science spells deep trouble for them.

Why does it spell deep trouble, one might ask?

It spells deep trouble because Crown Jewel for the Watermelons is the EPA’s CO2 Endangerment Finding … and it was likely built on Secret Science.
And when it gets re-opened, that old Secret Science will not be allowed to be considered under those new transparency rules.

Within this Technical Support Document, on pages 4-5, Box 1.1, one finds this statement regarding the U.S. Climate Change Science Program (CCSP):
CCSP integrated federal research on climate and global change, as sponsored by thirteen federal agencies and overseen by the Office of Science and Technology Policy, the Council on Environmental Quality, the National Economic Council and the Office of Management and Budget. It produced the 21 Synthesis and Assessment Products (SAPs) “that address the highest priorities for U.S. climate change research, observation, and decision support needs”:

“… Global Climate Change Impacts in the United States, which incorporated all 21 SAPs from the CCSP, as well as the IPCC Fourth Assessment Report. As stated in that report, “This report meets all Federal requirements associated with the Information Quality Act, including those pertaining to public comment and transparency.”

It might have met the old “transparency rules” (i.e. lack of), but it likely won’t meet the new transparency rules if the Pruitt EPA can re-open an examination of the Endangerment Finding.

And in any court challenge to an EPA overturning of the CO2 Endangerment Finding, the first thing that will come before the Court is “What science was used?” If the rules the EPA used/uses don’t allow “secret science”, then the battle is probably over if the old finding relied on Secret Science.

P.S. If you really want a good belly-laugh, read pages 2-3, “Section 1(a) Scope and Approach of This Document ” of the above linked Technical Support Document.

joelobryan–Actually some of the Technical Support Document may not be all that secret, maybe snuck in by a real biologist. EPA did have them. Reference is to an IPCC report. Maybe they should require reading assignments.

Box 14.1 Ocean Acidification p134–
“The overall reaction of marine biological carbon cycling and ecosystems to a warm and high-CO2 world is not yet well understood. In addition, the response of marine biota to ocean acidification is not yet clear, both for the physiology of individual organisms and for ecosystem functioning as a whole (Denman et al., 2007). ”

p 7 “Medium confidence — About 5 out of 10 chance ”
I seem to remember something explained about this in a statistics course; it must have changed due to expert opinion. They apparently ignore their own use of the word “uncertainty.” Do different people write different sections?

Does the PM2.5 particulate rule report have similar contradictions that would give it away? Also, does it reference the very large dust storm that blew across Texas during the height of the 1950s drought? I saw it. It was followed later by a very large flood.

Gavin is trying to keep his job, and what little credibility he has left. That said, I’m under the impression that the EPA’s position on transparency applies to research going forward and not retrospectively, which is far more important, imho. Am I missing something?

He is acknowledging an obvious fact that gaining confidence in important results comes at a cost.

If one wants to increase confidence, one must attempt to independently reproduce the reported result. If you want to have little confidence (or don’t care about it) in a reported result, then don’t expend the resources needed to reproduce.

Yes, it has a cost to the community, but how fast do you think he’d flip if we came up with studies that show warming isn’t a problem. He’d want lots of reproduction of results. But results don’t mean anything without reproduction. If he’s suggesting that we skip that step, he is anti science.

Mike’s nature trick exposed the trickery going into some studies. I strongly suspect that the secret science of the past has a bunch of that too. I would like them to review all that secret stuff and see if it really meets real scientific standards.

What “secret stuff” do you want reviewed? How do you know it hasn’t been done already?

If you can’t accept that “Mike’s trick” was perfectly legitimate, how are you going to know what is acceptable science, anyway? How much evidence, and what kind of evidence, do you need in order to trust scientists to do their job?

Nice try Kristi. Another of your patented strawMann arguments. Unless Jeff Mitchell happens to be a climatologist, he or I for that matter would not be the ones advancing science by challenging the methodology. But the assumption that either of us would be hopelessly unable to follow the science is equally wrong.

In a healthy scientific debate, proving an alternative theory or enhancing an existing theory is what earns scientists their credentials. There is a competition of ideas, and there is no such thing as deciding who is right by a consensus vote. It shouldn’t be: go to college, do your postdoc work, and then you get a license to “do your job”, meaning everybody has to just bow to your superior knowledge and expertise.

With apologies to John Lennon, “All we are asking is Give the scientific method a chance”

I expect to see the same thing from many scientists as the wheels start coming off. They will be looking for exit routes, ways to maintain their credibility, at least those with any sense of shame (excluding the likes of Mann and Hansen for example).

I think it is why the IPCC left in some low climate sensitivity estimates, it gives them a ‘well, we always thought this might happen anyway’ excuse.

REPRODUCIBILITY: Scientist delivers the DATA AS USED, and CODE AS RUN, to enable others
to REPRODUCE the CLAIMED RESULT. The results are typically tables, graphs and charts.
You supply your data and your code, and I should be able to get the SAME RESULT using the SAME DATA
and the SAME METHOD.

REPLICATION: I get a sufficiently similar result to yours using the same data and different methods, different
data and the same method, or different data and different methods. For example: you test substance A
on one set of test subjects. I choose different subjects, the same substance and the same methods.
I’ve replicated your result, not reproduced it exactly.

the aim of reproducibility is QA & building foundational tools and data for other scientists.
for example: when Willis gives me his code I can run it to quickly check that he published the
result of the code, and more importantly I can use his code to do my own EXTENSIONS of his
science.

the aim of replication is to test the robustness of the conclusion to changes in data and methods
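The distinction above can be made concrete with a toy sketch. Nothing here is a real climate analysis; the “method”, the data, and the claimed result are all invented purely for illustration:

```python
# Toy illustration of reproduction vs. replication. All numbers and the
# "published method" are hypothetical, invented for this example.

def published_method(data):
    """Stand-in for the CODE AS RUN: here, just a mean."""
    return sum(data) / len(data)

# The DATA AS USED (hypothetical values).
data_as_used = [14.1, 14.3, 14.2, 14.5, 14.4]

# The result the paper claims (hypothetical).
claimed_result = 14.3

# REPRODUCTION: same data + same method must give the SAME RESULT.
reproduced = published_method(data_as_used)
assert abs(reproduced - claimed_result) < 1e-9, "does not reproduce"

# REPLICATION: new data (and/or a different method) should give a
# sufficiently similar result, showing the conclusion is robust.
new_data = [14.0, 14.4, 14.3, 14.6, 14.2]
replicated = published_method(new_data)
assert abs(replicated - claimed_result) < 0.2, "conclusion not robust"
```

Reproduction demands essentially exact agreement with the claimed result; replication only asks whether the conclusion survives new data or new methods.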

Wow, thanks. And the caps are a real help. But we know this already. Why not send it to your climate friends, with a little note explaining how not doing these things means your work can be, and should be, ignored?

“Some critics have pointed out that Gavin Schmidt’s friend and colleague Michael Mann never disclosed full details of how he produced his iconic climate hockey stick.”

huh. what is the point of that?

I might point out that Scafetta, a friend of Anthony’s, refused to give his data to McIntyre.
I might point out that Monckton, a friend of Anthony’s, refused to share his code.
to what end?
In fact the only people to deny me code or data, successfully deny me, are skeptics.

heck, even McIntyre refused to share the data he used for Watts 2012 when they published that on the web.

Does any one want to criticize Willis or Anthony because their ‘friends’ did something wrong?

FFS, y’all ever hear of the concept of individual responsibility?

Mann is responsible for Mann
gavin is responsible for gavin.

there is no point in holding one guilty for the sins of another.

Encourage open data and open code. demand it of everyone. praise those who come to the right side of the issue and move forward.

Bill:
“If Gavin and co. see an issue with their past, or hold the view that this could make more people agree with their view, then all power to them.”

Why not just take it at face value and accept that scientists have integrity? This is in the best interest of scientific advancement, it’s not some political ploy. Schmidt has suggested this before, encouraging others to make their models available.

Pruitt, on the other hand, is not acting in the nation’s best interest. There are data that cannot be shared due to confidentiality agreements, and not being able to use these in policy decisions is counterproductive. The scientific community is already handling its problems and doesn’t need the government to step in. When Pruitt is cutting important research programs that are in place and gathering data, he has no business adding bureaucratic red tape that will cost millions to oversee.

Why not just take it at face value and accept that scientists have integrity?

Seriously? In Kristi’s world, Scientists are some kind of race of angels apparently. They do not respond to incentives such as the need to get a grant from political organizations that already know what answer they want. They have never been known to, I don’t know, try to “hide the decline”. They do not care a whit about disincentives like losing their grant money if they give the wrong answer.

Kristi, unlike you, we don’t accept things at face value just because somebody told us to.
That’s why we are called skeptics. As opposed to true believers.
We want you to prove it. Anyone who has to hide their data is proving that they can’t be trusted. We have 40 years of experience with the same people that proves that.

Confidentiality agreements. The second oldest excuse.
If a study can’t be replicated, then it must be regarded as disproven. It doesn’t matter why it can’t be replicated.

“Mr. Mosher knows my email, and has my telephone number, and mailing address, and so far he hasn’t been able to bring himself to communicate his concerns to me directly, but instead chooses these potshots everywhere.”

So, did you ever communicate directly with him Mosh? you have his email, telephone, and mailing address so surely 6 years is plenty of time to dash off an email or make a phone call!

““Mr. Mosher knows my email, and has my telephone number, and mailing address, and so far he hasn’t been able to bring himself to communicate his concerns to me directly, but instead chooses these potshots everywhere.”

So, did you ever communicate directly with him Mosh? you have his email, telephone, and mailing address so surely 6 years is plenty of time to dash off an email or make a phone call!”

yep sent a mail. Even met with him after this.

It’s a simple request. For 6 years he has held on to the reclassification of the surface stations.

1. All I need are the Class 1 and Class 2 stations, not all the stations.
2. I could even work with a sample (30-60) of those.
3. I am willing to sign an NDA and never publish any results using that data.

Still no.

So, as I predicted 6 years ago, the paper was published on the web. People still take the results
at face value. The key data will never be shared, yet people will still refer to it.

For a similar case see the Gergis paper, also retracted at the same time and subsequently published.

please describe the public funds that McIntyre & Monckton (and anyone else you want to reference to support your point) have utilized to acquire the data that “they wouldn’t share”.

then please go to the produce section (that’s where they sell fruit and veggies) of your local store, pick up an apple, carry it to the next aisle and plop it down in the middle of the oranges. see the difference?

No. There is absolutely no difference. This isn’t a matter of who “owns” the data, it’s a matter of the advancement of science. Scientists are part of a community that works together in the search for truth. Easy access to data and code is a fairly new issue primarily because of the internet, although the whole climategate fiasco put it into the public spotlight.

The idea that anyone should have access to someone’s data just because they pay taxes makes little sense. That’s like saying, since I pay taxes, I should be able to sleep at the White House.

have to agree strongly with steven mosher on this point. there should be no get out of jail free cards for anyone on any side of the debate. level playing field for all and the science will win out in the end.

“…..reproducibility often consumes a lot of time that doesn’t always have a clear benefit to the individual researcher, other than altruism, he said….”.
Well, it does have a clear benefit to those that wish to use the information.
Nothing like knowing something has a sound base, Gavin, and is not just opinion. Or have you forgotten the lecture on Scientific Method 101?

There seem to be many personal attacks on Mr Schmidt in these comments. That isn’t right, fair or even sensible. As reported here, his action demonstrates a respect for correct procedures and may have been done at the cost of vilification by the “never mind the truth, feel the propaganda value” brigade. Credit where credit is due, even from someone with whom you disagree radically

Gavin is a shill and a coward. He uses the term denier to attack skeptics and would not share a TV set with a “denier”. Gavin is being paid with my tax dollars. I’ll call him what I want until he behaves like a professional and a gentleman.

I like the fact that you’ve pointed out that it is not right to attack Dr. Schmidt. At the same time, I think it’s highly unlikely that his actions “may have been done at the cost of vilification by the ‘never mind the truth, feel the propaganda value’ brigade.” I’m not even sure what brigade you mean, or that there is one that says, “never mind the truth” unless it’s the skeptics who deny there is evidence of AGW, and that it’s a problem.

john, i think steve has a fair point. gavin works for the american tax payer. i always laugh when i hear the term government employee when the government are employees themselves. it is an easy mistake to make given the attitudes of those in government the world over.

Perhaps Gavin could start his “transparency” initiative by ordering NOAA to reinstate/ update their graph entitled “DIFFERENCE BETWEEN RAW AND FINAL USHCN DATA”, which NOAA deleted from their website in June 2017 ( here is Wayback-machine archive capture):

Moreover, this deleted graph only showed NOAA/NASA’s raw data tampering up to 1999, and I’d love to see how much heat they’ve added since 2000, especially following the major KARL 2015 “Pause-Buster Fix”…

The fundamental problem with the CAGW sc@m is that it doesn’t follow the rules of the scientific method, which demand that scientists adjust the HYPOTHESIS when empirical data don’t support hypothetical projections… Conversely, CAGW “scientists” adjust the empirical evidence to support their hypothetical projections, which is the definition of junk science.

Good choice. I support your call. Adjustments are a highly controversial action when so much rides on the adjusted result. Mosher says all the adjustments are justified. Fine, then there is absolutely nothing wrong with showing them, with an explanation for each so they can be peer reviewed. Let’s get everyone on the same page with facts and evidence.

Personally I am waiting for evidence that the water vapour feedback factor exceeds 1.05.

Yes. I’ve got a real problem with adjusting past data because of current data, especially past data that were manually recorded. And any station’s past data that have been adjusted multiple times are especially suspicious. Once adjusted, there should be little reason to do it again.

How about we make every profession equally transparent? Then we can all pick apart the way lawyers and bankers and concierges and plumbers do their jobs. We can say, I own stock in this business, so they have a duty to show me every transaction they make. A totally transparent society, so we can tell if people are doing their jobs correctly. Heck, being a defense lawyer in a criminal trial can’t be that tough; maybe we could show them where they go wrong.

Do you see what I’m getting at? We are not qualified to determine whether scientists are doing their job. Data adjustment is ESSENTIAL in cases where there are systematic biases in the data. It is not simply a matter of making a graph and looking for oddities; there are algorithms used to pick up peculiarities in the data, and adjustments are validated afterwards.
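For the curious, the kind of algorithmic adjustment being described can be illustrated with a toy sketch. This is nothing like any agency's actual homogenization algorithm; it is a minimal, hypothetical example of detecting and removing a single artificial step change in a series (say, from a station move), with all values invented:

```python
# Toy homogenization sketch: find the largest step change in a series
# and remove it. Real pairwise-homogenization algorithms are far more
# involved; this only illustrates the idea of a documented adjustment.

def detect_step(series):
    """Return the split index where the jump between the means of the
    two halves of the series is largest."""
    best_i, best_jump = 1, 0.0
    for i in range(1, len(series)):
        left = sum(series[:i]) / i
        right = sum(series[i:]) / (len(series) - i)
        if abs(right - left) > best_jump:
            best_i, best_jump = i, abs(right - left)
    return best_i

def adjust(series):
    """Shift the later segment so its mean matches the earlier one."""
    i = detect_step(series)
    left = sum(series[:i]) / i
    right = sum(series[i:]) / (len(series) - i)
    offset = right - left
    return series[:i] + [x - offset for x in series[i:]]

# A flat series with an artificial +1.0 step halfway through
# (all values invented for the example).
raw = [10.0, 10.1, 9.9, 10.0, 11.0, 11.1, 10.9, 11.0]
homogenized = adjust(raw)
```

Publishing code like this alongside the adjusted series, with the detected offset and the reason for it, is exactly what would let others validate each adjustment.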

People have a very peculiar habit of assuming something has not been examined or reported in climate science, although they don’t ever look for it. That’s fine that they don’t comb through the scientific literature – that’s what scientists are for.

If you skeptics want to question and critique science and play scientist yourselves, fine. But remember that this is why you are known as deniers. You deny the science. You don’t trust scientists. You think you know who they are, as if seeing into their hearts and minds allows you to make all kinds of assumptions about them: that they suffer from “groupthink” or socialist ideology or whatever. You think you would do their job better than they do. So you side with the few but vocal scientists who don’t agree about the extent of or reason for global warming, but are united in their support of the fossil fuel industry. Many are associated with conservative think tanks that were paid by FF. Some scientists were directly paid by FF, some through front organizations. So, um, why is their science better and why are they more trustworthy?

Already said this, but the point has nothing to do with whether any of us are scientists. It is that scientists can check each other’s work, even if they don’t agree with each other or maybe don’t even like each other. Not just fat balding middle-aged guys with weird circular goatees checking each other’s secret work.

Kristi: No, none of us see what you’re getting at. Please explain it, and don’t leave any thought unexpressed. We’ll soak it all in. Make it really epic, just let it rip, and it’s bound to sway us all.
Better still- don’t.

I feel it rather ridiculous that those working in the scientific field have so corrupted their own sphere by misuse of data and methodology that it takes a political figure to bring about a change in attitude. Kudos to Mr Pruitt who has introduced sanity to the EPA. Let us hope the data and analysis used to determine public policy will in the future be free of any such concerns.

… Schmidt is concerned that providing enough data and method to ensure reproducibility will distract scientists from research.

He’s absolutely right. That will probably happen … one time in ten thousand.

The computer programming equivalent is writing brilliant highly efficient code with no documentation or comments. Worse than useless.

The brilliant cowboy paradigm for computer programming usually just hides hacking. It might work but the programmer can’t explain exactly why it does. The resulting programs break and can’t be maintained, even by the cowboys who originally wrote them. That approach is actually quite bad for productivity.

Similarly, science has a replication crisis. Much of the time, scientists can’t even reproduce their own experimental results.

Science will be much better off when scientists start doing the job right. We also have to get rid of the perverse incentives that reward bad science. link Productivity will improve if for no other reason than that garbage will be easier to detect and remove.

On the other hand, one of the biggest time wasters in science is the grant writing process. link Fixing that would provide the time necessary to do the science right.

From a layman’s perspective, surely the cost of the computer/data work ought to be included in the initial costings of the project. That in turn ought to be scrutinised and assessed relative to the project objectives.

Great link to the perverse incentives by the way. Almost from para. 1 it explains why scrutiny isn’t undertaken. Very bad news for the scientific community.

The other thing is work habits. If you’re organized you avoid a lot of wasted effort. If you do things properly in the first place you won’t have to scramble to make your data and code presentable after the fact.

I’m not sure that climate science has evolved to the point where a lack of reproducible results is its biggest flaw. Its biggest flaw is the sheer amount of subjectivity that infects the field, which is why the IPCC has to rely on “opinion” and “judgment” to “interpret” the research in a way that gets to the critical issues of how much warming is due to man and what the quantifiable impacts will be. Reproducibility presumes objective results that need to be reproduced via experimentation. Almost all of the climate “research” I’ve read forms subjective conclusions based on a whole series of subjective assumptions about the data collected, or the best way to analyze and/or “correct” data.

In fact, the very definition of “climate” is nothing but a set of abstract, easily manipulable statistics, with no standardized metrics for quantifying those statistics. Do a Google search and try to lock down the period of time that temperatures, for example, have to be averaged before you get “climate” instead of “weather.” You’ll find a lot of weasel language like “between decades to sometimes centuries” or “typically thirty years.” It’s almost as if the scientists want the flexibility to find the results they want. But in any case, when no one has bothered to define the set of essential statistics that make up “climate” and how to standardize the units by which “climate” is quantified – surely a necessary step to accurately determine how climate is changing – how are you going to “reproduce” climate research?

Real scientists try their best to insulate their opinions from their research when trying to quantify how much influence something has on a system. That’s why we have double blind studies. Climate scientists, however, wallow in their unscientific opinions, illogically bootstrapping their own fictional expertise in the Earth’s climate system to compensate for their inability to produce any kind of objective performance.

The fundamental issue with current climate science is that the idea (not even a proper hypothesis, since other elements of the system and their effects are unknown) requires a level of certainty not present in the data. So immediately you have to massage the data, and hence you are performing an a priori hypothetical exercise, i.e. mathematical masturbation.

The chestnut I hear is that that’s because it’s an observational science. Okay, then get good observations, or tell everyone up front that you can’t, so that we don’t have to listen to you rant about the End of the World.

It isn’t a scientific issue anyway; it’s an advocacy issue with shades of Marxism. People don’t like to hear that either but if the shoe fits.

Yes, reproducibility has the cost of slowing down pursuing new experiments. However, it’s a necessary cost to prevent research going down ‘blind alleys’. The only way to speed up scientific research isn’t to limit reproducibility, but to make it quicker and easier to do. That’s best accomplished by making all the data and methods used in all research readily available.

Is this the start of the ‘big climb down’? Low-sensitivity papers published, and now Schmidt calling for something his ex-boss actively fought. Maybe the wheels have finally come off the wagon; let’s hope so.

Slightly off the main track, but still in the general area of scientific ethics and release of data. I have my own opinion on it which I will reserve for the moment, as I would like to see what others here think. (This subject was once a very lively discussion on Baen’s Bar, a science fiction forum.)

Are there cases where perhaps methods should be fully revealed – but the data gathered be ruthlessly suppressed, and not used due to the methods used to gather it?

I have in mind the nutrition experiments performed by “Doctor” Mengele in the concentration camps, and the similar ones performed by the Environmental “Protection” Agency for PM 2.5.

…it’s good to see how rapidly some members of the climate community are coming to accept that they have to start providing full method and data to back their research results.

Please. Virtue signaling and damage control do not impress me. Saying that you “support” transparency while whining about how hard it is to post raw data (presumably because D-niers would naively analyze the data without first making the necessary “adjustments”?) is not support at all.

Science is much less useful if it cannot be relied upon. The complaint that making science useful comes at a cost strikes me as weak. I could use the word “embarrassing” but I think I’ll go with “beclowning.”

Science can never be settled, or secret. It has to be open to review, debate, and the acceptance of different interpretations. That way it can only get stronger and be more acceptable to people generally. The more open and frank it is, the better for everybody, so long as we are all promoting the truth and not following some political diktat. One of the strongest reasons why I always fought against the “science” of climate change is that it was closed to scrutiny, and to me that smacks of the old freemasons: if it doesn’t stand up to examination, then it is not trustworthy. You cannot vilify people just because they believe differently; that is draconian and not the least bit intelligent.

I am not sure where the ‘extra cost’ comes from. Let’s say a high-powered team of researchers each has a copy of the dataset on their own PC, and each researcher is running tests on the data. When they want a colleague to see a result, they will not expect them to type in a series of commands with the possibility of error. They will send them a script: a file with a few lines of text that performs the data analysis and produces some results or a graph. This is most important when researchers are working in different universities or institutions.
So, at publication time, just upload your data files and script. No cost, no time, no problem. Just upload the copy you sent to the journal.
No wonder climate science is in a bad way if trivial acts are deliberately made difficult or opaque.
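The scale of the “extra cost” being described is roughly this: a short, self-contained analysis script that a colleague (or journal reader) can run verbatim against the published data file. A minimal sketch, where the file name, CSV layout, and the trend computation are all hypothetical stand-ins, not anyone’s actual method:

```python
# reproduce_fig1.py -- hypothetical example of the kind of script a
# colleague could run verbatim against a published CSV data file.
import csv
import statistics

def load_series(path):
    """Read (year, value) pairs from a published two-column CSV file."""
    with open(path, newline="") as f:
        return [(int(y), float(v)) for y, v in csv.reader(f)]

def linear_trend(rows):
    """Ordinary least-squares slope, in value units per year."""
    xs = [y for y, _ in rows]
    ys = [v for _, v in rows]
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

# A colleague reruns exactly the same two lines against the published file:
#   rows = load_series("published_data.csv")
#   print(linear_trend(rows))
# Demonstrated here with a tiny inline series instead of the file:
demo = [(2000, 0.0), (2001, 0.1), (2002, 0.2), (2003, 0.3), (2004, 0.4)]
print(f"trend: {linear_trend(demo):+.3f} units/yr")
```

That is the whole artifact: one data file plus one script reproduces the figure, whichever institution the reader sits in.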

Steve, you are assuming honest, quality work. The things done to data and statistics to get them to give the results they want are messy. “Hiding declines” and the fudging require much trial and error. I predict a surge of climate-scientist retirements and an order of magnitude or two reduction in the number of papers published.

I think this requirement should be retroactive, with the option to provide the data and code or withdraw the published paper. Pruitt should announce that work published without data and code will not be given any consideration. This might be the key to overturning the CO2 endangerment finding and other regulations.

Yep. Follow the money. The journals will be among the entities hurt the most. Not only in “climate”, but also in many studies in other fields that use “global warming” as a crutch to prove their hypotheses.

It’s the typical bloviating by “experts” who are able to rely on the fact that 95% of the public can’t make a confident judgement about the validity of what they are claiming. Gavin knows that it stinks to high heaven that he wants to hide the data or the methods. He’s a political actor above all. So he “supports” transparency, but like Bill Clinton promising a tax cut that he never planned to deliver, we’ll soon find out that he tried as hard as he has ever tried anything in his life, but it was just TOO HARD. Sorry, he will probably also take FULL RESPONSIBILITY for the failure, and DEEPLY REGRET it. But chances are, the only story anybody hears will be that Gavin Schmidt is leading the drive for transparency. What a guy.

Let’s see: you could post a few gigs of raw data on Google Drive for free, along with your source code, which might reach a few hundred kilobytes. Now, you had to complete the source code in order to analyze the data, and you had to collect the data before you could analyze it. So help me out here, people. Where is this complicated? Click, click, done. Furthermore, if you can’t or won’t explain your logic for how you analyzed (or is the right term “tortured”?) the data, then how could anybody independently replicate the results?

They need a course in data management. Archiving the database and code as you go would even streamline research (if the study is a legitimate one). The real problem they are having is that it would be a discipline discouraging the stats and data iterations and weighting done in search of support for preconceived results and significance; this is messy work.

The sudden steep drop in papers published will be a very direct measure of the abysmal quality of all the stuff cluttering a profusion of journals.

If you do not understand how there are no experiments going on here, go back to the beginning and start over. I went to “open house” at my kid’s middle school and saw in his science classroom the scientific method, laid out in a series of posters on the wall. Thankfully, the teacher got them in the right order.

In these posters, a fledgling scientist with a hypothesis and some kind of beaker tests whether the predictions of the hypothesis emerge, or not, when the fledgling scientist MANIPULATES SOME VARIABLE UNDER STANDARD CONDITIONS.

No one is manipulating any independent variables, or dependent variables. Not Gavin, not Pruitt, no one.

“We” have shown up late to the game. The phenomena have already happened. We are just trying to piece together what happened. This is a historical investigation, not an “experiment.”

Wikipedia hits it out of the park with their opening statement under their “experiment” entry:
“An experiment is a procedure carried out to support, refute, or validate a hypothesis. Experiments provide insight into cause-and-effect by demonstrating what outcome occurs when a particular factor is manipulated. Experiments vary greatly in goal and scale, but always rely on repeatable procedure and logical analysis of the results. There also exists natural experimental studies.”

Why can I be a decent climate-science critic even though I never took geology or meteorology? Because I know science.

“Reproducibility is not free, it has a cost to the community because if you’re always spending time reproducing scenarios…”

The benefit of reproducibility is being able to distinguish between what actually is science and what is purported to be science. If John Lennon were alive to write about the state of climate science research and policy impacts it might go something like…

Imagine there’s no pikas,
It isn’t hard to do…
No polar bears or penguins,
Except in a zoo…
Forests turned into wood pellets,
To limit CO2…

And my own two penn’orth…
…as others have observed, replication is the larger cost. But doing it is a choice; it is not mandatory.
In fact it may be free: I can see (so-called) citizen scientists even treating it as “fun”.

Saving your data & code is a very modest overhead, as is making it available.
The former becomes second nature to anyone outside academia. Indeed, without it, you or your employer may be liable should something go wrong. It also helps to get it right next time.
The latter is usually already there: most educational establishments have web access for stuff such as course notes.

This comes as no surprise. It is becoming more and more evident that much science (medical, social, climate, etc.) has broken down. One of the main reasons is government funding and the acceptance of poor scientific methods. It is more important to get the next round of funding than to take the time to make sure your current science is correct. The plethora of statistical software, data manipulation, and computer models rather than physical results has made it easy to obtain made-up results and publish them regardless of whether they are real or not. Too many so-called scientists are really THEORETICAL climate scientists (just like theoretical physicists) who work on a blackboard but never do the physical experiments to prove their theories. Politicians don’t care as long as they have something in their hands to justify spending money on.

The focus on a global temperature rather than global heat is indicative of how perverted the science has become. Just for fun, play along and assume a reasonable global temperature is developed. Will this really tell you what the amount of global heat truly is? Will it tell you what the ‘climate’ in the Sahara, Andes, or Steppes will be? Climate science has jumped over a large number of basic steps in determining how the Earth works, yet wants folks to think its hypotheses are all correct. Would any self-respecting physicist have accepted there truly being a Higgs boson based solely upon computer models?

He’s going to need quantum computers now to stay ahead of the model and data checking resulting from integrity-disclosure rules. I predict much more complicated models consuming ever more of the computing hardware budget in this complexity-for-deception arms race. The IRS computing budget requests will pale in comparison.

As well as the ability to replicate an experiment, the other big quality-assurance factor is measurement uncertainty.
The error envelopes one sees so often on graphs are almost always only a partial expression of uncertainty. In a lot of climate work, if the realistic, full uncertainty were estimated and shown, so much data would simply rattle around within those limits that typical exercises, like trends of temperature over time, would become meaningless. All within the error bounds.
There are formal structures for the treatment of errors, a prominent one being from the Paris-based BIPM, the International Bureau of Weights and Measures. I have never seen a reference to the BIPM in a climate paper, though there must be some mentions that I have missed.
Proper error bounds can be embarrassing. You write your paper, play with your conclusions, then see that the bounds are so large that you have no paper at all.
However, if you are a climate researcher, you often set down what you want to prove, find some data to fit the preconception, then do rote service to a simple error estimation which you might or might not bother to show.
Consequently, a large number of climate papers would not pass peer review if realistic error bounds were correctly derived and shown. There is a nice feature of proper error bounds: they are of great assistance in separating gold from dross. Hence, many modern authors try to evade them. Geoff.
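The BIPM formalism referred to here is the GUM (Guide to the Expression of Uncertainty in Measurement): independent standard uncertainty components combine in quadrature, and an expanded uncertainty uses a coverage factor k (k = 2 for roughly 95% coverage). A minimal sketch; the uncertainty budget numbers below are hypothetical, purely to show how a full budget can swamp a small signal:

```python
# GUM-style combined standard uncertainty: independent components add in
# quadrature; expanded uncertainty multiplies by a coverage factor k
# (k = 2 gives roughly 95% coverage for a normal distribution).
import math

def expanded_uncertainty(components, k=2.0):
    """Root-sum-square of independent standard uncertainties, times k."""
    u_c = math.sqrt(sum(u * u for u in components))
    return k * u_c

# Hypothetical uncertainty budget for a single historical temperature
# reading, in kelvin: instrument, siting/exposure, and adjustment steps.
budget = [0.2, 0.3, 0.25]
U = expanded_uncertainty(budget)
signal = 0.3  # a hypothetical 0.3 K change over the record

print(f"expanded uncertainty U (k=2): +/- {U:.2f} K")
print(f"a {signal} K signal {'is' if signal < U else 'is not'} inside the bounds")
```

With those illustrative components the expanded uncertainty comes out near ±0.9 K, so a 0.3 K signal “rattles around” entirely within the bounds, which is exactly the embarrassment described above.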

For example. The hockey stick can be replicated. The problem is that red noise also gives the same result. This shows the method is not responding to proxy temperature. Rather it is a fiction of methodology.

“President Barack Obama repeatedly pledged he would run the most transparent administration in the history of the United States during both of his presidential campaigns, but the evidence shows Obama’s administration has not only failed to meet that standard, it has actively worked to conceal important information from the public.”
================
Lots of things need to be opaque; just talk to us like adults.

What can they say, that we don’t agree with transparency? What they do say is that it’s extra work and can be difficult to do. Therefore I’m sure it will be difficult to do accurately, and inaccurate data is worse than no data.

“Reproducibility is not free, it has a cost to the community because if you’re always spending time reproducing scenarios, experiments that other people have suggested are interesting, then you’re not exporting something that you thought was interesting,”
This translates as:
“So if your analysis is wrong, it’s better to move on to something else than to have your errors brought to light.”