There may also be something beyond simple “engineers vs. scientists”
tension behind Bates’ decision to go public with his allegations. Two
former NOAA staffers confirmed to Ars that Tom Karl essentially demoted
John Bates in 2012, when Karl was Director of NOAA’s National Centers
for Environmental Information. Bates had held the title of Supervisory
Meteorologist and Chief of the Remote Sensing Applications Division, but
Karl removed him from that position partly due to a failure to maintain
professionalism with colleagues, assigning him to a position in which
he would no longer supervise other staff. It was apparently no secret
that the demotion did not sit well with Bates.

34 comments:

This comment by Tom Peterson seems to indicate something similar: the difference between scientists, who think scientific advancement is key, and software engineers, who think rigorously tested software is more important.

> software engineers, who think rigorously tested software is more important

Have you met any software engineers?

We certainly have a different attitude from scientists. We actually like our code and we like to make it nice. But the competent ones understand the need to trade off functionality, testing, and time-to-market.

It rather sounds like Bates was hiding his incompetence behind a need for perfection.

I used to be an engineering authority for a large multinational, which meant I had to make sure senior technical personnel and managers followed standards and regulations. I used to meet a lot of objections from personnel who wanted to cut corners, did shoddy housekeeping, wanted to hurry up, etc. I suspect Karl was like one of the black sheep I had to discipline, so the guy really doesn't get my sympathy. He was in a hurry because he wanted the Karlized data set out before Paris. And that's about it.

I was trained mostly as a scientist, but have always worked as a software engineer, most recently for several years as a core member of a NASA Software Management Office. In over a decade of working on NASA software, I and my teammates have always consciously tailored our design, development, testing, and deployment processes to suit the particular and explicit goals/requirements/desirements of each project.

For my desktop software for flight controllers monitoring and managing telemetry from crewed space vehicles (Shuttle, ISS, Orion, etc.), the path from conception to final deployment certified for live use took years, because the risk of loss of human life had to be kept very low. Despite that long and laborious total process, most of the releases before the final one went through much-abbreviated processes, because the goals of those preliminary releases included getting feedback quickly. We did not waste time on elaborate quality and archiving processes for releases that we were nearly certain contained bugs that would be found almost immediately by the flight controllers trying them out with fake or old telemetry.

Most scientific projects are like my project's preliminary releases. Part of their quality assurance is publishing the results for other scientists to evaluate and attempt to replicate. Spending a huge amount of time on internal quality processes usually is much less efficient and effective than using that time for public review. Each organization, and ultimately each project team within that organization, must make its own decision about how much process is appropriate. It sounds like Karl et al. made a completely reasonable judgment, whereas Bates never would have survived in my NASA Software Management Office because of his lack of consideration of context.

Great article describing an explanation by Tom Peterson: https://arstechnica.com/science/2017/02/article-names-whistleblower-who-told-congress-that-noaa-manipulated-data/

Bates's background was software for running spacecraft. That could be either "flight" software embedded in the spacecraft, or "ground" software in mission operations. In either case, the software development processes for such software are much more rigorous than are those for software such as climate data analysis! (See my previous post about my experience in NASA.) Purportedly Bates was dead set on using the more rigorous processes despite their inappropriateness.

Bates could just be a perfectionist. I use the term negatively, but without implying conscious bad will: a true perfectionist irrationally improves the things he is concentrating on, to the detriment of other things he is not, but could and likely should, be focused on.

Hmmm... as opposed to a non-true perfectionist? I guess I would say this is someone who really tries hard to do their best, but can rationally see competing objectives and ration their time accordingly.

Tom Dayton's argument that projects differ in their software requirements is exactly right; his projects are examples of the general case.

Bates' methodology error is quite familiar:

Inside Bell Labs, projects like the Safeguard ABM system, big switches, or big operational databases had really heavyweight QA, source code control systems, and so on. And there were always problems when managers from such projects were given projects where that didn't make sense, like the analysis systems for phone calls or trouble reports that I managed. Of course Area 10 (basic research) never used the heavyweight methods.

In 2002 I repeated two software engineering talks I'd given from 1977 to 1983 inside Bell Labs, for ACM National Lectures, and at other venues. Both of them cover the issue of choosing appropriate methodologies. The first was what I used when helping teach the internal BTL software engineering project management course.

Hmm, I notice none of the skeptics have challenged the data in Karl et al. Could that perhaps be because Zeke H and others have shown that it is more consistent with all the other independent measurements? Climate scientists are so unfair and greedy. Once again, they take the position where all of the available evidence supports them.

Here's Potholer doing his usual thorough job; FL and others who like the message and ignore reality would do well to watch and listen before they make blindfolded accusations: https://www.youtube.com/watch?v=kQph_5eZsGs

This is one way to put it, from Scott Boulette (@AlgoScott):

> Alt facts peer reviewed by politicians, what could go wrong?

Meanwhile, for a cauld grue, here's politics before people personified:

Peter Thorne just published an excellent explanation of why, what, and how NOAA is updating its sea surface temperature database. Quite nefarious: https://www.carbonbrief.org/guest-post-why-noaa-updates-sea-surface-temperature-record

Rabett Run
