Posted
by
timothy
on Sunday July 24, 2011 @02:45PM
from the is-that-how-you-see-things dept.

theodp writes "John D. Cook points out there's a major divide between the way scientists and programmers view the software they write. Scientists see their software as a kind of exoskeleton, an extension of themselves. Programmers, on the other hand, see their software as something they will hand over to someone else, more like building a robot. To a scientist, the software soup's done when they get what they want out of it, while professional programmers give more thought to reproducibility, maintainability, and correctness. So what happens when the twain meet? 'The real tension,' says Cook, 'comes when a piece of research software is suddenly expected to be ready for production. The scientist will say 'the code has already been written' and can't imagine it would take much work, if any, to prepare the software for its new responsibilities. They don't understand how hard it is for an engineer to turn an exoskeleton into a self-sufficient robot.'"

The whole premise is stupid anyway. I've worked with plenty of scientists in national labs who turn out production-grade, maintainable code, and plenty of programmers who didn't. The core issue is getting people who write code for reuse by others to follow guidelines, regardless of title or profession.

It isn't stupid at all. Lots and lots of scientific software is written by grad students who worry about the results and don't care about the quality of the code itself. Their idea of "good code" bears no relation to what a programmer who's worked in a production environment would call "good code". And invariably they decide to include libraries from other grad students at other institutions, of equal or lesser value. And don't even get me started about documentation...

The bulk of scientific research is done by grad students (or others like them, on various kinds of scholarships). The professors whose names are at the end of the paper's author list guide and oversee what is done, but don't have time for the daily grind of research. Their main job is to teach and get funding.

PhD students at tier 1 and 2 research universities are basically bottom-rung scientists-in-training (sometimes with UGs below them). For our first year or two we'll take a class or two a term, but the bulk of our time is spent doing research, writing and reading scientific papers, and presenting at conferences. For the last 3-4 years we typically take no classes and spend all our time doing research and teaching. We're professionals who make $20-$30k/yr depending on the location, plus full benefits and tuition waivers for any classes we do take. Workload expectations are typically higher than for entry-level positions in industry (50-80 hours/wk, depending on the field and PI), and pay is obviously worse. The postdocs and professors do do some of the research themselves (especially when younger), but for the most part their time is spent setting the general direction of the research and applying for grants to fund it, doing the work (for free) of reviewing and organizing journals, and of course teaching. Most of us are aware we won't be going on in academia after the PhD, and I at least am okay with that.

It's nothing like a master's or an undergraduate degree at all. We really aren't students in any meaningful modern sense of the word, aside from the fact that we'll get a degree in time. In Europe there are post-graduate degrees awarded after the PhD, so I guess you could call their postdocs "students" as well.

It's completely different outside STEM, however, with PhD students typically earning little to nothing and sometimes having to pay tuition.

Exactly this! It is not about the education of the people writing the code. It's about the purpose for which it is written. I've done it all.

As a scientist, most software that I write is geared to solving the problem at hand, nothing more. Sometimes this can be 10,000 lines of C++ code, at other times a short Python script or ten. Each time, the code serves as a sort of automation for something that needs to be done anyway (I could attempt to compute the simulation result myself, by hand with pen and paper, you know... if I didn't die of old age first ;) ). Often, not a single thought goes into how to make this stuff reusable, robust or more generic. It works on the one machine it is ever going to run on and very likely nowhere else, because it does not matter. What matters is the program output, not the program itself.

As a software developer I have to think differently. Software gets compiled, packaged and deployed elsewhere. It must run out of the box, never crash, give useful error message and recover cleanly if something bad happened. And amidst all this effort, there's a tiny bit of code hidden somewhere doing the actual work. All that matters is that the program behaves correctly, no matter what the concrete output is. I might not even be expected to understand what the output actually means - it's not my primary concern.

I work for a group at NASA. One of our group's tasks is to take scientist-written code and wrap it for distribution to hundreds of remote sites around the world. We try our damnedest to run the code as-is, but fairly often have to modify it to remove stuff like:

* Hard-coded input and output file and directory names
* Small and arbitrary length limitations on file pathnames - I've run into buffers that were declared as 53 characters in length, probably because that was what they needed on their system
* Large ...

A lot of scientific software is run less than 10 times, often only once. It generates the result, end of story (well, go away and understand what you got). There really is no point in extensively recoding for reuse, checking all the consts are const, etc. Documentation of the form 'does X using method Y (numerical recipes page P)' is often enough - i.e. a couple of lines of comments at the top of the file. It doesn't have to look nice, it just has to be correct. And don't even get me started on optimization

You know, every thread we have about HPC/parallel computing this comes up:

"We just need a compiler that magically autoparallelizes well."

Do you have any idea how difficult it is to write parallel code for any non-trivial application? It's hard enough to get the OpenMP/OpenMPI "magic parallel" options to work worth a damn when compiling Fortran or C. I've spent _months_ getting about 10 pages of code to not only run CFD/MHD on nVidia GPUs, but actually reach a sizable fraction of theoretical performance.

The problem with that is recognizing what code is going to be reused by others and what isn't.

I'm an aerospace engineer who writes a lot of code (and does so on the taxpayer's dime), and it is a struggle to find the right balance between getting something functional for the immediate task and recognizing what will be useful for others later. Since it's much more difficult to write the second variety (particularly if it needs to be generalized for as-yet unknown tasks), it's just as important to perform some ...

The whole premise is stupid anyway. I've worked with plenty of scientists in national labs who turn out production-grade, maintainable code, and plenty of programmers who didn't. The core issue is getting people who write code for reuse by others to follow guidelines, regardless of title or profession.

Being able to point to a few (very few) exceptions does not make the story untrue in the vast majority of cases.

Scientist code is usually a giant JUST-SO story, sufficient to derive the results they need for the task at hand. They either don't have, or avoid putting in, data that will crash the program, so limit checking is not necessary. Crashes are fine if they do nothing more than leave a trail of breadcrumbs sufficient to find the offending line of code. Output need not be in final form, and any number of repetitive hand manipulations of either the input or the output are fine as long as the researcher does not need to spend more time writing any more elaborate code.

This is perfectly fine. The cabinet maker makes jigs. They are designed for their own shop and no one else has exactly the same saw and exactly the same gluing clamps. When the cabinet maker sells his shop, these jigs become useless. Nobody else knows how to use them.

The scientist who takes the time to do a full fledged, fully documented, maintainable, fail-soft package for analysis of data that is unique to their project and their apparatus is probably not doing very much science, and probably not doing their intended job. That budgets force them into this situation is not unusual.

It happens every day in industry, academia, and research. To hand-wave it away by saying you know someone who delivers the full package merely calls into question your own understanding of what a complete, fully documented, maintainable, transferable, and robust software package means.

Scientist code is usually a giant JUST-SO story, sufficient to derive the results they need for the task at hand. They either don't have, or avoid putting in, data that will crash the program, so limit checking is not necessary. Crashes are fine if they do nothing more than leave a trail of breadcrumbs sufficient to find the offending line of code.

Funny — this could as easily describe how physicists often write mathematics.

In this paper [aps.org] (the paper itself is here [caltech.edu]), Feynman notes that

"The scientist who takes the time to do a full fledged, fully documented, maintainable, fail-soft package for analysis of data that is unique to their project and their apparatus is probably not doing very much science, and probably not doing their intended job."

That's very debatable.

Total number of citations can be a big deal in academia, and some of the most-cited papers in existence are tool papers.

Scientist code is usually a giant JUST-SO story, sufficient to derive the results they need for the task at hand.
They either don't have, or avoid putting in, data that will crash the program, so limit checking is not necessary.

Welcome to the worlds of in-house, bespoke and embedded software engineering. This issue is not limited to scientists - in every company I have ever worked at, "getting it done" was more important to management than "code quality".

"I've worked with plenty of scientists in national labs that turn out production grade, maintainable code; and programmers who didn't."

Do you know scientists who turn out production-grade, maintainable code even if they weren't specifically asked to? If you're writing a program to solve a specific part of the problem you're working on, and are only going to use it that one time (happens often in scientific fields and engineering), why would you bother to make it neat and maintainable, or even understandable?

To a scientist, their software is simply a tool, a means to an end. Their results and discoveries are what they really care about. When it comes to reproducing scientific results for verification, it is actually advantageous that another group not use existing software. Another research group using the same faulty software, with the same hidden bugs, will likely come to the same incorrect result.

Productization of software is a completely different exercise. You have to make your software work for a larger crowd on a plethora of devices. You actually have to consider how your software fits into the larger product lifecycle. The key difference here is that you have customers that you need to keep happy.

This tells us that the academic view of software is incompatible with the commercial world. So all that teaching CS in universities does is train CS graduates to think the same way - that the code is the product. This goes a long way towards explaining why there's so much poorly documented, badly explained and crappily designed stuff out there. Because the people who write it have never been educated in the importance of productising it.

While that shortcoming can be overcome in a commercial organisation ...

That's a very narrow view. An artist makes paintings but also knows how to paint a wall white. An artist knows how to draw, but can also sketch to express an idea.

I am an academic, and most of my code is thrown away after use. (Well, it is actually stored, in case I need it later. But I usually don't.) Moreover, most of my code is just there to tell me whether a given approach works or does not work. Once I know, I don't care about the code anymore.

A lot of the stuff I see/help write is for modeling a physical product; the code is simply a way to guess how it will react in the real world, within some margin. The code is not the product, it simply helps sell the product. These products might change in such a way that the model needs to be rewritten maybe every 10 years. By then the original writer is gone, three versions of Microsoft Office have come and gone, as have 6-15 versions of the company-wide selection software.

Failing to put in the effort that makes code maintainable during its construction incurs a notional "debt" which the software carries with it. Future developers working on the code "pay interest" on this debt in the form of time wasted on understanding and modifying the crappy, undocumented code, or on fixing bugs that wouldn't have been present if the code were better. Sometimes, those future developers may decide to spend time refactoring, building tests or documenting, and those cleanups pay down the "principal" on the "debt". After their cleanup work is done, future work has smaller interest payments (less effort for the same results).

Startups often deliberately decide to incur great amounts of technical debt on the theory that if the revenue starts flowing in they'll have the money to fund paying it down, but if they don't start getting some money the whole company will evaporate.

For scientific research, it's pretty clear that it also makes sense to incur lots of technical debt in most cases, because there's little expectation that the code will be used at all once the research is complete. Even when that's not the case, I think few scientists really know how to create maintainable software, because it normally is the case. I don't see a lot of scientists spending time reading about code craftsmanship, or test-driven design, or patterns and anti-patterns, or... lots of things that at least a sizable minority of full-time software engineers care a lot about.

I guess the bottom line, to me, is that this article is blindingly obvious, and exactly what I'd expect to see, based on rational analyses of the degree of technical debt it makes sense for different organizations to incur.

That's a great post, of the kind that saves me a lot of typing. You covered the first-order considerations brilliantly.

What you missed was technical debt blindness, which has been around since forever. Books I read around the time of the Mythical Man Month talked a lot about maintenance syndrome: that the original development team would be regarded as brilliant for producing working functionality at tremendous speed (undocumented, with no error handling for edge cases), then the first maintenance team would all be fired as underachievers for adding hardly any new functionality in the first year or two.

Turns out it's hard to erect a machine shop over top of adobe mud brick construction without adding some reinforcement to the structure, which usually takes a lot longer than the entire original edifice.

You can instead take a wrecking ball to the first iteration, but this rarely works out as well as hoped. You end up with far more ambitious adobe mud construction built with a whole new generation of unproven tools. At some point you have to bite the bullet and ferment what you began with.

People hide debt blindness behind widely divergent construals of simplicity, where "simple" usually turns out to be a euphemism for any decision that sidesteps paying down debt in the short term.

For professional software engineers, there is one true simplicity to rule them all: generativity and compositionality. Can you build the next layer on top with any hope of having it work and able to support an ongoing stack? For us, it's a long term game of pass the baton. For everyone else (management, scientists) the endgame is to cash out, and take credit elsewhere (e.g. publication biography).

Unfortunately, a citation is not a formal linkage that the compiler either accepts or rejects. By the standards of compositionality, citation is payment in dubious coin. Citation is not falsifiable. Scientists still count their citations even when they come from papers that are full of crap, peer review notwithstanding. For a professional software engineer, when you start instantiating objects from one library inside an abstract expression template library, you come face to face with compositionality in a way that few scientists can even imagine, having been weaned on the outrage of being improperly cited.

Technical debt blindness on the part of management quickly turns a software engineering shop into a highly non-linear fiasco. We've all seen this.

Somehow this game works out better (for the participants) when played by bankers with leverage debt. But now it's my turn to pass the baton, since that deserves a whole lot more typing and I've done my bit.

This assumes people are very clearly either an engineer/programmer OR a scientist. But I would consider most software engineers to be computer scientists as well. It's a fairly nonsensical distinction. The analogy to Spider-Man and Doc Ock is fun, but ultimately metaphors don't prove anything.

"Programmers need to understand that sometimes a program really only needs to run once, on one set of input, with expert supervision. Scientists need to understand that prototype code may need a complete rewrite before it can be

The point is that it's a one-way street. Software engineering is a specialization of engineering science, but most scientists aren't software engineers. A scientist can create the embodiment of an algorithm representing a solution to their problem, but doesn't think of it in terms of the qualities of reusability, modularity, interface, coupling, cohesion, exception handling, security, data integrity, etc. And they aren't supposed to: they're trained to understand biology, botany, physics, or whatever their field happens to be.

Then these are still just tips that EVERYONE who writes programs should be aware of. Don't over engineer, don't worry too much about efficiency at first, premature optimization is the root of all evil. On the other hand, be aware that readability and maintainability can be very important in code if it is going to be used more than as a one-off, or if it is going to be used by a team.

I work with Monte Carlo code and statistical analysis software. I use CERN's ROOT package for the stats analysis, CERN's GEANT4 for the MC code, and *nix scripting when I need to handle multiple files. Every single piece of code I write is written for a purpose. That purpose is generally to generate data and then analyze it. The only other people who are going to see it? Maybe my supervisor, and, if I'm just in on a contract, maybe the guy who has to work on my code later. But to be blunt, that doesn't matter. All that matters is that I know what's going on.

That being said, sometimes I write software for my own personal use. There, I tend to write more robust code, trying to follow various programming standards. Because I figure, if I write something for myself that turns out to be fairly useful, someone might want to use it, or adapt it. But professionally, all my code needs to do is get out that table or prepare that figure. Is it sloppy? Yes. Does it get the job done? Also yes. Fortunately, not only is my field esoteric, it's also government work, so it's practically a guarantee that my code will never have commercial release.

Based on my experience, the amount of work that I put into creating quality code depends on the task at hand. When I know that the script or software will only be used once or twice (prepare the graph, etc.), it's not worth putting a great amount of work into making it reusable. In these circumstances I mostly adhere to the Klingon coding rules [smart-words.org]: "A TRUE Klingon warrior does not comment his code!"
Now, it should be noted that sloppy code means the usability is utter shit, and should not be confused with ...

You can often tell whether someone is "programming as a means to an end (of your own)" versus "programming to build a tool for someone else". For instance, I have experience in the financial industry. Quite a lot of traders see coding as a means to implement their cool new model. Looking at their code, you can often tell. It's as if everything was built to just exactly fulfil the requirement, with no thought to the fact that those requirements might change. But of course, they do change. So you get hacks and workarounds, and cut'n'paste cargo cult code. Kinda like what those Orks in Warhammer 40K might make. And of course the problem with spaghetti code is that if you write it, nobody can ever help you solve problems/improve it. It's the coding equivalent of painting yourself into a corner. There's loads of smart traders out there with an excel spreadsheet that actually is an extension of their personalities (In fact it's their Magnum Opus. Everywhere they go, they try to take this quirky little file with them). Every little hack is something only they can explain (comments, yeah right. Do your body parts have explanatory comments?) and only they can fix if wrong.

On the other hand, you sometimes hire a guy who is a programmer but knows nothing about the domain. Very good with OO models and that kind of thing, but you have to teach him everything about finance: what's a settlement date, what kinds of options exist, etc. You get what you ask for, because they know how to turn problems into object models, but you have to ask VERY carefully. And teach. Unfortunately, not everyone has time for that, and so you end up with something that still doesn't quite do what it's supposed to.

So you often end up with guys who understand the problems, but can't program, doing the programming. And guys who can program writing the wrong program.

Many of those people who "can't program" actually can program. They simply understand the program's requirements. Maintainable code is not always a requirement since a lot of software written in research labs is intended to be written once and run a handful of times.

It's also worth noting that properly structured code from a programmer's perspective is not always the same as properly structured code from a scientist's perspective. "Turn(ing) problems into object models" may be the last thing that scientists want.

You get what you ask for, because they know how to turn problems into object models, but you have to ask VERY carefully.

As a developer (and one in the financial industry at that), if there was one single thing I could suggest that would have the biggest impact on software development, it would be this:

Enforce a 'Good Requirements' only policy.

This may require a lot of training, and a great deal of rejected requirements for not meeting the standard, and cause great grief to those who think their 15-second explanation of an 80+ man-hour feature implementation is sufficient.

Isn't this short-term thinking endemic to the financial industry in the first place? I mean, programmers joke about unmaintainable code as personal job security -- but hard-charging finance guys would actually act on that.

I assume they understand the incentives of their job, and have the training and/or personality to ruthlessly focus on that. They have bonuses and commissions to collect, yes? They get no additional income from well-written code, or from something that assists their replacement next quarter ...

This is hardly unexpected. The code needed to process data from science experiments can be years in the making by one or few persons sculpting it to do the job they need done. It might be a bit much to say that it's throw-away code, but once the paper is out the door it probably won't see much use again.

All of this, combined with the fact that the coders are scientists and thus aren't concerned with UI issues and whatnot, makes it so it may take a lot of manual intervention at various steps to use the software.

*shrug* Different tools for different purposes. I'm a graduate student, and writing good, scalable, maintainable code would take far more time than I have, and is not what I'm paid to do. I'm paid to produce results. I've worked in industry before, building huge infrastructure systems with many other people, and that calls for a completely different product. I'd go so far as to argue different skill sets. But speaking as someone who knows how to write "good" code, it's a waste of time for most academic applications.

As a university researcher in applied game development I pretty much work on abstracting and generalizing *finished* software.

I usually do this: I spend between six months and a year building a game according to some technique, framework or new language I am researching. The game is then finished, published and even sold. Then a paper is written describing the technique and its impact. Lather, rinse, repeat.

This is just anecdotal experience, but in this day and age of shrinking research budgets it is not ...

In software development, things have become more and more planned, predicted, and tested over the last few decades. Something that was more or less an art is becoming a set of established techniques, so software development is becoming more and more an engineering task. On the other hand, software developers and designers are always trying to use new stuff, because the problems of today cannot be solved with the technology of 10 years ago.

I was at a software engineering workshop on modeling and domain-specific languages ...

The days of long-lived software are pretty much gone. There are a handful of companies that still maintain the programs they've written a long time ago, but most programs written today are written quickly and dirtily, to spring up one day and fall into oblivion the next. "Apps" are little more than short fads that come and go, easy to implement due to having little functionality, and just as easy to discard for the next one.

I have dealt with a number of HPC programs on a regular basis that are old enough that they still refer to the input data as a "deck". They will never be completely rewritten and even small changes are few and far between because of the nightmare of re-validation. Unfortunately, because they started life as one-off research code, they are also fantastically sensitive to changes in the compiler due to accidentally depending on implementation quirks rather than the standards.

Maybe you overlook the longer-lived stuff because it's everywhere and we're used to seeing it. What about the operating systems, languages and utilities your computer is using? Your office applications? The major database management systems? What about the webserver software? What about commerce, banking, healthcare, MRP, ERP systems?

This has nothing to do specifically with scientists, this is more about the difference between code you write for your own use versus code you write for others to use. Scientists aren't the only people who write code for their own use!

Conversely, scientists often do write code that needs to be shared, sometimes among large groups. I used to work in experimental high energy physics, which typically has collaborations of hundreds or even thousands of people. Some of the software I worked on ...

The issues surrounding transitioning research S/W written by scientists into honest-to-goodness production systems are ones I'm very familiar with.

At my company, a lot of energy has been put into bridging the gap over the years with varying results. I believe that the root cause of the problem is that research S/W is not an end-product; typically for scientists the end-product is a research paper, white paper, proposal paper, etc., for which the S/W is only a tool for getting to the end-product. As soon as the experimental (or proof-of-concept) S/W returns the desired results, the software is considered "done".

In contrast, production S/W is often THE end-product for developers, so a lot more attention is given to robustness, re-usability, etc. All the standard thinking that you want to go into your production S/W.

One big issue for us is that the research S/W is almost always written in Matlab, while the production code is written in C++ and Java. The single largest source of bugs in our systems is porting S/W from Matlab to C++ or Java. (As an aside, please let's not talk about the Matlab 'compiler', nor Octave. -- we've already tried them both, and they're both performance hogs and also create SCM and CM nightmares).

We experimented with requiring that the research S/W be written in C++, but it was a disaster. The scientists couldn't get anything done, and the code was just awful. So, back to Matlab it was.

And, my experience is that people who I have a great deal of respect for, who I consider brilliant in their fields, holding PhDs, etc., have produced the crappiest Matlab code I've ever had the sorrow to read. My favorite instance was the use of local variable names like i, ii, iii, iiii, and so on within a single function of research S/W that was considered "done" (true story).

And, of course, little documentation as to the mechanics of the code. And believe me, it gets worse from there. Bear in mind that the code does indeed work for its particular purpose, and may well be ground-breaking in that particular research domain. But "done"? Ready for production? Not without a major porting effort (which is really a re-writing effort). The most mysterious thing to me, though, is that the scientists, for all their intellectual firepower, don't understand that it's a problem.

The solution we've converged on is to require our bizdev to be responsible for funding efforts to rewrite the research code and get it integrated into the product baseline. And the bizdev types can't proclaim a particular capability "done" (e.g., sell it to customers) until they've funded and executed those efforts. It took years of education to get to this point, but things are moving along much better than before.


Have you considered using Fortran? Matlab is much closer to Fortran than to C (arrays start at 1 instead of 0, to pick one rather annoying source of bugs), has plenty of libraries available (just like C), and the newer versions of the language standard are not the spaghetti-laden soup they used to be.

I've worked with maths/science types before to integrate their formulas into linux-compatible production code.

It's usually been pretty small formulas (a couple of screens full of code at most), and I don't understand most of it, being a programmer and not a maths guy. Also, we're a pretty small business, and these haven't been under any major deadlines (it fell more under R&D), so I had time to do this properly.

Most scientists (e.g. physicists, chemists, mathematicians, geo-*) solve their problems with formulas. Then they implement these formulas in a programming language, most likely C, Fortran, Algol 68 (not really), or Matlab. While programmers often also just code, software engineers try to design software and have to incorporate different aspects. This is even true when writing software for the sciences. However, the same applies to all the other fields we write software for.

The two things are different, and people who don't know any better equate them.

There's nothing wrong with being a programmer at all, but programming is a subset of Software Engineering.

It's akin to the difference, imho, between a construction worker and an architect. One can be a hack or a craftsman, but tends to have a smaller overall picture of the where/what/when/why behind decisions that often seem unimportant or superfluous. The other can be inco

I will readily admit I suck as a programmer. I'm not a programmer. But I would say I'm a pretty good software engineer. I can design software to accomplish a task. I've helped design the structure and implementation of software, but when it comes to solving the details I will readily admit that I'm an artist, not a developer.

Programming is a language. It helps when software engineers/architects can speak 'computer' but it's not necessary.

In my experience, most people who perform the role you describe aren't software engineers; they're technically-minded business analysts, good at extracting requirements and perhaps painting the broadest strokes of an architecture, but they need someone who really understands code to refine their architectures and turn them into good designs.

"All programmers are optimists. Perhaps this modern sorcery especially attracts those who believe in happy endings and fairy godmothers... But however the selection process works, the result is indisputable: 'This time it will surely run,' or 'I just found the last bug.' So the first false assumption that underlies the scheduling of systems programming is that all will go well, i.e., that each task will take only as long as it 'ought' to take. The pervasiveness of optimism among programmers deserves more than a flip analysis."

In all the (big-pharma) shops I worked at, I'd write and test the command-line number cruncher in-house (until my boss could get a paper or two out of it), then hand it to "two guys" who would slap on a stunningly restrictive (in terms of functionality) GUI (itself a third-party tool set based on Qt), and it'd sell just fine... no sweat [shrug]

"DIY fixers do a hack job of wiring the routers in their home basements to the computers in a second-floor bedroom. They drill a hole and take a cable clearly marked "indoor use only" outside the home, hanging in a lazy, lopsided catenary curve up to the bedroom window, then take it through the window into the house. The window sash does not close properly and allows bugs to get inside.

Professional electricians on the other hand use flexible drills to make nice access holes, wire the cab

I've experienced almost the same thing, but with engineers instead of scientists. I attributed the engineers' disdain for software quality to a different motivation: namely, the disparity in status and pay between engineers and programmers. I believe they felt that spending extra time on making the software readable, maintainable and all those other -ables was beneath them.

The proof of the pudding came when I happened to hire an inexperienced guy as a programmer. He was so smart that he soon learne

This problem is not just present in these two domains. You see this dichotomy elsewhere, specifically in IT.

I do IT in a scientific research-oriented organization, having taken over for previous staff members who were very much of the "IT should be done like research" school of thought. The result was that each problem was addressed quickly and without any consideration for the whole. Being as they were working with physical assets and not just software (though there was a lot of that, too), the end result

This is a really common problem in academia. So common, in fact, that one particular academician has come up with a special license, the Community Research and Academic Programming License (a.k.a. the CRAPL). It's worth a look and good for a chuckle:

The job of a scientist is to come up with new ideas and test them.
In that job, code is a tool, like a hammer or a mass spectrometer. If the tool works well enough for the job at hand, why on earth would you spend time making it work better? It is just crazy.
The other problem is that scientists are arrogant (so they think what works for them is OK for others) and non-scientists are stupid, as they expect scientists to do their work for them - in this case, production code.
sigh
It is not a scientist's job.

This is true, and starting to become a problem given the increasing expectation that as much as possible of our scientific work should now be open source. While this is great in theory, in practice, it means I (as a scientist) am under pressure to make my scientific "exoskeleton" code publicly available. I'm not qualified (and don't have the time) to polish it up into a product that is really suitable for distribution, and my employer doesn't have the funds to hire programmers to do this for every piece of

I wish it were as simple as this thread implies. The truth of the matter is that most commercial developers who are paid to worry about maintainability don't understand how to do it much better than their academic counterparts. Managers notice this and put all kinds of process in place to enforce good practice--requirements and design docs that are practically books, compile-time coding standard tests, smoke tests, regression test suites, automated tests and so on and on and on. These do not, however, turn

It's naive to consider either class of software as being sufficient, or either kind of programming to be superior. Like most problems, there is a strong management component: assigning resources to each at appropriate scales.

A computer scientist/software engineer delivered a well-phrased summation of half of this discussion during a 5-minute talk at a recent lightning software session at a science meeting. (Note that there are rarely science sessions at software meetings.) A domain scientist/software en

As pointed out by others, examples of the reverse can also be found in practice. However, I agree that this generalization holds for coding practices. I would also like to add another, related generalization: autodidact developers tend to code quick and dirty (with a lot of experience of how the actual code runs in daily practice, under heavy loads, etc.), while people with a heavy academic background (it also depends on the specific university) tend to code slower and cleaner.

I thought so. At least in the field of geo-science, it's the geologists who use full software suites to prepare interpretation data, while it's the geo-science engineers who use a half dozen obscure tiny programs specifically to crunch the numbers. If I had to make a wild guess, I'd think the same holds true for aerospace, automotive, and architectural design.

If I'm getting this right, scientists view software as nothing more than a specialized calculator.

I can certainly confirm that as an engineer, that's certainly how I view it. Almost literally in most cases.

But I have never deluded myself that our really complicated calculator programs would be appropriate for some sort of deliverable product for general use. In fact I repeatedly try to make that point and people still don't get it. And usually "Software engineers" expect a bunch of their absurd rituals to be followed anyway, even if it's never going to go anywhere or be used by anyone but me. So I would contend that the professional programmers and particularly the software process types are the ones who don't grasp the concept.

Brett

Here's a tool that may be helpful to you in explaining the issue to various people: "Technical Debt [martinfowler.com]". Your quick-n-dirty code that does what you need it to do carries a great deal of technical debt, but the nature of its limited use means the interest payments are small, so there's no value in paying down the debt. Start trying to turn it into a product to be used by many people, however, and you'll quickly find that the interest payments are unbearable, and that it costs a great deal of effort to pay off

You seem to forget that the basic tenet of "technical debt" is "debt". There's no debt if there's no expectation of having to repay it.

I didn't forget it, I explicitly stated it.

However, you're certainly right that this is an area where the analogy with financial debt breaks down. Another is the fact that quick-n-dirty code in the hands of the original author often appears to have no "interest" cost -- that person can use, modify and enhance the code with great speed and effectiveness.

I think it's worth looking under the covers of the analogy to see why it breaks down. The reason is that technical debt is really just an artifact of o

I'll say it again: there's no debt if there's no intention to repay it (or if, in fact, it is never repaid).

When you incur a debt you have to repay the principal and the interest. If the code does all that needs to be done, then there's absolutely nothing you will have to repay; it's not that the interest rate is low, but that there is no technical debt *at all*.

Maybe this is your experience, having come from working on applications that serve mom and pop shops, but don't assume that your experience is the same as everyone else's. Mine is the opposite of yours. Most applications are engineered for maintainability and very often, when compromises are made in shipping things out the door, it is often the function points that are left on the floor, rather than shipping function points backed by unmaintainable code.

"It definitely is functional but hardly has any of the features consumers demand."

Isn't this expected (or at least not surprising)? If I were a NASA engineer, I'd see the program as a tool to help me accomplish the larger task. The more time I spend on tools, the less time I spend on progress toward the larger objective. I'd write the program as quickly as I could and would not care about UI, functionality, usability or anything else; I built it for me to use. As long as the output satisfies my needs, I'd consider the task done and move on.

Not to mention that engineers/scientists writing their own little programs in math languages (Matlab is used a lot around here) or even straight higher level languages are usually very problem-oriented - a few lines of code for each single problem, without much re-use going on unless the problem to be solved is always the same one...

I actually quite like this approach in the simplified, FORTRAN-based Matlab - writing a new script is incredibly fast, because everything's kept so simple - but it's just too mu

I'm working on commercializing NASA software and this couldn't be more true. When talking to the inventor, they inevitably say "Oh yeah, the software is done, anyone can write code for this, should be easy to sell," even if it's coded in Fortran and has no GUI or documentation of any sort. It definitely is functional but hardly has any of the features consumers demand.

You know, PHP / Ruby / scripting-language-du-jour might solve many "Web 2.0" problems (jQuery, MooTools, and all the other JS libraries are seriously cool stuff), but there is a reason there is "still" a lot of scientific coding being done in Fortran (which continues to be developed, like most modern programming languages) and other "niche" languages. This is not the forum to educate you on that little but notable fact...

But really, I can't quite decide if your post is a troll or not, with lines like "even if it's coded in Fortran" and "has no Gui" and "hardly has any of the features consumers demand"...

We're discussing the ability to commercialize that software, not to simply use it.

Yes, it works. It may even work quite well. It's another thing to be able to use it outside the original group that created it and another thing entirely to turn it into something that can be used by someone who wants to use the functionality for a completely different task.

I've written code for tasks that I need to get done for myself. This means I can use any arcane method of entering data that I feel like, in any format I feel like. If I know exactly what I will be entering and in what format, I don't need data validation steps in the code. If I know that it won't hurt anything to just control-C out of the script, I won't bother creating a "Quit/Exit" function. None of those things are needed if you know your code and what it can and cannot do. The problem is that if you want to commercialize it, you will be selling it to people who do not know what it can do.
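To make the contrast concrete, here is a minimal sketch in Python; `parse_point` and its "x,y" input format are invented for illustration, not taken from any real product:

```python
# Hypothetical example of the point above: the same parsing job, written
# for the author versus written for strangers.

# Research-style: the author knows exactly what will be typed, so no
# validation is needed. Garbage input just crashes, and that's fine.
def parse_point_research(line):
    x, y = line.split(",")
    return float(x), float(y)

# Production-style: the same parse, but it has to survive a stranger's
# input and fail with a message they can act on, not a raw traceback.
def parse_point_production(line):
    parts = line.split(",")
    if len(parts) != 2:
        raise ValueError("expected 'x,y', got %r" % (line,))
    try:
        return float(parts[0]), float(parts[1])
    except ValueError:
        raise ValueError("non-numeric coordinate in %r" % (line,))
```

Multiply that by every input, every exit path, and every file format, and "the code is already written" starts to look optimistic.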

And yes, NASA can write code that can have commercial applications. Of course, perhaps not for the "consumer" that you are thinking of, but businesses and even other research groups can be consumers of a product. They are going to be made up of people who are not going to want to spend their time learning your arcane methods of doing things to make the program work for them. They will want an interface that can allow them to make use of the powerful capabilities of your software without having to either write it themselves or spend a huge amount of time learning it and its quirks.

FORTRAN is great, and I have used it myself for things. I think it is perhaps off-base to have used it as a slur against these programs. However, FORTRAN is a bit of a niche skill in this day and age. You're not going to be able to leverage the great majority of today's coder resources if you insist on using it for the code base, and it will make some things more difficult to build: perhaps not a slick GUI, but even something functional that makes the software easier to learn and work with.

I disagree. I think you miss the point: FORTRAN was and *IS* widely used for the tasks that it excels at. Other languages are *NOT* widely used for the tasks that FORTRAN excels at.

"In this day and age"?

This leads me to believe that you really don't know what type of things FORTRAN (and perhaps other "niche" languages) are used for, and why it makes sense to use them. As well, you seem to be under the impression that development on the language itself stopped years ago. In fact, work on new versions of FORTRAN has never stopped,

Then the scientist developed the code exactly to the point that brings him what he needs. That's good engineering practice, isn't it?

"It's another thing to be able to use it outside the original group that created it and another thing entirely to turn it into something that can be used by someone who wants to use the functionality for a completely different task."

The problem is not that it's their job to accomplish this; it's the scientists saying "but the code is already written, you don't have much work to do" when this is, in fact, not true. A further problem arises when other people listen to the scientist and develop unrealistic expectations of how long it will take for a sellable product to be ready.

I always like the Numerical Recipes quote: scientists solve next year's problem on last year's computer; computer programmers solve last year's problem on next year's computer.

I've lived on both sides of this divide but mainly on the scientific side. I become apoplectic with software engineers who just don't invest themselves in the science. They perpetually want a set of requirements. And they get upset if a new requirement is added later. I see software as a way to explore a space. Model it. Determine what more modeling is needed. You are constantly trying to do something that usually is beyond what is computationally possible so you have to figure out what approximation is going to work. What has to be done at full scale and what can be done at lower resolution. Mock up stuff.

The engineers who don't see it as a process are just impediments. Scientists want lots of simple things fast, then see what is working and add new simple extensions. They don't want to wait 4 months for delivered code based on specs that took 2 months to write.

You seem to overlook the fact that computer programmers usually have tight schedule and budget constraints enforced by their supervisor or other management they report to, instead of by the customer (the scientist in this case). Get a computer programmer a gig as sweet as a scientist who can take his or her sweet time to do their research and give the programmer that same open ended time frame with decent equipment and no schedule constraints and you would have a much happier, involved, and more responsive

They perpetually want a set of requirements. And they get upset if a new requirement is added later.

I agree that this is a good, if terse, summation of the basic conflict.

Poorly named, most "Computer Scientists" are NOT scientists. There is no application of the Scientific Method to solve unknown problems.
Instead, they are Software Engineers adapting known problems for solution by a versatile tool (the computer). No Science. Just Engineering.

I see software as a way to explore a space. Model it. Determine what more modeling is needed. You are constantly trying to do something that usually is beyond what is computationally possible so you have to figure out what approximation is going to work. What has to be done at full scale and what can be done at lower resolution. Mock up stuff.

This sounds like Science. Very indeterminate. And, not easily estimated. Business managers that employ Software Engineers demand estimates.
And schedules.

Upgrading Computer Scientists to Software Engineers is an equally large leap, and it does a disservice to other engineering fields.

The majority of the time, there is simply not enough rigour exercised in the design and development of most software. Many programmers are successful because of their particular personal flair - something closer to a craftsman or artisan.

We only get upset about people adding new requirements when we have a deadline to work to and adding the new requirement forces us to exceed our deadlines. We're the ones who get the blame, not you for adding a requirement and thus increasing the amount of work that needs to be done. If the deadline was automatically extended when new requirements were added we really wouldn't care.

And there's a world of difference between research code and production code; you don't want engineers to write your research

Agreed. When I wrote HPC code for physicists, I was appalled at the programs those people wrote. They wasted precious cycles doing pointless iterations and recalculating stuff. Don't bring a scientist to an engineer fight.
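A toy sketch of the kind of waste being described, in Python with made-up names (the real offenders were presumably Fortran or C):

```python
import math

# Recomputing a loop-invariant quantity on every iteration: the sort of
# "pointless recalculating" complained about above.
def normalize_wasteful(samples):
    out = []
    for s in samples:
        norm = math.sqrt(sum(x * x for x in samples))  # same value every pass
        out.append(s / norm)
    return out

# Hoisting it out of the loop: identical result, one computation instead
# of one per element.
def normalize_hoisted(samples):
    norm = math.sqrt(sum(x * x for x in samples))
    return [s / norm for s in samples]
```

`normalize_hoisted([3.0, 4.0])` returns `[0.6, 0.8]`; the wasteful version returns the same thing while doing n times the work.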

"I'm working on commercializing NASA software and this couldn't be more true."

I don't think it's just NASA. Wasn't it an IBM engineer who said: X time for a valid prototype; 10*X for an in-house product; 100*X for a commercial packaged application?

Given that what a scientist does for her calculations stops at the "valid prototype" stage (why on earth would she be interested in anything else?), you can do the math on how much it will cost to put it on a shelf, so to speak.

Programming *is* a science - one that deals with the abstract. The scientists referred to in the article deal with the physical. I'm a software engineer. When people ask me what I do, I say that I design, architect and build things that are not real. The sad thing is that most of it is for companies to find better ways of extracting money from people.