Now that you've read Scott's essay, why do you think software sucks? Is software doomed to mostly suck just like everything else? Is it an inherent problem (à la the Anatomy of Insanity)? Is it because we just don't care enough as consumers, users, managers, developers, testers, etc.? Is it because we don't actually know how to build great software? Is it because we're building software using crappy methodologies, tools, languages, etc.? Is it because our notions of how things work are just plain off? Are we distracted by the shiny bits, missing the crustiness? Are we focused on the right problems to solve? Have we become addicted to mediocrity? Or what?

Where do we draw the line between "sucks" and "good enough"?

How do various asymmetries play out? For example, are we buying based on hope and then crying about reality (or whining about reality not living up to our expectations)? How does our warped sense of time come into play (e.g., grossly overestimating what we can do in the short term while horribly underestimating what we can do in the long term)? What about our hyperfocus on short-term returns while missing long-term costs? Or the fact that we tend to notice only one or two direct consequences while missing or ignoring not only the unintended consequences but also all of the indirect implications?

> Well, I totally agree that the cat food problem exists. The pet [sorry] example is the medical industry (at least in the US).
>
> However, for software, how would the cat food problem explain, e.g., desktop software? The cat food problem certainly doesn't (completely) explain the fact that MS Office utterly dominates the market for an office suite (even amongst purely home users).

I think it's a monopoly thing tangled with network effects. And, by the way, isn't it hard to even find a non-business desktop software market compared to what we had fifteen years ago? Aside from games, it seems that, well, there was a boom at one point and now there isn't nearly as much as there used to be. I don't have any numbers, but that's my sense of it.

'Cat food' does seem to be the issue in a lot of business-related software. Open source and customer-facing web apps, neither of which has the 'cat food' problem, do seem to suck less.

I agree, though, that much of the consumer-facing desktop software that still exists has problems. It may be getting by on novices and a general sense of lowered expectations we've become habituated to.

read through it. didn't find the reason i've seen too often (i'm currently in one of them, alas): code managers think of a codebase as an infinite revenue source, never to be retired. the author might be assuming that only newly created software counts on the suckage meter. if so, too bad. most software being written is maintenance/update; not making it suck is likely of more importance.

as one wag said of java, corporate code managers love it because it allows for continual accretion of dreck. COBOL is loved for quite the same reason.

30- and 40-year-old codebases still exist. why don't they get replaced? mostly because they serve small enough niches that no one wants to make the effort to build a modern version. thus, lots of old incompetent software keeps on sucking. and users (the cats) get to eat the food.

odd how so much behaviour can be explained by referencing Machiavelli.

I have a dual background in electrical engineering and computing; the products I am involved in developing often include both software and hardware. The most obvious difference is that in hardware you buy a chip and you get a data sheet that explains what it does and goes into detail about its limitations, and the chip conforms to standards. Therefore I can mix chips from different manufacturers and assemble a good system.

With software you seem to operate at a much lower level, hand-coding classes rather than assembling classes into applications. When you find a class that might be of use you don't get a data sheet (nothing like the detail of a hardware data sheet), and in general it won't interoperate with anyone else's classes. You can't, in general, take a C# class and mix it with a Java class.
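The "data sheet" gap described above can be sketched as an interface whose contract (range, accuracy, error behaviour) is spelled out up front, so that any conforming implementation can be swapped in like a pin-compatible chip. This is only a minimal illustrative sketch; the sensor names and figures below are invented, not taken from any real component:

```python
# Rough software analogue of a hardware data sheet: an abstract
# interface with an explicitly documented contract. All names and
# figures here are hypothetical, for illustration only.

from abc import ABC, abstractmethod


class SensorError(Exception):
    """Raised when a reading falls outside the documented range."""


class TemperatureSensor(ABC):
    """'Data sheet' for any temperature sensor component.

    Operating range: -40 to 125 C. Accuracy: +/- 0.5 C.
    read_celsius() must raise SensorError outside that range.
    """

    @abstractmethod
    def read_celsius(self) -> float:
        ...


# Any vendor's implementation that honours the documented contract
# can be dropped in without the assembler reading its source code.
class FakeSensor(TemperatureSensor):
    def read_celsius(self) -> float:
        return 21.5
```

The point of the sketch is that the consumer programs against the documented contract, not the implementation, which is roughly what a hardware engineer gets from a data sheet.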

I think the reason for this difference between hardware and software is practicality and the resulting business model. There is no way you can manufacture your own chips, so you have to buy them. With software you think you can do better than the next person (we all do!), and the opportunity exists to try because it is at least feasible to write your own class. Therefore people roll their own rather than buying in, and this kills the business model where people will pay for software components. The same goes for the big boys: when MS wanted a new language they didn't build on Java, they wrote C#. This is what everyone in the industry does; they write something they claim is better than the previous system but is in reality largely the same. If they had built on previous experience and remained compatible with it, rather than rewriting, we would have had better software.

There has been some progress in software: people have moved from assembly language to high-level languages, from unstructured code to OO, people now do more unit testing, and people have moved from ad hoc memory management to garbage collection. All of this helps, and I guess with time people will move more and more to standard libraries, particularly as languages include large libraries as standard.

> I think it's a monopoly thing tangled with network effects. And, by the way, isn't it hard to even find a non-business desktop software market compared to what we had fifteen years ago? Aside from games, it seems that, well, there was a boom at one point and now there isn't nearly as much as there used to be. I don't have any numbers, but that's my sense of it.

Yes, the MS monopoly shadow is quite large.

Another interesting example is the weird market in Mac OS X applications. Look at all of the third-party applications that have been absorbed/copied into the Apple offerings.

> 'Cat food' does seem to be the issue in a lot of business-related software. Open source and customer-facing web apps, neither of which has the 'cat food' problem, do seem to suck less.

Do you really think that there is a truly statistical correlation? I.e., I'll concur that there are some examples which clearly suck less. However, as categories, I don't know that I'll go that far -- I've pawed through way too much of those kinds of software that sucks just as badly as, if not worse than, an awful lot of other software categories.

> I agree, though, that much of the consumer-facing desktop software that still exists has problems. It may be getting by on novices and a general sense of lowered expectations we've become habituated to.

Ah yes, we've certainly become habituated to the mediocrity. Do you think that we've actually become addicted to it?

> read through it. didn't find the reason i've seen too often (i'm currently in one of them, alas): code managers think of a codebase as an infinite revenue source, never to be retired.

Indeed.

Is this part and parcel with the notion that programmers are commodities?

Also, is there some clear way to determine when it's economically justified to build a new version rather than continuing to tweak the old version? How does that account for major issues like the risk involved in rewrites as opposed to aggressive refactoring approaches?
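One way to frame the rewrite-versus-tweak question above is as a back-of-envelope net-present-value comparison, which also dovetails with the sunk-cost point made further down the thread. This is only a toy sketch: the discount rate, cost figures, and growth assumptions are all invented for illustration, and a real analysis would have to risk-adjust the rewrite cost far more carefully:

```python
# Hypothetical sketch: does a risk-adjusted rewrite beat continued
# maintenance of an aging codebase? All figures are invented.

def npv(cashflows, rate):
    """Net present value of a list of yearly costs (year 0 first)."""
    return sum(c / (1 + rate) ** t for t, c in enumerate(cashflows))

YEARS = 10
RATE = 0.08  # assumed discount rate

# Option A: keep tweaking the old system; assume maintenance cost
# grows 15% per year as the code decays.
maintain = [100 * 1.15 ** t for t in range(YEARS)]

# Option B: pay a large (risk-adjusted) rewrite cost up front, then
# assume cheaper, flat upkeep afterward.
rewrite = [600] + [40] * (YEARS - 1)

print(f"maintain NPV: {npv(maintain, RATE):.0f}")
print(f"rewrite NPV:  {npv(rewrite, RATE):.0f}")
```

Under these particular (made-up) numbers the rewrite comes out cheaper over ten years, but note how sensitive the answer is to the growth rate and the rewrite estimate -- which is exactly where the rewrite-risk question bites.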

> Cem Kaner once mentioned a theory of software suckiness.
>
> He said that a lot of software has the cat food problem:
>
> The quality of cat food is bad because the consumers aren't the buyers.

I think this is a dodgy analogy. The problem is that it absolves both the developers and the customers of any responsibility and heaps the blame on a mysterious third party, "the buyer". Indeed, looking at the various answers, it is apparent that we are in denial that we are delivering a lot of bad software.

A much better analogy (in my opinion) is that the cat food is bad because the producers aren't the consumers.

Software that we produce for ourselves (either individually or as a community) does exactly what we want, or we change it. Consumers don't have that luxury; they rely on us to produce and update the software. Blaming the consumer or "buyer" (or even Microsoft, as one response appears to do) for being on the receiving end of bad software we deliver just doesn't stack up.

well, i did my degree in Economics, and learned about the first day: sunk costs are irrelevant to decision making.

i've always supposed that, because it doesn't look physical and the marginal cost of production is near zero, code managers (those who "own" that 30 year old codebase) prefer to ignore this. most MBAs do (the Econ grad students and the MBA students didn't get along <G>). it's just too easy to burn another CD. they wish to "leverage the software investment". thus we see "web services" wrapping mainframe sequential file batch programs. no, i'm not kidding. the quintessential way to make sucky software.

a homo economicus decision weighs future revenue against cost of investment. my sense is that the hoarders of these old codebases don't think they can do it again in a modern way; there are no second acts in life. they learned program development in a 60s or 70s context, and by gum that's how things should be. well, our competitors have GUI/database systems. i know, let's put an applet front-end on it. now we'll LOOK modern. that's the ticket.

even Windoze itself carries this kind of baggage: QDOS was a control program, which gave coders access to the hardware. early MS-DOS made its living on the ability of games to run really fast because of this. to call it an operating system was a stretch. that legacy continues to this day.

eventually, someone does it right from scratch. rarely is this the owner of the existing codebase; it's just too easy to burn another CD. and anyway, a decent software salesman can sell anything if it has a GUI. <G>

so far as refactoring goes: in the cases i'm talking about, these are MF COBOL suites. not sure how one could refactor batch VSAM routines into event-driven java. or if one should.

corporate software is about where the American auto makers were in the mid 1960s. the car folk were intent on "leveraging" their 40s and 50s "technology" in the 60s and 70s. then the Japanese and Europeans, whose technology had been bombed to non-existence and replaced, woodsheded 'em. US auto still hasn't recovered.

there's lots of software from the 70s and 80s which is batch and file oriented. and a push to web-ify and gui-fy it all. so the code managers put lipstick on the pig. kind of like deciding to put bigger tail fins on, rather than building a better car.

summary: sucky (non-retail, corporate) code persists because much of it exists in islands of monopoly, code architected according to 60s and 70s paradigms, now being prettified, but that's all. cat food blended by bean counters, who may be making a rational (if short term) decision.

In addition to all these excellent reasons, another one I see is a lack of respect for or ignorance about the degree of difficulty represented by software engineering. I see way too many developers believing writing software apps is easy - simply because writing code and making it "work" is easy. Unfortunately, this type of code writing has little to do with the task of software app creation and maintenance.

So, developers write easy code, get it working, and think they've done an adequate job. The result is huge messes.

In terms of Unix, Apple has taken a page from Microsoft's "embrace and extend" (and hijack :-) methodology. I.e., while there's a lot of benefit to moving to a BSD-ish layer over Mach in terms of being able to run most Unix programs with not too much pain, Apple's fanatical control and completely proprietary software on top doesn't show any assimilation of Apple by the Unix world, IMO.

Also, recall that Jobs instigated the switch way back when with NeXT after being pushed out of Apple when the suits took over.

> I think this is a dodgy analogy. The problem is that it absolves both the developers and the customers of any responsibility and heaps the blame on a mysterious third party, "the buyer". Indeed, looking at the various answers, it is apparent that we are in denial that we are delivering a lot of bad software.

Well, if someone takes that as being the only problem then, sure, I'd agree with you. However, I don't think anybody is saying that that's the only problem. The cat food problem certainly contributes to the suckage, though.

> A much better analogy (in my opinion) is that the cat food is bad because the producers aren't the consumers.

That's just a degenerate case of the cat food problem.

The crux is the fact that the feedback between the producers and consumers is dysfunctional (for whatever reasons).

Making the producers and consumers identical doesn't really solve the problem, it just gets rid of some of the hurdles. For example, look at all of the wars that different developers have over such trivial issues like formatting style, tools, languages, etc.

To be clear, I think that software suckage is a (set of) choices. For a lot of reasons, we as individuals, organizations, communities, industries, etc. choose to support, make, use, buy, condone, incite, etc. sucky software.