There is a constant refrain that occurs whenever people try to achieve anything

There must be an easier way

We learn this lesson at an early age and never forget it. The toy problems we are “challenged” with while learning always have an easy solution. Sometimes the easy solution is non-obvious and hard to find, but there is always a trick that makes solving the problem easy.

Unfortunately the world does not work this way — but we want to be tricked into thinking that it does.

Some examples:

Finding the one food that will help the pounds melt away

A pill that will cure all diseases

The invisible hand of the market

Buying a CASE tool to improve code quality

Adopting Extreme Programming

Thinking that Requirements Traceability makes systems better

Whether we think of these as “Silver Bullets” or a “Technological Fix”, it seems that we are hardwired to seek out simple solutions. In part this could be because we are so good at pattern recognition that we see a pattern where none exists.

All of this makes progress in software development difficult, because collectively we don’t want to believe how hard it is to deliver reliable systems. There has to be an easier way …

Calculators became affordable in the mid- to late-1970s. Students in the 1980s were taught by teachers who had learned mathematics without calculators, and could do basic mental arithmetic. Students today might be taught by a teacher who is himself unable to work out 37+16 without help. The consequences are neatly described in an “Alex” cartoon I have on my fridge about a proposal to ban the use of calculators in school. “Faced with homework which requires him to work out simple sums in his head today’s lazy seven-year-old will instinctively turn to the quick and easy method of arriving at the answer… i.e. asking his dad, who, embarrassingly also wouldn’t have a clue without a calculator.”

Implications of this could be interesting for software development. When there is a large part of the workforce unable to do simple calculations without the use of a “Guessing Box” I expect there will be a lot more errors in software. Or at least errors that can be attributed to the Garbage In, Garbage Out problem of the users (and developers) not having the basic skills to detect implausible answers from systems.

Not sure what it is about the magazine, but it seems to be incapable of reporting the implications of actions. A stunning example of this comes from their Babbage Blog reporting on the delays in the acceptance of the reports that CO2 is warming the planet…

Erring on the side of extra caution is not a bad idea, and various efforts are underway to develop, corroborate and better to underpin the work on temperature records that has been done to date.

The displacement of the idea that facts and evidence matter by the idea that everything boils down to subjective interests and perspectives is – second only to American political campaigns – the most prominent and pernicious manifestation of anti-intellectualism in our time.
– Larry Laudan, Science and Relativism (1990)

For some years I’ve been troubled by an apparent decline in the standards of intellectual rigor in certain precincts of the American academic humanities. But I’m a mere physicist: if I find myself unable to make head or tail of jouissance and différance, perhaps that just reflects my own inadequacy.

So, to test the prevailing intellectual standards, I decided to try a modest (though admittedly uncontrolled) experiment: Would a leading North American journal of cultural studies – whose editorial collective includes such luminaries as Fredric Jameson and Andrew Ross – publish an article liberally salted with nonsense if (a) it sounded good and (b) it flattered the editors’ ideological preconceptions?

The overall result of the experiment was that the parody article was published as if it were a valid work of scholarship in the field.

Hoax or Exposé?

Sometimes it is not enough to just question something, sometimes you have to go further. Yes, Sokal’s experiment is often labelled a hoax, but my take is that it was an exposé of many things that are wrong with our current social and political discourse.

Soundbites cannot communicate nuances of ideas

For whatever reason, few people take the time to really find out what is going on in the world, being content to be fed a soundbite by a politician or demagogue. Since I have the CO2 level shown in the sidebar, a good soundbite to use as an example is “CO2 is Plant Food”. Yes, CO2 is required by photosynthesis in plants, but the role of CO2 is much, much more complicated than that.

Television and Radio news rely on soundbites, and as such are destroying public discourse about important matters that as a society we need to deal with. And yes, I know that TV and Radio news have some value, but that value needs to be considered in the light of what it also does to our understanding of science, technology, economics and the political choices facing us in the 21st Century.

There’s a cost to put in place the necessary controls and practices, the checks and balances, and to build the team’s focus and commitment and discipline, and keep this up over time. To build the right culture, the right skills, and the right level of oversight. And there’s a cost to saying no to the customer: to cutting back on features, or asking for more time upfront, or delaying a release because of technical risks.

He goes on to say …

That’s why I am concerned by right-sounding technical demands for zero bug tolerance. This isn’t a technical decision that can be made by developers or testers or project managers…or consultants. It’s bigger than all of them. It’s not just a technical decision – it’s also a business decision.

Alistair Cockburn has pointed this out in his Crystal family of methodologies: the cost of an error varies across different kinds of applications, and an appropriate methodology takes this into account.

Life and Safety Critical - things like pacemakers and flight control software

Mission Critical - systems that the business depends on, if the system fails you make the evening news (for example a stock exchange)

Departmental applications - yes the business depends on it, but a certain amount of downtime can be accepted

Comfort - yes, twitter and facebook, I’m looking at you

Processes and verification procedures that are appropriate for a mission critical application would be detrimental to a departmental time booking application. A good test suite would be appropriate for both, but only the mission critical application needs a formal review of the test code to make sure it is covering the appropriate cases.

This is one of the reasons why I support Jim’s idea of being intolerant of the Zero Bug Tolerance mantra.

Wider Intolerance

Some ideas just fail the laugh test, but they are afforded too much deference by people who should know better. The Rugged Software Manifesto fails the laugh test as well.

After reading about the attempts to brand Alberta’s Tar Sands as “Ethical Oil”, maybe it is time to start laughing at politicians that espouse crazy ideas as well.

“I really would urge you to be grossly intolerant,” he said. “We should not tolerate what is potentially something that can seriously undermine our ability to address important problems.”

Beddington also had harsh words for journalists who treat the opinions of non-scientist commentators as being equivalent to the opinions of what he called “properly trained, properly assessed” scientists. “The media see the discussions about really important scientific events as if it’s a bloody football match. It is ridiculous.”

High-performance batteries are very expensive and need to be replaced after typically 700 charges. Here is a simple way to calculate the numbers. The computer battery for my laptop (on which I am writing this) stores 60 watt-hours of electric energy. It can be recharged about 700 times. That means it will deliver a total of 42,000 watt-hours, or 42 kilowatt-hours, before it has to be replaced for $130.
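Muller’s arithmetic in the quote above is easy to check in a few lines. This sketch just reproduces his stated figures (60 Wh capacity, 700 cycles, $130 replacement) and derives the total delivered energy and the implied storage cost per kilowatt-hour:

```python
# Sanity-check the quoted laptop-battery calculation.
# All input figures come from the quote above.
capacity_wh = 60         # watt-hours stored per full charge
cycles = 700             # recharge cycles before replacement
replacement_cost = 130   # USD for a new battery

lifetime_kwh = capacity_wh * cycles / 1000   # total energy delivered over the battery's life
cost_per_kwh = replacement_cost / lifetime_kwh

print(lifetime_kwh)   # 42.0 kWh, matching the quoted figure
print(round(cost_per_kwh, 2))  # roughly 3.1 USD per delivered kWh of storage
```

That implied cost of about $3 per kilowatt-hour delivered is the number that makes high-performance batteries look expensive compared with grid electricity at a few cents per kilowatt-hour.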

Muller implies in this post that a car would need to replace the battery after only 700 cycles, but although this might have been the case at the time, car battery suppliers are continually extending the life of a battery.

In the world of batteries, what does “end of life” really mean? According to the industry, end of life is defined as that point in time when a battery has lost 20% of its original energy storage capacity or 25% of its peak power capacity. This implies that an EV battery, with an initial range of 100 miles per charge, will reach its end of life when, years later, it only delivers 80 miles per charge.

That time is likely to be reached only after the battery has carried an electric car about 200,000 miles or 2,000 cycles. But that’s not really the end for an EV battery – it’s just the beginning of a second life that not many people know about.
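The quoted end-of-life figures can be checked the same way. This sketch uses the numbers from the quote (100-mile initial range, end of life at 80% of original capacity, roughly 2,000 cycles) and makes the simplifying assumption of one full charge per cycle at the initial range:

```python
# Rough check of the quoted EV battery end-of-life figures.
initial_range_miles = 100    # range per charge when the battery is new
eol_capacity_fraction = 0.80  # industry end-of-life threshold (20% capacity loss)
cycles = 2000                 # quoted cycle life

eol_range_miles = initial_range_miles * eol_capacity_fraction

# Simplifying assumption: each cycle delivers the initial full-charge range.
lifetime_miles = cycles * initial_range_miles

print(eol_range_miles)   # 80.0, matching the quoted 80-mile end-of-life range
print(lifetime_miles)    # 200000, matching the quoted ~200,000 miles
```

Even with this crude one-charge-per-cycle assumption, the numbers line up with the quote, and the contrast with the 700-cycle laptop battery in Muller’s calculation is nearly a factor of three in cycle life.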

All this goes to show that Muller’s expertise in a branch of physics does not necessarily extend to expertise in economics, engineering and battery chemistry. Muller had tried to dismiss the Tesla roadster before it was released by claiming that it would cost too much and the batteries would be too heavy (Confusing Future Presidents part 1). But the Tesla was launched successfully as a niche sports car. Now the GM Volt and Nissan Leaf are also proving that it is possible to build electric cars.

Parallels between Climate Change and Software Development

Things change slowly and then suddenly things are different

An extra 2ppm of CO2 might not seem much, but over a working lifetime, an extra 60-70ppm has changed lots of things. In software development Moore’s Law was slowly making computers faster and cheaper. Suddenly things changed when we realized that hardware was no longer an expensive item in projects. When I started software development, the developer costs on a project were swamped by the cost of the hardware. Now using cloud machines, the hardware costs of a project can be less than the cost of gas to commute to the office.

What we are sure we know is not necessarily true

In climate change, the distinction between weather and worldwide climate is not well understood. Also it is hard to figure out how a 2C change could matter that much when locally the weather can range over 60C from the depths of winter to the height of summer. In software development, historically it really mattered that the code was as efficient as possible because the machines were slow. So “everyone knows” that scripting languages are not useful for large scale development and that enterprise systems must be built in traditional languages. Yet most web companies are using some form of a scripting language, or one of the newer functional languages.

Fear, Uncertainty and Doubt

Software development probably led on this one. IBM was justifiably famous for FUD to ensure that customers followed IBM’s lead and did what they were told. IBM had the market and customers were supposed to do what IBM told them was good for them. With Climate Change, large organizations that will have to change to preserve the climate that is suitable for our current civilization are spreading as much doubt as possible to delay the realization that business as usual is no longer possible. In Software Development the threat of Open Source is currently the target of lots of FUD, and large corporations that are seeing changes on the horizon are doing all they can to preserve their business as usual.

Nobody wants to listen to the people who know what is going on

Software Developers are used to being overruled by people who do not really know what is going on. Sure sometimes it is genuinely a business decision, but often the business people are busy making technical decisions that they are not competent to make. In Climate Change, the scientists doing the work are being challenged by the political decision makers who do not have a clue about science. Realistically the political decision makers should accept the science and then start talking about the political choices that we need to make as a result of that science.

The results really matter

There are two things that our civilization depends on: working software and a livable, stable climate. The news is occasionally enlivened by the story of a major software project that cost hundreds of millions of dollars yet failed to deliver the expected benefits. The smaller day to day losses from smaller failures are hidden. Similarly the big storms that are increasing in frequency and severity due to climate change are making headlines, but the smaller impacts are never visible in the news.

Making sense of the parallels

Still working on this one, but I have a sense that the political power structures have a big impact. The techno geeks that do software or science are not part of the larger political power structures that govern our corporate dominated societies. As such the techno geeks are marginalized and can be safely ignored … at least for now … obligatory link to Tim Bray - Doing it Wrong.

If you wait until it’s obvious, it’s too late… has become my de facto motto when it comes to a lot of things. I think the first time it occurred to me was when I started researching IPv6. The depletion of IPv4 is no surprise, and hasn’t been for quite a while, but it seems like most people are holding off even researching it until it becomes obvious that they need it. Again, by that time, if you’re in any kind of competitive company vying for market position, it’ll be too late. It won’t be obvious that it’s necessary until you see people financially punished for not taking those steps.

Lots of applicability to this idea, I need to ponder on it for a while. There are lots of changes coming, but which ones are going to matter, and which ones do we need to take action on? Typically what happens is that larger corporations take longer to ponder on the new ideas, and then get hit hard by the market.

Fuel efficient cars are a good example of this. The profit margin for trucks and SUVs was so high that it did not make sense to switch to smaller-engined cars, even though Peak Oil was going to push up the price of gasoline and Climate Change was eventually going to push the price of emitting CO2 higher. Back in 2008 when the price of oil spiked for a few months, truck dealers where I live practically could not give the trucks away. Even with discounts that were most of the previous profit margin, the trucks were not moving off the dealers’ lots. Yes, the price of oil dropped again and most people tried to convince themselves that it was just speculators, but the car companies seem slowly to be waking up to the issue that fuel economy matters. The only problem is that the companies that held on to the idea of profitable trucks and SUVs the longest are being punished by the market now, since the lead time for getting a new design into the hands of the dealers is much longer than the time that companies have before the CDN$1.00/liter price of gasoline seems like the good old days.

Lots of applicability of this to software development as well, but that is for a later post.