A History of Build Systems

In my younger days, before I knew any better, many projects I worked on compiled and published their software manually.
You'd type cc, then copy these bits over there, then zip that directory and post it somewhere else.
Eventually, we figured out we could write little scripts to automate all the tedious bits and make the process less fragile
and more repeatable.

One day, I discovered the discipline of daily builds and tools like make and my life got a whole lot better.
Make gave us, in Elizabeth's handy phrase, "a place to put things."

In Influence: The Psychology of Persuasion, Robert Cialdini describes various techniques for making people do things that, if they were thinking clearly, they would otherwise not do because of lethargy, laziness, or because it would offend their better judgment.

One of those techniques is The Commitment Principle which was used on American POWs to great effect by the Chinese during the Korean War.

"I have just generated unit tests for some code that would have taken months or years to do manually, and I did it in under 30 minutes, including registering on your server and waiting for the reply email.

That is truly awesome!"

Getting this kind of feedback is the fun part about having free (as in beer) software up on the web where anyone can try it out. In this case, Nick had a great experience with JUnitFactory and let us know with a post to our forum.

I'm very excited that with two weeks remaining before the conference we've reached the 100 mark for registrations for CITCON Sydney. This is our fourth CITCON event but our first in Australia, so it is great to see so many people registered. We'll take up to 150 so if you're interested in attending, it's not too late to register. The conference will be July 27th and 28th, Friday night and then all day Saturday.

Btw, if you're not familiar with the Open Space conference format you might want to read this description. If you're still not sure if this format or event is for you, you might want to see what people have posted about past CITCONs on the web, or read the feedback from Dallas earlier this year.

Tomorrow night, I will be giving a talk at the Association of C and C++ Users in Silicon Valley entitled "To Catch a Bug, You Have to Think Like a Bug". This is a new and improved version of a talk I gave at SD West, so if you didn't get to go there, you get another opportunity to check it out. I hope to see some Agitators there. You can find out more of the details at the ACCU's website.

I am going to be attending sessions at JavaOne this week and would be happy to meet with any Agitators or testing enthusiasts at the conference. According to Sun's Event Connect tool, I can paste this code so you can link to me in their event tool to set up a meeting.

Many language platforms, like Ruby and Microsoft .NET, ship with a unit testing framework as part of the platform. Why not include JUnit in the Java Platform, or at least include it in the JDK? Code quality is a constant sore spot for commercial applications, so it seems like making the tools that contribute to higher quality more widely available will encourage better code. While we're at it, let's put in a code coverage tool as well, so we can see how well we're testing. We already have some profiling and management tools built in, so this seems like a missing piece of the puzzle.

I'd like to discuss this idea, and concerns around improving code quality with developer testing in general.

“I am ready to write some unit tests. What code coverage should I aim for?”

The great master replied:

“Don’t worry about coverage, just write some good tests.”

The programmer smiled, bowed, and left.

...

Later that day, a second programmer asked the same question.

The great master pointed at a pot of boiling water and said:

“How many grains of rice should I put in that pot?”

The programmer, looking puzzled, replied:

“How can I possibly tell you? It depends on how many people you need to feed, how hungry they are, what other food you are serving, how much rice you have available, and so on.”

“Exactly,” said the great master.

The second programmer smiled, bowed, and left.

...

Toward the end of the day, a third programmer came and asked the same question about code coverage.

“Eighty percent and no less!” replied the master in a stern voice, pounding his fist on the table.

The third programmer smiled, bowed, and left.

...

After this last reply, a young apprentice approached the great master:

“Great master, today I overheard you answer the same question about code coverage with three different answers. Why?”

The great master stood up from his chair:

“Come get some fresh tea with me and let’s talk about it.”

After they filled their cups with steaming hot green tea, the great master began to answer:

“The first programmer is new and just getting started with testing. Right now he has a lot of code and no tests. He has a long way to go; focusing on code coverage at this time would be depressing and quite useless. He’s better off just getting used to writing and running some tests. He can worry about coverage later.”

“The second programmer, on the other hand, is quite experienced both at programming and testing. When I replied by asking her how many grains of rice I should put in a pot, I helped her realize that the amount of testing necessary depends on a number of factors, and she knows those factors better than I do – it’s her code after all. There is no single, simple answer, and she’s smart enough to handle the truth and work with that.”

“I see,” said the young apprentice, “but if there is no single simple answer, then why did you answer the third programmer ‘Eighty percent and no less’?”

The great master laughed so hard and loud that his belly, evidence that he drank more than just green tea, flopped up and down.

“The third programmer wants only simple answers – even when there are no simple answers … and then does not follow them anyway.”

The young apprentice and the grizzled great master finished drinking their tea in contemplative silence.

In May 2006, an ill-prepared international expedition to the Himalayas lost its way. After two weeks of wandering around, hungry, thirsty, and smelling like inexperienced expeditioners who got lost for two weeks, they stumbled upon the entrance to an ancient cave.

Once inside, they saw a maze of ancient, and messy, cubicles. Each cubicle had a wooden desk, an ergonomically correct bamboo chair, a Dilbert™ calendar, and a strange computer-like mechanical device. In one corner of the office they found barrels of dark liquid, which they later identified as an early example of a carbonated and highly caffeinated drink, and a ping-pong table. They realized that the cave was an ancient software start-up. The oldest one on record. Older even than Netscape.

Among the many things they discovered inside the cave was a note left by one of the programmers. The expedition’s guide, while not very good at guiding, knew how to read the ancient language and translated the note for them:

We have finished the release ahead of schedule – again. All the tests pass, so we are taking the rest of the week off. We are going sailing. Since it’s a team-building exercise, we hope we can get reimbursed for it.

The explorers looked at each other in astonishment. Not only had they discovered the oldest software start-up in history, they had also discovered a team of programmers who, apparently, completed their code ahead of schedule ... on a regular basis!

What was the secret of these ancient programmers?

And what had happened to them?

The expeditioners searched each cubicle for clues and found two mysterious booklets. One of them was called “Learn To Sail In 30 Minutes”, which explained the fate of the programmers. You are holding in your hands a translation of the other booklet: “The Way of Testivus”.

Who wrote this mysterious booklet? What is Testivus? Only Google™ knows for sure.

Is the content of this text responsible for these ancient programmers being able to complete projects ahead of schedule?

We can’t be sure, but we believe that the amazing prowess of these programmers was probably due to a combination of the Testivus philosophy, and the consumption of large amounts of the dark caffeinated liquid found in the cave.

I hosted a JUnit Factory presentation a few days ago (you can watch it online if you missed it the first time around) and spent a fair amount of time talking about the Triangle sample in the JUnit Factory demo.

Sometimes an aphorism like “good development depends on good communication” doesn’t really sink in until it hits you upside the head. I experienced the reality behind this particular maxim recently when I expanded the number of developers on an open source project from one developer, myself, to two.

Tomorrow morning, I'll be giving a talk at SD West 2007 on developer testing. It is a very opinionated look at how to test your code. It should be fun and useful. If any Agitators or other test aficionados are going to SD West, it would be great to see you at the talk, or afterwards as well.

Before I move on to spares and strikes, it would be nice to see how the score sheet looks. In an earlier post, I claimed that one of the reasons for integrating the UI early is to make sure the domain model will satisfy the requirements of the user interface. Let's see if it does.

In my previous blog entry, I posted a set of acceptance tests for the first few stories. It's time to start writing the code to pass those tests. I prefer to discover the design through TDD rather than code directly to the customer-facing tests.

I have often written acceptance tests for code that has not yet been written (in fact, I wrote an article about it) but I have never written tests that will work with any number of implementations, each with their own architecture. I don't even know how to go about it, but that never stopped me before...

"Scoring a game of bowling" is probably the most common application used when demoing TDD. It's so commonly known among the JUnit crowd that I chose one of Bob Martin's efforts as a demo for JUnit Factory.

The topic comes up about once a year on the TDD mailing list and it just came up again. By an odd coincidence, we just celebrated the completion of a new release of AgitarOne with a trip to Homestead Lanes, so I am all fired up about bowling despite my dismal performance (there was beer involved).
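For readers who haven't tried the kata themselves, the core of the bowling-scoring exercise can be sketched like this. The class and method names here are my own, not Bob Martin's exact demo code:

```java
// Sketch of the classic bowling-scoring kata; names are illustrative,
// not Bob Martin's exact demo.
public class BowlingGame {
    private final int[] rolls = new int[21]; // at most 21 rolls in a game
    private int current = 0;

    public void roll(int pins) {
        rolls[current++] = pins;
    }

    public int score() {
        int score = 0;
        int i = 0; // index of the first roll of the current frame
        for (int frame = 0; frame < 10; frame++) {
            if (rolls[i] == 10) {                       // strike: 10 + next two rolls
                score += 10 + rolls[i + 1] + rolls[i + 2];
                i += 1;
            } else if (rolls[i] + rolls[i + 1] == 10) { // spare: 10 + next roll
                score += 10 + rolls[i + 2];
                i += 2;
            } else {                                    // open frame
                score += rolls[i] + rolls[i + 1];
                i += 2;
            }
        }
        return score;
    }

    public static void main(String[] args) {
        BowlingGame perfect = new BowlingGame();
        for (int r = 0; r < 12; r++) perfect.roll(10); // twelve strikes
        System.out.println(perfect.score()); // prints 300
    }
}
```

The appeal for TDD demos is exactly the spare-and-strike lookahead: the open-frame case falls out of the first few tests, and the bonus rules force the design to evolve.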

We are proud to announce that registration is now open for the next Continuous Integration and Testing Conference, CITCON Dallas on April 27th & 28th. Space is limited to the first 100 registrants and attendance is free. Registration is on-line and while it is open until April 13th we do expect to fill all the available slots, so sign-up soon to reserve your spot.

The conference will be following the same Open Spaces (or unconference) format as the 2006 CITCONs in Chicago and London. These prior CITCONs drew enthusiastic practitioners at all levels of experience, all looking to share what they knew and to learn what they could. We're looking for more of the same in Dallas.

Please help spread the word about CITCON and we look forward to seeing you there!

"The older schools gradually disappear. In part their disappearance is
caused by their members’ conversion to the new paradigm. But there are
always some men who cling to one or another of the older views, and they
are simply read out of the profession, which thereafter ignores their work."

In computing, there is no mechanism for reading such men out of the profession. I
suspect they mainly become managers of software development.

I suspect a large number of the adoption problems for developer testing are in organizations where the old boy at the helm is clinging to an outmoded paradigm of software development. Perhaps those guys would listen to Floyd -- (Robert, not Pink.)

CITCON London registration will be closing this week; it is just hard to know whether we will hit the deadline of Friday, September 22nd or the cap of 120 people first! At last count we had 85 people signed up, and if history is a guide the remaining spots will go fast.

For those who aren't aware, CITCON is the Continuous Integration and Testing Conference, a free open spaces (or unconference) event on continuous integration and the testing that goes with it.

Given the open spaces format, it is impossible to predict what the exact session topics will be, but a sampling of the topics (and notes) from the CITCON Chicago event from earlier this year is available on the wiki. Also available are some photos, feedback, links to related blog entries and more... More than enough to be convinced that you should sign up today!

I just wrote a page on our internal wiki with our policy for dealing with build failures. We thought others might find it interesting so I am sharing it here (the links will be broken for obvious reasons).

Executive Summary

Test-Driven Development is often introduced through simple examples, but many developers would rather dive into the deep end. This free webinar is for those people. J. B. Rainsberger, author of "JUnit Recipes," will show you architecture and design strategies to make it easier to "test-drive" J2EE components. You'll learn how to build a J2EE application while following the cardinal rule of Test-Driven Development: Never write a line of production code unless somewhere, a test has failed.

The recorded webinar is up on the Agitar website linked from this page. Registration is required but if you'd rather not register try out bugmenot.com.

There are many reasons why you might not want to fix a failing test right away. Maybe it's an acceptance test for a feature that you haven't written yet. Maybe it's a regression that it's just not practical to fix right now.

It's like what Whitehead said about notation:
"By relieving the brain of all unnecessary work, a good notation
sets it free to concentrate on more advanced problems, and in
effect increases the mental power of the race."

... but that doesn't do it justice. Maybe better is to quote Kevin's reaction:

Outstanding post, Brian. I always wondered what the little star was for in GMail. Now I know. Your post has a little gold one next to it.

Next week Agitar is hosting Ted Husted of the Struts development team (and iBATIS and MyFaces and Jakarta-Commons) in a webinar on What to Do if Your Code Has Few, If Any, Tests?. I'm curious to hear what Ted has to say, but his talk illustrates the kind of trade-offs that exist as our company grows. On the one hand we can have interesting speakers and topics like this, but on the other hand we're now scheduling the webinars for a global audience, so the scheduled times (7 am and 5 pm PDT) probably work better in just about every other time zone than this one. (</whine>)

1. Test your own code as well as you can.
2. Next time someone checks in a slap-your-forehead bug, show them the test that would have caught the bug if only they had run the tests.
3. Show them how to run the tests themselves.
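Step 2 can be as small as a few lines. Here is a sketch of the kind of tiny test you might show a teammate after such a bug; the discount logic and its expected value are invented for illustration:

```java
// Invented example: a tiny test a teammate can run in seconds.
// The spec: a 10 percent discount on 200 leaves 180. A classic
// slap-your-forehead bug (treating percent as a fraction) fails this.
public class DiscountTest {
    static double applyDiscount(double price, double percent) {
        return price - price * percent / 100.0;
    }

    public static void main(String[] args) {
        double result = applyDiscount(200.0, 10.0);
        if (result != 180.0) {
            throw new AssertionError("expected 180.0 but was " + result);
        }
        System.out.println("discount test passed");
    }
}
```

The point is not the test's sophistication; it is that the teammate sees, concretely, that running it before checking in would have saved the embarrassment.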

It won't be long until everyone will want to have tests.

Most developers will not make the investment until they have seen proven returns. The management challenge is to find the early adopter with the courage and vision to take that first step.

That suggests that a specification should not be written to a consistent level of precision. Precision is needed only where disputes have already occurred or are likely.

Translating that into developer testing, I immediately thought: "That suggests that unit tests should not be written to a consistent level of precision. Precision is needed only where disputes (or confusion or bugs) have already occurred or are likely."
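One way that plays out in practice is two assertions on the same value at different precisions. The loan-payment example below is invented for illustration: a loose plausibility check where nothing has ever gone wrong, and a tight check where a bug already occurred:

```java
// Invented loan-payment example showing two levels of assertion precision.
public class PrecisionDemo {
    // Standard annuity formula: payment = P * r / (1 - (1 + r)^-n)
    static double monthlyPayment(double principal, double annualRate, int months) {
        double r = annualRate / 12.0;
        return principal * r / (1.0 - Math.pow(1.0 + r, -months));
    }

    public static void main(String[] args) {
        double payment = monthlyPayment(10000.0, 0.06, 12);

        // Loose: no dispute or bug here yet, so just check plausibility.
        if (!(payment > 800.0 && payment < 900.0)) {
            throw new AssertionError("implausible payment: " + payment);
        }

        // Precise: imagine a rounding bug once bit us here, so pin it down.
        if (Math.abs(payment - 860.66) > 0.05) {
            throw new AssertionError("expected ~860.66 but was " + payment);
        }
        System.out.println("both assertions passed");
    }
}
```

The loose assertion stays cheap to maintain; the precise one documents exactly where trouble has already occurred.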

I have these rules that I use for unit tests, primarily because I encounter so many teams that start writing end-to-end tests, call them unit tests, and give up because "testing takes too long". To me, a test is not a unit test if:

It talks to a database.

It communicates across a network.

It touches the file system.

It can't run at the same time as any of your other unit tests.

You have to do special things to your environment to run it (like editing configuration files).

Tests that do those things are okay, but to me they aren't unit tests, and they should be segregated from true unit tests.
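One lightweight way to keep that segregation honest is a naming convention for test classes. The "SlowTest" suffix below is an assumption for this sketch, not a standard:

```java
// Illustrative naming convention for segregating true unit tests from
// tests that break the rules above. The "SlowTest" suffix is an
// assumption for this sketch, not a standard.
import java.util.List;
import java.util.stream.Collectors;

public class TestSorter {
    static boolean isUnitTest(String className) {
        return className.endsWith("Test") && !className.endsWith("SlowTest");
    }

    public static void main(String[] args) {
        List<String> all = List.of("ParserTest", "OrderDaoSlowTest", "ScoreTest");

        // Run these on every build...
        List<String> unit = all.stream()
                               .filter(TestSorter::isUnitTest)
                               .collect(Collectors.toList());
        // ...and run the rest in a separate, slower suite.
        System.out.println(unit); // prints [ParserTest, ScoreTest]
    }
}
```

Build tools can then point the fast suite at the every-commit build and the slow suite at a nightly run, so the "testing takes too long" complaint never gets a chance to take root.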

Jun 15, 2005


Open Quality Data in the Annual Report?
Kent Beck told me that he showed the Agitar Management Dashboards at a public company where he was consulting about software development process. Then he asked them what the impact would be on their software development organization if they had to track such data for their key software applications, and then publish it on their public website. And if they also had to refer to the data in their annual report and explain any changes in trends.

On Thursday, May 26, Jon Udell interviewed Alberto Savoia. Alberto had just been selected by InfoWorld as one of the 25 top CTOs in the US in 2005. Jon asks Alberto about his vision for Agitator, and Alberto demos Agitator extensively.

You can see the replay now if you don't mind registering. If you do, come back in a few days and we'll point you to an unguarded version.

Apr 18, 2005


Sustaining Legacy Code
Last month I visited some customers and prospects in India. One of them was an outsourced provider of software engineering services. Sure enough, some projects they get are to maintain and sustain an existing body of code. "So how does Agitator help?" they ask.

Needs and Wants
Recently, while chatting with Alberto, the phrase "what the user needs and wants" came up in our talk. That got me thinking a bit: what's the difference? It struck me finally that that's the core difference between a market success and a failure.

The Developer Testing Burden
I started at Agitar (Engineering Management) in August - but have been associated with the team at Agitar for the last several years. In my past life at SunTest (Sun Labs), I was responsible for creating several Java testing technologies (primarily focused on load testing) - and being a geek at heart - the "toughness" of the testing problem has always kept me charged up about solving it.

The Feng Shui of Developer Seating
Over the past year we've had several different variations on our seating, with the common theme of an open workspace. Originally there was an 8-desk circle, which in time was supplemented with some cube space. Then we broke off a separate team for The XPeriment and added a separate 'pod' of 4 tables. With several minor variations, all these seating options worked well and I felt pretty safe thumbing my nose at those people who advocate offices with doors that close as the best environment for development teams.

TDD and Agitation
I have been doing Test Driven Development (TDD) for about four years now and agitating for a little less (it's my second anniversary as an Agitator today), and I have thought a lot about how to marry the two testing styles. A discussion on the TDD mailing list today (http://groups.yahoo.com/group/testdrivendevelopment/) finally gave me a name for what I have been doing for a while. The discussion centered around the relative merits of TDD versus Design by Contract (DbC), and a surprising - surprising to me, anyway - number of people said that the two are complementary and that they do both. That's exactly what I have been doing without realizing it.

Testing HTML Pages
I started this article intending to talk about a technique we developed for testing Velocity templates but realized that there was enough background material for a separate article on testing HTML. So, this entry describes how we developed a harness for checking the output from the Management Dashboard. A second entry will talk about how we adapted the harness for testing Velocity templates.

Mar 8, 2004


Fight Complexity with Complexity
We were using method and class and package names as keys to the various data structures that store test results and coverage data. Agitator told us that if you pass the string "123%%^*abc" to a method that expects a class name, it throws an IllegalArgumentException. "Well, duh!" we said and marked it expected. We added a factory to generate a variety of good and bad class names and got on with the task at hand.

Why Is Software So Hard To Test?
Start a conversation with any developer about unit testing and, before long, he'll tell you that automated testing is a fine thing in principle but that his code is too hard to test because ....


Continuous Integration, Continuous Agitation
With automated tests, like money in the bank, the joy should be in the using more than the having. But unlike money, you can use the same test again and again, and we've found that being a spendthrift with our CPU cycles is the best way to get the feedback we need to drive our product quality.