Introduction

In this article, I will present the process of developing a product, including:

creating the initial product backlog

creating functional tests

creating an automated build

creating unit tests

I've decided to use a GUI Windows application as the product because this is one of the cases where it is usually hard to obtain full unit testing coverage. The application is based on the .NET Framework, but the process presented here can be used for any kind of application, regardless of whether it is Win32, .NET, Java, or HTML.

Why Unit Test?

But why have unit tests at all?

There are developers who don't like testing. Some of them don't like it because they are "developers", not "testers"; others don't like it because it is boring; others because they "know" that their defect rate is low, so they consider tests a waste of time.

I see myself as a very good developer and I think that my defect rate is very low, yet I've made many childish mistakes in my code and I always feel the need for unit testing and functional testing. Humans make mistakes. Mistakes can be made even in the unit testing code, but the probability of making the same mistake in the unit tests, in the functional tests and in the production code is very low, so it is worth having the tests. And also remember one of the biggest advantages of a good unit test: it detects the errors introduced in the maintenance phase of the product.

You write the test once, you check it carefully until you are 100% sure that it really tests your functionality and detects unwanted changes, and then you can stop worrying about that functionality because you are sure that regardless of whatever changes you make in your product, that functionality still works as you expect.

"Testing First" vs "Testing After"

There are many testing strategies; here I want to discuss two of them:

Testing first (known also as test driven development)

Testing after

I have some doubts about calling these two "common" strategies.

Testing first is considered an "Agile" test strategy, and I know many companies which claim that their development process is test driven, but in reality none that I know of has real test driven development. Why is this, and what does "testing first" really imply?

Well, testing first implies many things, but all of them are focused on the core:

The developer has a clear and complete understanding of the final shape of the developed functionality before starting to develop.

Waterfall vs Iterative

How can this be? Is it waterfall? (The waterfall development process is a strategy of deciding from the beginning all the evolution steps, in contrast with the iterative development process, which develops a small core, then goes back and checks if it is good, then adds small functionality to that core and goes back again to see if it is good, and so on, until all is done.)

Waterfall is not agile; iterative is agile. If testing first is waterfall, how can it be agile? The answer is "testing first is not waterfall". But why? And why is waterfall not agile? And why is waterfall not good?

Waterfall is not agile because it is hard (or, more correctly, almost impossible) to decide from the beginning every step of a product's evolution. Waterfall may work well for a small product, but it is often a road to disaster when used in big projects. This has been proven over past years by many companies which used this approach.

It is easy to find out whether a company has a waterfall development process. It is enough to ask the developers if they stay late at work, especially before deliveries. If the answer is yes, then it is highly probable that they are either using the waterfall approach or have no process at all.

Be Pragmatic

There are managers who demand the use of a test driven development process because they heard that it is good. Their developers will say that all development is test driven, but in fact none of it is (except when that manager is in the room). I know of a case where a manager asked for the test method to be written first, calling the not-yet-existing production method, because "this is the way". There are some development languages in which that may be needed (for example in PHP, where it is possible to call a wrong method), but in many it is not.

Before choosing a way to do things, it is important to analyze whether it is the proper way. This is part of pragmatic programming and therefore is on the agile highway.

My advice to developers is as follows:

If you create unit tests, do it because you feel the need, not because you heard that it is good. You must first understand why it is good and how it can help you. Otherwise it is better not to have unit tests at all, because an incorrect unit test will give a false sense of security.

Don't reject tests. Testing will help you raise the quality of the product. Treat the test project as a production project, and it won't be boring.

Choose test first when it is appropriate (usually when you know exactly what the final functionality is).

If you don't know exactly what the final functionality is, use spikes; don't go directly into production. Using spikes (small projects outside of your production project with the scope of understanding a given functionality) will help you understand what you need to do. Spikes will save you time because you concentrate your mind on understanding, not on producing high quality output.

Don't, I repeat, do not start to develop something before having a good understanding of what you need to do.

Even if it is better to use the test first strategy (as it proves that you've understood what you need to do), do not get stuck on this. It is important to be pragmatic and understand when it is suitable to write tests first and when to write them later.

Step by Step Development Process - Creating a GUI Application

I'll present here the development process for a small product - a GUI application.

The product will contain the functional requirements, the functional test project, the production project, the associated unit test project and the automated build project.

Creating the Product Backlog

Discussing with the product owner, you will be able to agree on an initial product backlog. The product backlog is a table containing a set of user stories with their priorities and estimations.

A user story is a functionality visible to any product user (including the product owner). User stories usually do not contain terms used in programming; a user story must be understandable by any user of your product. An example of a user story is: "When clicking on the browse button, a browse dialog will be shown". This user story can be understood by most computer users: there is a browse button somewhere on which you can click, and clicking it will open a new dialog. After you agree on this user story, you can split it further:

Where should the browse button be?

Can the user always click on it, or is it sometimes disabled?

If it is sometimes disabled, when?

Will the browse dialog open in the center of the screen, or somewhere else?

Will the browse dialog be shown in the taskbar?

The strategy here is to go from one question to another until you have some basic user stories.

Don't confuse this product backlog with waterfall. It is good to collect as much information as you can before starting work. Collecting information and deciding on a component layout is not waterfall. When initially discussing with product owners, I clearly state that what is discussed is not the final shape of the product but its first visible aspect. Therefore I clearly state that completing what we discuss doesn't imply that the product is done. We create the core of the product and we keep adding functionalities to it until the product owner says that it is enough. Keeping the product evolution visible is mandatory for the success of the product.

Example of User Stories on Initial Product Backlog

Here are the user stories from the product backlog of the application which is presented in this article as a sample:

Create clock graphical object

has a circular form

has a 3D border

has a nice background

has analog look

shows hours and minutes

can be moved with the mouse

can be kept on top of other windows using a button on its tool bar

moving mouse over the clock shows tool bar

moving mouse outside clock and tool bar hides the tool bar

hour and minute can be changed using a configuration dialog

border color can be changed using a configuration dialog

background color can be changed using a configuration dialog

settings are auto-saved and persistent

clock can be closed by pressing a button on its tool bar

configuration dialog can be opened using a button on its tool bar

Create clock tool bar

has a button for closing the clock

has a button for keeping the clock on top

has a button for opening configuration dialog

Create clock configuration dialog

can change the current system time

can change the clock border color

can change the clock background color

It can be noticed that the terms used in these user stories are computer related (because they are user stories of a software product) but are not programming related.

Choose the Best Iteration Length

When you first discuss the initial product backlog with the product owner, you should decide on the optimal iteration length. An iteration (we call it a sprint) is an interval of time in which the development team (including QA) will work on a given set of user stories. Before the iteration starts, the development team (including QA) will select the user stories to be done (100%) in that iteration (according to their priorities), and when the iteration ends you will present the finished user stories to the product owner.

With this approach, the product owner will know at any time what the state of the product is and will be able to correct deviations.

The iteration length must be long enough to give the development team time to finish the selected user stories, but short enough to minimize the costs of changes requested by the product owner.

Write Functional Tests

The next stage in the development process is to write the functional tests for each user story. Having functional tests proves that the application satisfies the requirements.

Usually the functional tests are written and implemented by QA members. They will take each user story and try to imagine the simplest way to test it.

Example:

User story: has a circular form

Functional test: move a window form under the tested application and check that:

only a circle is covering the moved form

it is possible to click on the moved form anywhere outside of the tested application's circle.

So with only two simple tests, it can be proved that the tested application indeed has a circular form.

Write Development Stories for the First Sprint

The user stories are not independent; there are interactions and dependencies between them. Because of these dependencies, it is hard for programmers working in a team to develop the user stories directly. Before each iteration, the development team should choose the user stories which can be done and convert them into development stories.

The development stories are the user stories grouped or split according to their interactions and dependencies. These kinds of stories are not mentioned in Scrum books, but from my experience they are useful.

Usually there are 3 development stories which are common to all projects:

Create the initial product backlog

Write the functional tests scenarios

Set-up automatic build

These development stories have no match among the user stories, but the time spent completing them must be added to the total time required by the project.

The last 3 development stories depend on the "create clock tool bar" story, so a developer must first finish that one before starting on them. The rest of the stories are independent and can be developed at the same time by different developers.

Create Solution and Configure Automatic Build

The first thing to do after creating the production project "DesktopClockWidget" is to create its unit test project, "DesktopClockWidgetTests". I intend to use NUnit as the unit testing framework, so I've added references in the test project to:

nunit.core

nunit.core.interfaces

nunit.core.tests

nunit.framework

Also, I've added a reference to the production project:

DesktopClockWidget

I need to check that the automatic build and NUnit are properly set up, and the easiest way of checking this is to create a test class with one or two test methods that throw a NotImplementedException. I will create this class before setting up the build because this way I will have a checkpoint all the time (and yes, it is a test driven way).

At the sprint meeting, I chose to implement the "settings are auto-saved and persistent" development story, so I will create the test class for this story.

For simplicity, I will keep the clock settings in an XML file. Since I intend to have only one clock on the desktop, I won't implement an engine for having unique settings for each application.
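For illustration, such a settings file might look like this (a sketch; the tag names are assumptions based on the settings discussed in this article):

```xml
<ClockState>
  <Left>100</Left>
  <Top>100</Top>
  <BorderColor>Black</BorderColor>
  <BackgroundColor>White</BackgroundColor>
</ClockState>
```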

So, because I want to keep it simple, I will have one settings XML file and a singleton class as a layer over the settings. I will call this singleton class XmlClockState.
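A minimal sketch of such a singleton layer might look as follows (the member names besides XmlClockState are my own illustrative choices, not taken from the original project):

```csharp
using System;

// Minimal sketch of a singleton settings layer over the XML file.
// Member names other than XmlClockState are illustrative assumptions.
public sealed class XmlClockState
{
    private static readonly XmlClockState instance = new XmlClockState();

    private XmlClockState()
    {
        // A full implementation would load and parse ClockState.xml here,
        // throwing a meaningful exception when an expected tag is missing.
    }

    public static XmlClockState Instance
    {
        get { return instance; }
    }

    // Settings exposed as simple properties; a full implementation would
    // read them from the parsed XML and write changes back to the file,
    // keeping the settings persistent.
    public int Left { get; set; }
    public int Top { get; set; }
}
```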

Now I have enough information to create the unit testing class XmlClockStateTests. The first settings which I will need to save are "left" and "top". I will fill the test class XmlClockStateTests with empty test methods (throwing NotImplementedException) for left and top:
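The class might start out like this (a sketch; the method names are my choice):

```csharp
using System;
using NUnit.Framework;

// Empty test methods for the "left" and "top" settings; they throw
// NotImplementedException so they fail until the real tests are written.
// This class also serves as the checkpoint for the automatic build set-up.
[TestFixture]
public class XmlClockStateTests
{
    [Test]
    public void TestLeft()
    {
        throw new NotImplementedException();
    }

    [Test]
    public void TestTop()
    {
        throw new NotImplementedException();
    }
}
```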

In the automatic build configuration, the following values are used:

http://build-server/ccnet/server/local/project/DesktopClockWidget/ViewProjectReport.aspx is the URL where the reports for DesktopClockWidget can be found (on your local intranet)

svn-server is the name of the server where svn repositories are located.

https://svn-server/svn/Projects/DesktopClockWidget is the URL where the tested project sources are located. (you can find many tutorials on the internet on how to set-up the svn server with apache)

D:\Projecte\Programe\DesktopClockWidget is the location on the disk where svn sources will be copied for automated build. (location is on build-server)

C:\Program Files\SlikSvn\bin\svn.exe is the location of the svn client (on build-server)

C:\WINDOWS\Microsoft.NET\Framework\v3.5\MSBuild.exe is the location of MSBuild (I use the 3.5 framework because the project is created with Visual Studio 2008 Express Edition)
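Putting these values together, the project block in ccnet.config might look roughly like this (a sketch under the assumption that the solution file is named DesktopClockWidget.sln; trigger settings and build arguments are illustrative):

```xml
<cruisecontrol>
  <project name="DesktopClockWidget">
    <webURL>http://build-server/ccnet/server/local/project/DesktopClockWidget/ViewProjectReport.aspx</webURL>
    <sourcecontrol type="svn">
      <trunkUrl>https://svn-server/svn/Projects/DesktopClockWidget</trunkUrl>
      <workingDirectory>D:\Projecte\Programe\DesktopClockWidget</workingDirectory>
      <executable>C:\Program Files\SlikSvn\bin\svn.exe</executable>
    </sourcecontrol>
    <tasks>
      <msbuild>
        <executable>C:\WINDOWS\Microsoft.NET\Framework\v3.5\MSBuild.exe</executable>
        <workingDirectory>D:\Projecte\Programe\DesktopClockWidget</workingDirectory>
        <projectFile>DesktopClockWidget.sln</projectFile>
        <buildArgs>/p:Configuration=Release</buildArgs>
      </msbuild>
    </tasks>
  </project>
</cruisecontrol>
```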

Now that I've started ccnet.exe, I can check if the automatic build is working. I access the http://build-server/ccnet/server/local/project/DesktopClockWidget/ViewProjectReport.aspx link and I should see the CruiseControl dashboard page with the DesktopClockWidget project.

Create the GUI Test Project

As I've said before, this is a GUI Windows application. It will open message boxes and show dialogs. I need to interact with these windows, validate them and close them. I can do this using GUI Automated Tests.

Open GUI Automated Tests and create a new project (Project -> Create New). I will name this project "DesktopClockWidgetGuiTests".

At this stage, I have everything set and I can start the development process.

Creating Test Case for an Exception which Displays an (win32) Error Message Box

There are many scenarios in which an unusual context must be handled. The professional approach is "garbage in, clean out". This means that regardless of the system context, the application must handle its input data and give a meaningful response that will help the users continue their work (and eventually will help you fix the bugs).

Here I will present the unlikely scenario in which an invalid configuration file is found and an expected XML tag is missing. There are many ways of handling this. For this application, we've decided to display an error message dialog.

The unit test method for this scenario:

overwrites "ClockState.xml" with a valid XML which doesn't contain the "Top" tag

starts the GuiTestCaseRunner and runs the test case "Validate missing Top tag error" asynchronously

queries the top value from XmlClockState (this will throw the exception and display the error message dialog)

waits for the GuiTestCaseRunner to finish its tests

validates that the GuiTestCaseRunner has finished without errors.
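The steps above can be sketched as follows. This is only an illustration: the GuiTestCaseRunner API shown here (RunAsync, WaitForFinish, FinishedWithoutErrors), the exception type, and the XmlClockState member names are my assumptions, not the actual project code.

```csharp
using System;
using System.IO;
using NUnit.Framework;

[TestFixture]
public class MissingTopTagTests
{
    [Test]
    public void TestMissingTopTag()
    {
        // Overwrite ClockState.xml with a valid XML missing the "Top" tag.
        File.WriteAllText("ClockState.xml",
            "<ClockState><Left>100</Left></ClockState>");

        // Start the GUI test case runner asynchronously; it will validate
        // and close the error message dialog. (Hypothetical API.)
        GuiTestCaseRunner runner = new GuiTestCaseRunner();
        runner.RunAsync("Validate missing Top tag error");

        // Query the top value; this throws and displays the error dialog.
        try
        {
            int top = XmlClockState.Instance.Top;
        }
        catch (Exception)
        {
            // Expected: the missing tag is reported. The exact exception
            // type is an assumption.
        }

        // Wait for the runner and validate that it finished cleanly.
        runner.WaitForFinish();
        Assert.IsTrue(runner.FinishedWithoutErrors);
    }
}
```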

After writing the unit testing method, you must write the GUI test case. Start the GuiAutomatedTests application and create a new test case with the name "Validate missing Top tag error". In the new test case, add a wait script with an interval of 1000 milliseconds (this will give the error message dialog time to appear). The next step is to comment out the lines which are using the GuiTestCaseRunner, build the test project and use NUnit to run TestMissingTopTag. This will show the error message box dialog, allowing you to validate it:

For the first operation, give a meaningful name to the capture, for example "validate missing top tag error message", and start the capture. Perform a left click inside the message dialog (not on the Ok button, so that you can select the dialog properties, like text, size and style) and then stop the capture. The scenario containing the click action will be added in GuiAutomatedTests.

Now, select the click action, use the "Add test case" button from the right panel, select the error message dialog from the properties form, check the option "Add selected tests also for any child of the window", and choose "break" in all the property combos. At this moment, the GUI test will validate the error message dialog properties.

The next step is to close the error dialog. Start a new capture with a meaningful name, for example "close top tag error message", and record the click on the "Ok" button. After stopping the capture, the "close ..." scenario will appear in the left panel actions list. Since this is a clean-up scenario, we need it to be executed even if previous scenarios fail. Remove the scenario from the left panel actions list and add it in the "Clean-up actions" box. Save the GUI Automated Tests project and close the application.

The last step is to validate the test. Uncomment the lines which are using the GuiTestCaseRunner, build the test project and use NUnit to run TestMissingTopTag. If all the steps were performed correctly, the test should pass. After checking that the test has passed, go into the production code and change the error message. Build the solution again and run the test with NUnit. If the test fails, it is proof that it is valid. Restore the error message and commit the change.