Sifmole has asked for the
wisdom of the Perl Monks concerning the following question:

So, I have managed to convince ( it didn't take much, because they are smart ) my immediate managers that adding unit test creation to the development cycle is a good thing.

They do want to know an estimate though of how much such an addition would affect the development time for tasks. For instance, without writing tests a developer could complete a certain task in 5 days; how long would it likely take for them to complete the task if writing unit tests was included in the task requirements.

I understand that the benefits are more confidence in the quality of the resulting code, a quicker QA cycle ( potentially, or maybe just a more effective QA cycle ), ease of future maintenance ( because you can regression test more easily, for instance ), etc, etc.

But what is the change to the development time cycle? Are there any experiences or articles you can relate/refer me to? I guess a great answer for the next step up would be if it could be summarized as kind of an average percent change....

Also, consider that the dev group is experienced in coding, smart, resourceful -- but not used to unit testing. How would the percentage change as they got more experienced with including unit tests in their development process?

I ask here because I know there are several monks who are big proponents of this, and I hope have experience to relate; and we are talking Perl for at least the first group to do this.

thanks in advance.

Adding Unit tests to the development cycle

Testing is not about saving time in the short-term. Testing takes time and learning to write good tests takes time. I'd estimate that on my current project we spend anywhere from 25% to 50% of our time writing and maintaining automated tests. I find claims that you'll cut time on small projects by writing tests to be unsupportable. Larger projects may show gains as the complexity of a job mounts, but I don't have any hard data to show you.

One benefit of testing you didn't mention is increased flexibility. There are a whole class of changes that you simply won't do without a complete regression suite to back you up. Major refactoring is basically unthinkable without tests. That means that code without tests ages faster and has to be rewritten more often, a task with an enormous price-tag for a complex system.
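As a minimal sketch of what that safety net looks like in Perl ( both implementations and all names here are invented for illustration ): the tests pin down the observable behavior, so the code underneath can be rewritten freely and the refactor is accepted the moment the suite is green again.

```perl
use strict;
use warnings;
use Test::More tests => 2;

# Original and refactored implementations of a hypothetical routine.
# The regression tests don't care which one is in use -- only that
# the behavior is unchanged.
sub total_v1 { my $t = 0; $t += $_ for @_; return $t }
sub total_v2 { my $t = 0; $t += shift while @_; return $t }    # refactored

for my $impl ( \&total_v1, \&total_v2 ) {
    is( $impl->( 1, 2, 3 ), 6, 'sums a list of numbers' );
}
```

Without tests like these, swapping `total_v1` for `total_v2` is a leap of faith; with them, it is routine.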

I think your best bet is to give it a try and see how it goes. Tell your boss you think it'll add a small amount of time to the project but that the pay-off will be well worth the wait. If your boss is smart he'll understand why you can't be more specific till you try it!

I find claims that you'll cut time on small projects by writing tests to be unsupportable. Larger projects may show gains as the complexity of a job mounts, but I don't have any hard data to show you.

As a counter point I find that doing test-first testing does cut the time - even for smaller projects. The time I spend writing tests is more than made up by the time I save by not debugging broken code.

Of course I too have no hard and fast numbers - just personal experience :-)

leocharre is correct in that tests reduce development time in the long run, but they also reduce development time in the short term. By writing a test, you figure out how the code you're about to write is intended to be used. That will help focus your thoughts.

Basically, something that takes 5 days without tests will generally take 3 days with tests, including the time to write tests.
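A minimal Test::More sketch of that idea ( the function, its name, and its behavior are all hypothetical ): the assertions are written first, which forces decisions about arguments, return values, and edge cases before any implementation exists.

```perl
use strict;
use warnings;
use Test::More tests => 3;

# Hypothetical function under test. The three tests below were
# sketched first; writing them settled the interface questions:
# does it accept a dollar sign? what does it return on garbage?
sub parse_price {
    my ($str) = @_;
    return undef unless defined $str && $str =~ /^\$?(\d+(?:\.\d{1,2})?)$/;
    return 0 + $1;
}

is( parse_price('$19.99'), 19.99, 'leading dollar sign is accepted' );
is( parse_price('5'),      5,     'bare integer is accepted' );
is( parse_price('abc'),    undef, 'garbage returns undef' );
```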

My criteria for good software:

Does it work?

Can someone else come in, make a change, and be reasonably certain no bugs were introduced?

What else are they doing besides writing code that takes 5 days? Are they doing the design, documentation, informal tests to see if it works?

The other question to ask is: how thoroughly tested is the code? Devel::Cover can tell you how much of your code has been exercised by your tests. Testing really obscure conditions might cost more time than it is worth, but coverage of 85% of your code might be considered a reasonable sign that the test suite is thorough enough.
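To make the coverage point concrete, here is a hedged sketch ( the routine is invented ): tests that exercise only the common paths leave the error branch untouched, which is exactly the kind of gap a Devel::Cover report would flag, and exactly the kind of gap an 85% target lets you consciously accept or close.

```perl
use strict;
use warnings;
use Test::More tests => 2;

# A hypothetical routine with an obscure error branch.
sub safe_divide {
    my ( $num, $den ) = @_;
    die "division by zero\n" if $den == 0;    # the "obscure" branch
    return $num / $den;
}

is( safe_divide( 10, 2 ), 5, 'ordinary division works' );
is( safe_divide( 9,  3 ), 3, 'another ordinary case' );

# Running this file under Devel::Cover, e.g.
#   perl -MDevel::Cover this_test.t && cover
# would report the die() branch as never taken.
```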

For me, I have a ratio of about 2.5 hours of testing to 1 hour of programming. However, I am not spending time writing simple tests that I will not keep. A large part of my tests are written while I am designing the interface. So there is a lot of intermingling of design and tests with actual coding.

Coding is easy; testing code takes time because you need to verify the results and check to see where it is breaking. Programmers do test, but most don't follow a formal system in which their tests are documented and repeatable. So don't be surprised if your coder is really spending 1 day in design, 1 day coding, 1 day documenting, and 2 days testing.

Have you explained to them that in the long run, this cuts down on development time?

I spend time testing, debugging, retesting, re-debugging - and would say my new projects take 4x as long as they could take.

I am under no pressure to do it one way or another. Personally I am trying to save myself time. I can start up a project, make it look finished in x time. I know it will be OK for a while. I just know that in 365 days, someone will come complaining that something blew up somewhere. No way. This makes a company look like crap, for one. It's embarrassing, and it can be an unholy mess.

If your stuff is for more than one client, spend the time, please. We know you can't predict exactly how long some of this stuff will take. We also know they want to hear a number. Will you tell them the honest truth or a pretty lie? I think the pretty lie will ultimately cause the most harm. What do the people you work with think about this? They are the best people to ask. And you have to trust their judgement and honesty.

Assuming that they are testing already, just not doing automated unit tests, then from experience I'd say the breakdown would be roughly ( different code and people change the numbers, but this is my best guess ):

20% increase in time to start with (mainly due to having to write the tests as code, getting comfortable with the testing API etc. ... testing framework stuff).

20% decrease in time longer term (mainly due to automated tests finding bugs quicker and things like instead of having to spend a long time wondering if change X will cause problems you can just do it and run the tests).

For instance, without writing tests a developer could complete a certain task in 5 days ...

The cynic in me asks, "How do you know a task is complete without tests to show that the code works?" There is also the other side of the coin, which is that you cannot afford to unit test everything thoroughly. Somewhere in the middle is a happy medium.

Initially, there are two costs of writing unit tests, but the cost isn't so much in the time spent as the effort. For unit tests to be effective, you need a software design that lends itself to unit testing, and you need good tests. You'll have to learn how to create those through training and experience. While the design is often done by a few senior people, everybody on the team will need to know how to write tests efficiently and effectively. That is, quickly write a small number of tests that catch a lot of bugs; do not spend a long time writing many tests that are unlikely to find bugs.
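As an illustration of "a small number of tests that catch a lot of bugs" ( the `clamp` routine and its name are hypothetical ): a handful of boundary-value assertions is cheap to write and tends to catch exactly the off-by-one and edge-case errors that eat debugging time.

```perl
use strict;
use warnings;
use Test::More tests => 4;

# Hypothetical function: clamp a value into a range. Boundary
# tests are where the bugs usually live, so that is where the
# few tests we write should aim.
sub clamp {
    my ( $val, $min, $max ) = @_;
    return $min if $val < $min;
    return $max if $val > $max;
    return $val;
}

is( clamp( 5,  0, 10 ), 5,  'value inside range unchanged' );
is( clamp( -1, 0, 10 ), 0,  'below the minimum clamps up' );
is( clamp( 11, 0, 10 ), 10, 'above the maximum clamps down' );
is( clamp( 0,  0, 10 ), 0,  'boundary itself is in range' );
```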

From my own experiences, I usually lose nothing by writing unit tests. It is rare that I write any sizable amount of code with no serious bugs in it on the first try. The unit tests find the big bugs, and that means I don't have to debug the whole stack. So, as a rough approximation, the amount of time spent remains the same, but the quality goes up and the amount of frustration goes down.

Once your team learns effective unit testing, I expect that the estimated time to complete a task will not change significantly. What you will notice, though, is that with unit tests you more reliably get the tasks done in the estimated time. Predictability is really what the boss wants anyway.