
Welcome to Machers Blog

Blogging the world of technology and testing to help people build their careers.

Monday, April 6, 2009

Barriers remain for test-driven development

Theme: Test-driven development (TDD) is a little like saving for retirement -- you know your quality of life will be better in the long run if you do it, but you may feel you can't afford to put that money aside right now, or you don't have time to think about it.

"The benefits of TDD are quickly realized as IT can respond to requirement changes and enhancements more rapidly and with predictable results." -- Bobby Pantall, lead technology consultant, CC Pace Systems Inc.

At a recent roundtable on TDD hosted by Stelligent Inc., a consulting firm in Reston, Va., attendees were surveyed on this issue, and the results were "surprising," according to Burke Cox, Stelligent CEO. Among the CTOs and IT and project managers in attendance, 84% do not currently practice TDD, and 79% said they do not measure code coverage for development projects.

With test-driven development, developers write automated unit tests defining the requirements of the code before they write the code itself. TDD has its roots in Extreme Programming (XP), although it is not limited to agile development, Cox said.

The two main barriers to TDD are cost and culture, according to Cox. "First, you have to realize the value of testing and invest dollars upfront," he said. "At a micro level it costs more to write tests and code than just to write code. But if you continue for five weeks, the person writing tests and code will be much more productive than the person just writing code, because you're constantly testing and refactoring. You will deliver faster and it will be cheaper [in the long run], but management has to understand that."

Roundtable attendee Bobby Pantall, lead technology consultant at CC Pace Systems Inc., a business and technology consulting company in Fairfax, Va., agrees. "Organizations haven't been sufficiently informed of the business benefits of TDD," he said. "In our experience, there is a popular misconception that it's wasteful to spend so much time writing tests instead of pure functionality. However, over the lifetime of an application, the majority of time is spent in maintenance mode. The benefits of TDD are quickly realized as IT can respond to requirement changes and enhancements more rapidly and with predictable results."

Developer buy-in needed

Developers also have to understand the value of testing, Cox said. "Developers cross their arms and say, 'I write code, not tests,'" he said. "Test historically is second-class citizenship; it doesn't get the kinds of responsibility and respect developers or architects get. To change this attitude, you need a change of commitment from the business people who own the line of business. They have to insist teams do it [TDD]."

Those who practice TDD cite benefits such as a growing log of regression tests, better maintainability, reduced development time and fewer defects, simpler and more extensible system design, improved code quality before reaching the testing/QA phase, and faster feedback, according to the survey.

Roundtable attendee Luke Majewski, director of application architecture at Intalgent, a custom software developer in Reston, Va., said TDD forces you to think about the testability of your code beforehand.
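The test-first cycle described above -- write a failing test that states the requirement, then write just enough code to pass it -- can be sketched in a few lines. This is a minimal, hypothetical example using Python's standard unittest module; the order_total function and the discount requirement are invented for illustration:

```python
import unittest

# Step 1: write the test first. It formalizes the requirement
# (here: orders over $100 get a 10% discount) and fails until
# the implementation below exists.
class TestOrderTotal(unittest.TestCase):
    def test_no_discount_under_threshold(self):
        self.assertEqual(order_total(80.0), 80.0)

    def test_discount_over_threshold(self):
        self.assertEqual(order_total(200.0), 180.0)

# Step 2: write just enough code to make the tests pass.
def order_total(subtotal):
    if subtotal > 100.0:
        return subtotal * 0.9
    return subtotal

# Run with: python -m unittest this_module
```

In practice the test lives in its own module and is run (and seen to fail) before the implementation is written; the third step of the cycle is refactoring while the tests stay green.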

Problems with test-driven development

At last week's Better Software Agile Development Practices conference, James Coplien, senior agile coach, software developer and systems architect at Nordija, spoke about the problems of using test-driven development (TDD).

The biggest issue, he said, is that TDD creates a procedural architecture rather than an object-oriented architecture. And a procedural architecture weakens an object-oriented architecture and "in time destroys usability and maintainability."

Coplien said development groups should instead use a lightweight, upfront architecture. That will help you avoid architecture rot, reduce maintenance cost and reduce usability problems.

He also said to remember to use a system testing program driven by use cases.

--Reported by Michelle Davidson, Site Editor

"Even while you're designing the architecture you get to think about, 'If I use a framework in this manner, it might not be testable.' If something's not testable, it's difficult to code. And once you start writing code that's not testable, it makes it impossible to make changes in the future, and finding bugs is more difficult," he said.

CC Pace practices TDD extensively, Pantall said. "Before we implement any functionality, we write an automated test that formalizes the contract of the requirement. While we currently do not use any tools to measure code coverage, we are confident that 100% of the functionality is covered by tests," he said. "TDD by itself may not guarantee performance improvement, but automated testing of any sort, TDD or otherwise, will result in significant improvements in quality. TDD tends to improve code coverage, which translates to better quality."

TDD also improves maintainability, Pantall said. "Supporting our code with automated tests allows us to perform regression testing every time changes are made," he said. "Ideally, every time we check our code in to source control, an automated build occurs that runs through our entire test suite. If the change broke something else, we know right away."

Pantall also said that automated tests give developers the confidence to make large-scale refactoring or functionality changes to legacy code. "We can immediately find out what functionality our changes may have affected. This does not obviate the need for dedicated system testers, but it can significantly reduce the burden on them," he said.

As with any methodology, developers sometimes adapt TDD to their needs. "I do not write tests first; I'm a dinosaur," said David Medinets, roundtable attendee and president of Affy Agile Advice and Coding in Fairfax, Va. "I've been doing this for a long time; it's hard to change the way your brain thinks. I use test and development to guide how I'm writing code, so I write tests as I code."
According to Majewski, "By our definition of test-driven development, we write code that's testable; we write code, then we test. If something is checked into our continuous integration server, it needs to have a test, with very few exceptions." While Stelligent's Cox said the best practice is to write the test before code, "we understand the challenges of companies adopting that more radical approach. At the least, have test associated with any code that is changing. Whether you write tests before or after coding is a question of internal debate."Medinets said that while he tries to push his clients in the direction of writing unit tests, and he builds into his estimates the time to write tests, he said it's harder to build code coverage into the cost structure. "Typically I'd say 25% of the projects I work on do code coverage; they have budget and time constraints," he said.In the end, Majewski said, seeing is believing. "The common thread when we have this discussion with other developers is their companies don't do it because it doesn't make sense financially. It isn't until you complete a project fully, with good test coverage, that you really appreciate what it does," he said. "There's a level of disbelief from managers that test coverage will help. People need to give pushback to managers and give them proof that test-driven development works."
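Majewski's "write code that's testable" approach -- test-after, but designed for testing -- usually comes down to isolating external dependencies so the code can be exercised without them. A hedged sketch in Python (the names and the rate-service scenario are invented for illustration): passing the dependency in lets a test substitute a fake, whether the test is written before or after the code.

```python
import unittest
from unittest.mock import Mock

# Testable design: the external rate source is passed in rather than
# hard-wired, so a test can substitute a fake and verify the logic
# in isolation -- even if the test is written after the code.
def convert(amount, currency, rate_source):
    rate = rate_source.rate_for(currency)  # injected, not hard-coded
    return round(amount * rate, 2)

class TestConvert(unittest.TestCase):
    def test_uses_injected_rate(self):
        fake = Mock()
        fake.rate_for.return_value = 1.25
        self.assertEqual(convert(10.0, "EUR", fake), 12.5)
        fake.rate_for.assert_called_once_with("EUR")
```

Had convert constructed its own network client internally, it could not be tested without a live service -- the "not testable" situation Majewski warns makes future changes and bug-finding harder.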