
TDD, I love you

I just have to say that test driven development tonight gave me the biggest benefit it has to date! I've spent the past couple of weeks on and off refactoring a massive code base, which involved some fairly impressive interface changes. All my tests exploded initially (fantastic, I'd hope so too), but as the code gradually came back together the failing tests disappeared one by one. Today I finished off my refactoring with 100% passing tests, and only then did I realise, "hey, I haven't even tried this stuff for real other than watching tests pass/fail". So I ran my smaller set of smoke tests, which are basically my end-to-end tests, and voilà, everything works like it did before I refactored.

I can say for sure that without the 77 test cases this library has, I would never have even dared attempt the scale of refactoring I did.

I know it's obvious that testing helps in this case, but when you finally get that level of support back out of your tests, you have to take a bow to all the TDD proponents whose test-infected attitude rubbed off on you in the first place.

It seems like the benefit you actually realized was just having a complete set of unit tests, which is the prerequisite for refactoring. The benefit you derived from TDD was that your initial test set actually was complete, and therefore allowed you to have the comfort level to approach the large refactoring. Is that the case or am I mis-interpreting your situation?

I don't want to insert something off-topic here, but, what kind of resources did you use to learn how to do TDD effectively?

The reason I ask is because I just started a new app project for a client. I worked up some functional mockups so we could get the requirements straightened away, with the intent of writing sloppy but functional code for the mockups and refactoring for the production app. When all is said and done the app will include a full suite of unit tests, but I realized this morning that this is a great opportunity to learn TDD. Create a full set of tests for my application, write enough fast, sloppy code to satisfy them all, and then spend 75% of my development time refactoring into a sleek, efficient, well-written app. Is that pretty much how TDD is supposed to work? Because it sounds like a good idea to me.

Where did you go during your learning process? I guess my main hangup is that I know how to write unit tests for code that's already written, but am not quite sure how to write them for requirements that aren't physically coded anywhere yet.

Create a full set of tests for my application, write enough fast, sloppy code to satisfy them all, and then spend 75% of my development time refactoring into a sleek, efficient, well-written app. Is that pretty much how TDD is supposed to work? Because it sounds like a good idea to me.

It's not how it works for me. I find that using TDD forces me to make better code from the beginning. That also means more work (although only slightly). What it adds is not so much that it frees you from doing it right from the start, but that it enables you to make adjustments later on. So instead of supporting a two-step procedure, where you write a prototype and then throw it away and write the real thing, TDD lets your application evolve gradually. So that's almost the opposite of what you're describing.
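That gradual loop is the classic red-green-refactor cycle: write one failing test, write just enough code to make it pass, then clean up and repeat. A minimal sketch (in Python's unittest for brevity, since this thread spans several languages; the `slugify` function is invented purely for illustration):

```python
import unittest

# Step 1 (red): this test is written before slugify exists,
# so the very first run fails with a NameError.
class SlugifyTest(unittest.TestCase):
    def test_lowercases_and_replaces_spaces(self):
        self.assertEqual(slugify("Hello World"), "hello-world")

# Step 2 (green): just enough implementation to make the test pass.
def slugify(title):
    return title.lower().replace(" ", "-")

# Step 3 (refactor): with the test green, the internals can now
# change safely; the test pins down the behaviour, not the code.
result = unittest.TextTestRunner(verbosity=0).run(
    unittest.defaultTestLoader.loadTestsFromTestCase(SlugifyTest))
```

Each trip around the loop grows one small, fully tested slice of the application rather than a throwaway prototype.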

It seems like the benefit you actually realized was just having a complete set of unit tests, which is the prerequisite for refactoring. The benefit you derived from TDD was that your initial test set actually was complete, and therefore allowed you to have the comfort level to approach the large refactoring. Is that the case or am I mis-interpreting your situation?

No, you are correct. Without the tests, refactoring is pretty much a no-go unless it's a simple refactor and you're willing to develop as blindly as you had until that point. I like to think my tests are roughly 100% covered and very focused. All the unit tests focus on just the unit in question and mock their dependencies (using my own mock object library as of recently). If it can't reasonably be unit tested, or I want a layer on top of my units, I throw in an acceptance test.
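The "mock the dependencies, test only the unit" idea can be sketched like this (Python's unittest.mock standing in for the poster's own mock library; `OrderService` and `gateway` are invented names for illustration):

```python
from unittest.mock import Mock

class OrderService:
    """Unit under test: the gateway dependency is injected, not constructed."""
    def __init__(self, gateway):
        self.gateway = gateway

    def place(self, amount):
        # Delegates to the dependency; the unit test replaces it with a mock,
        # so only this class's own logic is exercised.
        return "ok" if self.gateway.charge(amount) else "declined"

# The test: the real payment gateway never runs.
gateway = Mock()
gateway.charge.return_value = True
service = OrderService(gateway)
result = service.place(100)
gateway.charge.assert_called_once_with(100)
```

Because the mock records how it was called, the test verifies the interaction with the dependency without ever touching the real implementation.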

~graedus_dave, I probably started TDD'ing 18 months or so ago by asking endless questions here and on another forum I frequent. But I think when you're learning to TDD it's easy to get off on the wrong footing and push yourself into corners testing things which are not relevant initially (I did); then eventually you discipline yourself and make your tests focus entirely on interfaces. That's when testing starts to become less painful and much more fun. I doubt many people start with TDD and do it properly initially. Even now I still occasionally put myself in awkward situations, but I'm more conscious of it.

I have to say, TDD'ing with JUnit and jMock probably taught me a lot about how to think about the whole "behaviour vs implementation" thing since jMock forces you to only mock interfaces which of course specify behaviour, not implementation.

Prototyping sounds like a better word for what I was describing. Could you explain how you go about doing it your way?

Mind you, I wouldn't label myself a TDD-guru by far. I use unit tests casually, sometimes I even do TDD, but it depends on the task at hand.

However, I think that the great difference between TDD and prototyping (at least in the sense you describe it) is that with a prototype, you create a shallow but fairly complete variant of the final application, whereas with TDD, you create a deep but narrowly focused slice of the application. The technique lets you evolve the application gradually, while you learn more about it. It may sound a little backward at first, and it certainly takes some experience to get it right.

Also, since prototyping and TDD move along different axes, you can combine the two. For example, prototyping is really good for user interface design, whereas TDD excels at model-level design.

Originally Posted by Chris Corbyn

I like to think my tests are roughly 100% covered and very focused.

You know that xdebug can generate code-coverage analysis for you, right?

Also, since prototyping and TDD move along different axes, you can combine the two. For example, prototyping is really good for user interface design, whereas TDD excels at model-level design.

That makes a lot of sense. So far I've pretty much just been doing UI design, since I was making general mockups. But now that you mention it, the models that drive the UI are pretty solid and, save for a bit of refactoring here and there, are pretty much as they will be in the final application.

It makes a lot of sense to use TDD for the development of the rest of the models, though. Did you use anything in particular to help you learn what you know about TDD?

What I'm lost on is how to develop test cases. Is it just testing each function, such as add/edit/delete? That's the part I'm stuck on.

I've developed a pretty decent framework; we're adopting it where I work, as well as using it on the project in my signature. However, it's a good opportunity to really get what unit testing is all about. I'm reading about it, but I'm not sure how to implement it in a point-and-click type of web application, or in something like an "install" procedure.

I don't think unit testing makes much sense at all unless you're already using OOP. Even when you are using OOP, it will probably feel awkward if you're writing messy OOP. If you're testing as an afterthought it will always feel painful. With OOP, and testing before the code gets written, it will most likely just make your code much cleaner. This is my experience on the TDD roller coaster, anyway.

Originally Posted by nabeel

What I'm lost on is how to develop test cases - is it just testing each function, such as add/edit/delete; that's what I'm lost on.

As a rule of thumb, yes, pretty much. I tend to write one test class for each implementation class. That test class usually has at least one test method for each implemented method. But rather than being as rigid as "test each method in turn", it's probably better to think in terms of units of code. Try dissecting your application into the smallest "functional" components you can and test those. More often than not that is a method; sometimes it's a group of methods; sometimes it's a method with a particular call signature.
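The "one test class per implementation class, one test method per method" layout looks something like this (sketched in Python's unittest; the `Stack` class is invented for illustration, but the shape is the same in SimpleTest or JUnit):

```python
import unittest

class Stack:
    """Implementation class under test (invented example)."""
    def __init__(self):
        self._items = []

    def push(self, item):
        self._items.append(item)

    def pop(self):
        return self._items.pop()

    def is_empty(self):
        return not self._items

class StackTest(unittest.TestCase):
    """One test class per implementation class,
    at least one test method per implemented method."""
    def setUp(self):
        # A fresh fixture for every test keeps them independent.
        self.stack = Stack()

    def test_new_stack_is_empty(self):
        self.assertTrue(self.stack.is_empty())

    def test_push_then_pop_returns_item(self):
        self.stack.push(42)
        self.assertEqual(self.stack.pop(), 42)

result = unittest.TextTestRunner(verbosity=0).run(
    unittest.defaultTestLoader.loadTestsFromTestCase(StackTest))
```

Note the tests exercise the class through its public interface only; the internal list is never inspected directly, which is what keeps them refactoring-proof.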

On the note about writing one test class for each implementation class, that's basically how I start out. But often that test class grows into some unwieldy 2000 line beast with testMethodNamesLongerThan78CharactersAndLookingSomethingLikeThis. That's when I decide it's time to refactor my test code. For now I generally take these steps:

1) Try pulling out any test coverage common to a particular set of components into an abstract test case (e.g. if you have tests for a RemoteTransport and an InternalTransport class, and they both share common functionality as a "Transport", then make an AbstractTransportTestCase and subclass it).

2) If (1) isn't going to solve a long class issue, just try spotting areas which are testing functionally similar units of code and move those tests into their own class. A great sign that you need to do this is when a whole bunch of your test methods start with the same long prefix. You'll often use that prefix in the new test class names and drop it from the method names.
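Step (1), the abstract test case, can be sketched like so (Python's unittest again; the transport classes are invented stand-ins for the RemoteTransport/InternalTransport example above):

```python
import unittest

class RemoteTransport:
    """Invented example: delivers a message remotely."""
    def send(self, msg):
        return "remote:" + msg

class InternalTransport:
    """Invented example: delivers a message in-process."""
    def send(self, msg):
        return "internal:" + msg

class AbstractTransportTestCase(unittest.TestCase):
    """Shared coverage for anything that behaves as a 'Transport'."""
    transport_class = None  # concrete subclasses fill this in

    def setUp(self):
        if self.transport_class is None:
            self.skipTest("abstract test case, not run directly")
        self.transport = self.transport_class()

    def test_send_returns_a_string(self):
        self.assertIsInstance(self.transport.send("hi"), str)

# Each concrete transport inherits the shared tests for free.
class RemoteTransportTest(AbstractTransportTestCase):
    transport_class = RemoteTransport

class InternalTransportTest(AbstractTransportTestCase):
    transport_class = InternalTransport

suite = unittest.TestSuite()
for cls in (RemoteTransportTest, InternalTransportTest):
    suite.addTests(unittest.defaultTestLoader.loadTestsFromTestCase(cls))
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

The shared tests are written once against the common interface and run against every concrete subclass, so adding a third transport costs one two-line test class.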

Apart from that my test case refactoring sucks. I still write messy tests; I admit it, I'm a bad programmer when it comes to keeping my test code neat :P Something I'm working on though. I think the majority of people I've worked with run into the same issues keeping test code organised as it grows and grows.

No not off topic... it's good to know. I write mostly in the "MVC" format, just my own flavor of it which works for me, so everything is in its own class "container". I'm extremely adamant that everything be in its proper place. I hate free-floating functions, but in this 'framework' I wrote, it directs everything to its proper class which runs as a module.

But instead of writing test cases, why not just test the functionality on the front-end?

But test cases make sense for more "processor" functions versus UI stuff, I guess. It's kinda hard making that distinction; it's not completely a clear-cut case of when you should and shouldn't use it.

I very rarely test a user interface with unit tests and definitely not with TDD. I hate all that (fragile) regular expression matching on generated HTML.

BUT, if that is something you've struggled with in the past, using a Response object solves a lot of issues. When I test controllers I execute them and then check various things in the response (was a redirect sent? What response code was generated?). Your response object would also contain the buffered page content you plan on sending to the browser so you can of course pattern match on it. However, front-end testing is usually done with a different set of tools.
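The Response object approach can be sketched like this (all names here are invented for illustration, not from any particular framework):

```python
class Response:
    """Hypothetical Response object: buffers status, headers, and body
    instead of echoing output straight to the browser."""
    def __init__(self):
        self.status = 200
        self.headers = {}
        self.body = ""

    def redirect(self, url):
        self.status = 302
        self.headers["Location"] = url

def login_controller(response, authenticated):
    # A hypothetical controller: it writes into the response object,
    # so a test can inspect the outcome without capturing output.
    if authenticated:
        response.redirect("/dashboard")
    else:
        response.status = 401
        response.body = "Please log in"

# The "test": execute the controller, then check the response.
# Was a redirect sent? What response code was generated?
response = Response()
login_controller(response, authenticated=False)
```

Because nothing is sent to the browser during the test, there's no output buffering to fight with and no fragile regex matching unless you actually want to pattern-match the buffered body.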

I would like to get in to TDD, but from what I have seen so far, in order for it to work you need a full spec from the beginning. On a site I am doing at the moment, I have a definite plan, but not a full spec... and I'm not making it up as I go along, but I am making certain decisions as I go along.

How can TDD help me in a situation like this, and how would I go about using it?

@Stormrider: From what I understand, you can work on them step-by-step, test-by-test.

I'd also like to get into TDD. I (think I) understand the concepts, but I don't understand how to actually implement the tests in the code. Can anyone point me to a useful link, or just give me an example of how to implement it?

I couldn't imagine working on an application without TDD; I recently switched jobs for that very reason!

Originally Posted by graedus_dave

I don't want to insert something off-topic here, but, what kind of resources did you use to learn how to do TDD effectively?

...

Where did you go during your learning process? I guess my main hangup is that I know how to write unit tests for code that's already written, but am not quite sure how to write them for requirements that aren't physically coded anywhere yet.

TDD can be a little difficult to really grasp and see the benefits of, but once you have that "oh I see" moment, there is no looking back! One of my favourite things about TDD is the isolation. Being able to work completely isolated inside a larger application is a dream. I use Sql Server CE for data access in my tests, so even testing database interaction can be greatly isolated, and it speeds up the entire suite no end.
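The same isolation trick exists in most stacks; for instance, an in-memory SQLite database plays the role Sql Server CE plays here (a sketch, not the poster's actual setup; the schema and helper are invented):

```python
import sqlite3

def make_test_db():
    # Each test builds a throwaway in-memory database:
    # fast, fully isolated, and gone when the connection closes.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
    return conn

def add_user(conn, name):
    """Code under test: inserts a row and reports the new row count."""
    conn.execute("INSERT INTO users (name) VALUES (?)", (name,))
    return conn.execute("SELECT COUNT(*) FROM users").fetchone()[0]

# The test touches a real SQL engine, but never a shared server.
conn = make_test_db()
count = add_user(conn, "alice")
```

Tests get real SQL behaviour without a shared database server, so they can't interfere with each other or with development data.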

Chris:
I split my unit tests up into different classes & projects (I'm a .NETer). I use a mixture of BDD and TDD, with IoC containers to quickly swap out dependencies between real/stubbed/mock objects, with a sprinkling of UI testing with WatiN. I make a point of trying to treat my test suite with as much care as the code it's testing. My biggest personal project has around 800 tests (76% code coverage, woot), and with a test suite that big, you learn to keep it clean quickly!

nabeel:
What Chris said, +1. You want to break everything down into small, easily tested units of code. Break things down as much as you can. TDD is a great way to force yourself to recognise OO principles and patterns, making them easier to digest and get to know. And your code will end up far cleaner. Here's another useful book: Refactoring to Patterns.

Oh, while I'm here: my little brother is just starting out as a PHP developer and I'm keen to get him into TDD as early as possible. I'm coming from a .NET background, so I can't recommend any tools. Can someone recommend some tools for TDD/IoC/mocking/etc. with PHP?

Looks cool. May I ask how many hours you spent to finish this project?

Who's the question aimed at?

My project, which I showed test output from in the first post, is not finished and probably never will be "finished". It will be "released" (published?) before much longer though. It's a mailing library I started a few years ago and I'm now almost finished on its 4th major version (a mass rewrite). In the latest version I've easily put in over 1000 hours. Including the time on previous versions I'm running into many thousands of hours. But I'm an obsessive who spends most of my time writing open-source projects outside of work time. I could write more quickly, but I don't like to rush on my own stuff and I just enjoy the constant thought process that goes into the design.

I took a few side-steps writing this version of the library in order to make a mock object library based around a Java one I like, and to write a nice AJAX & CLI test runner around simpletest. I'll probably release both of those too (the mock object library has a beta out already, but the first "complete" version will have a fair few (simplified) API adjustments and feature additions).

EDIT| ~dhtmlgod, 800 tests! Sounds almost like our SitePoint test suite which has grown into the sort of size you only feel like running "all tests" for if you're pushing out major changes or making a public release :P

I'll tell you one place PHP does fall down with writing suites of tests: it's "too global". You've got your setUp and tearDown methods, but you can only do so much with them. Instead of loading multiple test cases into a test suite in the same PHP process, it works more nicely if you fork a new PHP process to run each test in its own little environment.

Most of the time you won't run into issues but as your test suite grows you start to hit problems. Memory exhaustion due to the way PHP parses classes was the first one I hit before I started running them in individual processes.
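A crude way to get that per-process isolation is to launch each test in a fresh interpreter (sketched with Python's subprocess module; the inline test strings are trivial placeholders):

```python
import subprocess
import sys

def run_test_in_fresh_process(test_source):
    # Each test runs in a brand-new interpreter, so class definitions,
    # globals, and memory usage can't leak from one test into the next.
    proc = subprocess.run(
        [sys.executable, "-c", test_source],
        capture_output=True, text=True,
    )
    # Exit code 0 means the test's assertions all held.
    return proc.returncode == 0

passed = run_test_in_fresh_process("assert 1 + 1 == 2")
failed = run_test_in_fresh_process("assert 1 + 1 == 3")
```

The trade-off is startup cost per test, but for a large suite that's often cheaper than one long-lived process slowly exhausting memory.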

EDIT| ~dhtmlgod, 800 tests! Sounds almost like our SitePoint test suite which has grown into the sort of size you only feel like running "all tests" for if you're pushing out major changes or making a public release :P

Even with Sql Server CE they take a while to run! I usually only run the tests directly affected by what I'm doing. I use a test runner within VS.NET (UnitRun, which is built into ReSharper) and can select which tests I want to run. I can constantly rerun tests with a shortcut, meaning frictionless instant feedback. I also use CruiseControl.NET, so all my tests do run after each check-in.

Having a solid test suite and CI environment provides an amazing feeling of confidence, making those "public release" moments a lot easier!

Having a solid test suite and CI environment provides an amazing feeling of confidence, making those "public release" moments a lot easier!

Good god, yeah. I'm not responding directly to you as such, but whilst I'm in "TDD proponent mode" I'll continue.

Having your test suite, greatly covered and 100% passing gives you a massive amount of confidence at roll out time. No, it doesn't mean your app will be bug free; far from it. It means your app will do what *you expected it to do when you wrote it*. It also means that when a bug does get reported you can just throw in a (failing) test to "prove" that bug exists and then fix it so it doesn't happen again. In the long term it does mean hugely more stable code than untested (or just manually acceptance tested) code.
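That "failing test first, then fix" flow for a bug report can be sketched like this (the `parse_price` function and the bug are invented for illustration):

```python
def parse_price(text):
    # Bug report: "parse_price crashes on values with a currency symbol".
    # The regression test below was written FIRST, reproducing the crash;
    # the lstrip("$") step was then added to make it pass, so the bug
    # can never silently return.
    return float(text.strip().lstrip("$"))

# The regression test: it "proved" the bug existed, and now guards the fix.
assert parse_price("$19.99") == 19.99
assert parse_price("5") == 5.0
```

The failing test doubles as an executable copy of the bug report: once it goes green, the fix is confirmed, and it stays in the suite forever as insurance.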

Are there any basic tutorials or introductions to TDD out there? I've been googling around a bit but haven't found one I liked yet. I want to get into it, but the things worrying me are code coverage (how do you know you are testing everything?) and the length of time I imagine it takes to come up with and program all the test cases.