Testing in Extreme Programming

In one of my J2EE projects, involving 4 programmers, we are trying to follow a few XP practices. In our first iteration we used JUnit as our unit-testing tool. I am looking forward to functional and integration testing once my other 4 iterations are completed. My choice is HttpUnit for functional testing of all the modules.

But I want to know: even though all the iterations work individually as they were developed, will any malfunction occur when those separate modules are integrated to make the fully functioning system? As far as my understanding goes, HttpUnit covers functional testing, but do you think that integration testing is automatically covered if functional testing is covered? I don't know what else to test, as integration testing, if my system passes the functional testing.

I would like a few pointers on functional testing and the Java tools that cover it.
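To make the question concrete, here is roughly the kind of check I mean by functional testing. Everything in it (the /login path, the "Welcome" page text) is made up, and I've used only JDK classes instead of HttpUnit so the sketch runs on its own; in a real test, HttpUnit's WebConversation would play the HTTP-client role against the deployed webapp:

```java
import com.sun.net.httpserver.HttpServer;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.InetSocketAddress;
import java.net.URL;

// A functional test exercises the application through its HTTP interface,
// the way HttpUnit would. To keep this sketch self-contained, a throwaway
// JDK HttpServer stands in for the deployed webapp.
public class FunctionalTestSketch {

    // Stand-in for the deployed application (invented behavior).
    static HttpServer startFakeApp() throws Exception {
        HttpServer server = HttpServer.create(new InetSocketAddress(0), 0);
        server.createContext("/login", exchange -> {
            byte[] body = "Welcome".getBytes("UTF-8");
            exchange.sendResponseHeaders(200, body.length);
            try (OutputStream out = exchange.getResponseBody()) {
                out.write(body);
            }
        });
        server.start();
        return server;
    }

    // Fetch a page as a plain string, the way an HTTP client library would.
    static String fetch(String url) throws Exception {
        HttpURLConnection conn = (HttpURLConnection) new URL(url).openConnection();
        try (InputStream in = conn.getInputStream()) {
            StringBuilder sb = new StringBuilder();
            int c;
            while ((c = in.read()) != -1) sb.append((char) c);
            return sb.toString();
        }
    }

    public static void main(String[] args) throws Exception {
        HttpServer server = startFakeApp();
        int port = server.getAddress().getPort();
        String page = fetch("http://localhost:" + port + "/login");
        // The functional assertion: the page the user sees is correct.
        if (!page.contains("Welcome")) throw new AssertionError("login page broken");
        server.stop(0);
        System.out.println("functional check passed");
    }
}
```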

This is actually quite an interesting question. Here's my take on it (the role of integration testing on an XP project).

Integration testing is about integrating. If you have a team of developers working on the same codebase, residing in the same configuration management system, and the developers integrating their own changes to the configuration management system every 15 minutes, what is there to integrate?

In most cases, I believe, what's left to integrate is the infrastructure. The developers probably aren't running a cluster of application servers on their desktops, they might not use the same database product that's used in production, and they might not have a similar amount of data in that database that's in production. Furthermore, they might not have an LDAP connection to authenticate against, using a mock-directory service instead when running the application locally, etc.
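As a sketch of the kind of infrastructure seam I mean (all the names here are hypothetical): the application codes against a small interface, the desktop run wires in a mock, and only the integration environment talks to the real directory:

```java
// Hypothetical sketch of swapping a mock directory for the real LDAP one.
interface DirectoryService {
    boolean authenticate(String user, String password);
}

// Used on developer desktops; no LDAP server required.
class MockDirectoryService implements DirectoryService {
    public boolean authenticate(String user, String password) {
        return "dev".equals(user) && "dev".equals(password);
    }
}

// In the integration environment this would wrap a real JNDI/LDAP lookup;
// left unimplemented here since the sketch has no directory to talk to.
class LdapDirectoryService implements DirectoryService {
    public boolean authenticate(String user, String password) {
        throw new UnsupportedOperationException("needs a real LDAP connection");
    }
}

public class DirectoryDemo {
    public static void main(String[] args) {
        // Local run: the mock answers, the application code doesn't care which.
        DirectoryService dir = new MockDirectoryService();
        System.out.println(dir.authenticate("dev", "dev"));
    }
}
```

The point of "integration testing" in my sense is then to run the same application code with LdapDirectoryService wired in, against the real infrastructure, and see whether anything breaks.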

To me, this is the prime motivation for doing "integration testing" on an XP project (or, actually, pretty much any project where the code is integrated frequently to a shared repository). It's not for verifying that functionality works as expected, it's for verifying that the code being developed is a good citizen in a "real" setting with all those big infrastructure products and services to integrate with.

I'm sure others will respond with differing opinions and I'm very anxious to hear about those. Yet, this is the way I see the role of integration testing.

Ashik, it does depend on how you define integration and how much your unit tests cover. It also depends on what your functional tests cover.

Before I describe what we do, note that I'm not on an XP team. We do write tests and integrate often, though. We write tests that we call "unit tests" for tests that do not require a server to run. We also write "integration tests" for the back-end components. While these integrate with the database, they don't really test much code integration. We also write higher-level "integration tests" that test some higher-level back-end components. [And obviously, we could use a different name to differentiate these.]

We test the GUI by hand. I think this maps to your functional testing. [One of these days I'll find the time to try a tool like FitNesse for automating some of the functional testing.] Finally, we write end-to-end JUnit tests to test back-end file processing code. Since this doesn't have a user interface, it is more programmatic.
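Stripped of the JUnit wrapper, one of those end-to-end file tests looks roughly like this; the line-uppercasing "processing" step is a made-up stand-in for real back-end logic:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

// End-to-end sketch for back-end file processing: write an input file,
// run the processing step, and assert on the output file.
public class FileProcessingTest {

    // The code under test: reads input, writes a transformed output file.
    // (Uppercasing is an invented example of a processing step.)
    static void process(Path in, Path out) throws IOException {
        List<String> result = Files.readAllLines(in).stream()
                .map(String::toUpperCase)
                .collect(Collectors.toList());
        Files.write(out, result);
    }

    public static void main(String[] args) throws IOException {
        Path in = Files.createTempFile("orders", ".txt");
        Path out = Files.createTempFile("orders", ".out");
        Files.write(in, Arrays.asList("widget,3", "gadget,7"));

        process(in, out);

        // End-to-end assertion: the whole file-in, file-out path works.
        List<String> lines = Files.readAllLines(out);
        if (!lines.get(0).equals("WIDGET,3")) {
            throw new AssertionError("bad output: " + lines);
        }
        System.out.println("end-to-end file test passed");
    }
}
```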

In our case, functional testing doesn't run the gamut of integration testing because not all of the code has a GUI. Even if it did, you may want to integration test for what happens if someone subverts your user interface entirely. This is hard to test with a GUI testing tool. Another thing you might want to test is performance. While this isn't strictly integration testing, it doesn't usually fall under the category of day-to-day development.
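To make the "subverts your user interface" point concrete, here's a hypothetical sketch (the service name and the no-negative-amounts rule are invented): the test calls the back end directly with input the GUI would never send, and checks that the back end enforces the rule itself rather than trusting the GUI's validation:

```java
// Sketch: checking that the back end defends itself even when the
// GUI's validation is bypassed entirely.
public class SubvertedUiTest {

    static class TransferService {
        // The GUI also rejects negative amounts, but the service must too,
        // because callers can reach it without going through the GUI.
        void transfer(String fromAccount, String toAccount, double amount) {
            if (amount <= 0) {
                throw new IllegalArgumentException("amount must be positive");
            }
            // ... debit/credit logic would go here ...
        }
    }

    public static void main(String[] args) {
        TransferService service = new TransferService();
        boolean rejected = false;
        try {
            // Call the back end directly, skipping the GUI entirely.
            service.transfer("A-100", "B-200", -50.0);
        } catch (IllegalArgumentException expected) {
            rejected = true;
        }
        if (!rejected) {
            throw new AssertionError("back end trusted the GUI's validation");
        }
        System.out.println("back end rejected the subverted request");
    }
}
```

This is the sort of test a GUI-driving tool can't express, since the whole point is to go around the GUI.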

> Integration testing is about integrating. If you have a team of developers working on the same codebase, residing in the same configuration management system, and the developers integrating their own changes to the configuration management system every 15 minutes, what is there to integrate?

If their tests truly make sure that the code that they are merging every 15 minutes actually integrates properly into the rest of the system, I would say those tests are actually integration tests, even if they happen to be called "unit tests" by some. I think that's the way it should be - integration is where most of the difficult bugs traditionally appear, so to maximize the benefits of automated testing, the automated tests should be integration tests.

> In most cases, I believe, what's left to integrate is the infrastructure. The developers probably aren't running a cluster of application servers on their desktops, they might not use the same database product that's used in production, and they might not have a similar amount of data in that database that's in production. Furthermore, they might not have an LDAP connection to authenticate against, using a mock-directory service instead when running the application locally, etc.
>
> To me, this is the prime motivation for doing "integration testing" on an XP project (or, actually, pretty much any project where the code is integrated frequently to a shared repository). It's not for verifying that functionality works as expected, it's for verifying that the code being developed is a good citizen in a "real" setting with all those big infrastructure products and services to integrate with.

I would call that "system testing". It tests how the system works as a whole, rather than just how some parts integrate with other parts.