Integration testing demonstrated – level 200

In this post, I'll talk about and demonstrate integration testing. If you are just starting out with integration testing, you want to test small before you test big. Full-system tests are good, but if they fail, they don't give much of a hint as to where the failure is. Smaller integration tests will help narrow the area where the failure lies.

A few rules to live by

· An integration test must be isolated in setup and teardown. If it requires some data to be in a database, it must put it there. Environmental variables should not cause the test to fail randomly.

· It must also run fast. If it is slow, build time will suffer, and you will run fewer builds, leading to other problems.

· Integration tests should be order-independent. It should not matter in what order you run them; they should all pass.

· Feel free to make up rules that objectively result in fewer bugs.

Testing a custom SiteMapProvider

In my example, I have a custom SiteMapProvider (PageInfoSiteMapProvider). This site map provider gets its data from inside my EZWeb application, specifically through the IPageConfigProvider interface. I use StructureMap for service location, so one of the things the integration test will validate is that my interface implementations can be resolved correctly. I'm going to focus on an integration test for one method on the site map provider, FindSiteMapNode(url).

This is simple enough. The site map provider is just a wrapper around the interface call. Notice in the constructor that I'm using a call to StructureMap's ObjectFactory class to resolve the interfaces that I need. I need the current HttpContext and some settings from the web.config file. Obviously, in my integration test I don't have the ASP.NET runtime, and I don't have the web.config file, so I'll need to simulate them (mock, stub, fake, whatever you want to call it). In my integration test, I'm going to have to use fake implementations of these interfaces.
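The provider itself appears in the post as an image, so here is a hedged sketch of the shape just described. PageInfoSiteMapProvider, IPageConfigProvider, and FindSiteMapNode come from the post; ICurrentHttpContext, IConfigSettings, PageInfo, and GetPageInfo are invented stand-ins for the details the image would have shown.

```csharp
// Hedged sketch -- not the original EZWeb source. ICurrentHttpContext,
// IConfigSettings, PageInfo, and GetPageInfo are hypothetical names.
// (The remaining SiteMapProvider overrides are omitted for brevity.)
public class PageInfoSiteMapProvider : SiteMapProvider
{
    private readonly IPageConfigProvider _pageConfig;
    private readonly ICurrentHttpContext _httpContext;
    private readonly IConfigSettings _configSettings;

    public PageInfoSiteMapProvider()
    {
        // Service location via StructureMap's ObjectFactory,
        // as described in the paragraph above.
        _pageConfig = (IPageConfigProvider)
            ObjectFactory.GetInstance(typeof(IPageConfigProvider));
        _httpContext = (ICurrentHttpContext)
            ObjectFactory.GetInstance(typeof(ICurrentHttpContext));
        _configSettings = (IConfigSettings)
            ObjectFactory.GetInstance(typeof(IConfigSettings));
    }

    public override SiteMapNode FindSiteMapNode(string rawUrl)
    {
        // The provider is just a thin wrapper around the interface call.
        PageInfo page = _pageConfig.GetPageInfo(rawUrl);
        return new SiteMapNode(this, page.Key, rawUrl, page.Title);
    }
}
```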

Here is the integration test fixture that tests this provider all the way down to the point where it reads the data from the XML file on disk. I've chosen this scope because it's not too large, and it's not too small.

This is my entire integration test to make sure that the SiteMapNode object is put together correctly. Notice that I have data set up with the XML file. Then I call the provider and assert on the results.
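Since the fixture is also shown as an image in the post, the following is only a sketch of what a test at this scope could look like. The file name, the XML shape, and the assertions are my guesses, not the originals.

```csharp
// Hedged sketch of the fixture described above; names and data invented.
[TestFixture]
public class PageInfoSiteMapProviderTester
{
    [Test]
    public void ShouldBuildSiteMapNodeFromXmlOnDisk()
    {
        // Isolated setup: the test writes the data it depends on.
        File.WriteAllText("pageConfig.xml",
            "<page key=\"home\" title=\"Home\" url=\"/default.aspx\" />");

        PageInfoSiteMapProvider provider = new PageInfoSiteMapProvider();

        // Call the provider and assert on the results.
        SiteMapNode node = provider.FindSiteMapNode("/default.aspx");
        Assert.IsNotNull(node);
        Assert.AreEqual("Home", node.Title);
    }
}
```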

Faking the “cruise missile” with StructureMap

If my code were in charge of launching a cruise missile correctly, then I would have to fake out the cruise missile when testing my code. In fact, my code might not ever run in its true environment until the world plunged into war again. On a small scale, I have to fake the two interfaces that are not reproducible in my test context. Direct your attention to the Setup method. You'll notice that I'm creating two fake instances and instructing StructureMap to “InjectStub”. Because of this, when my code asks for an instance of one of those interfaces, the class I created in Setup will be returned.
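A Setup method along these lines could look like the sketch below. InjectStub is the StructureMap call named in the post; the two interface names and the fake classes are hypothetical.

```csharp
// Hedged sketch -- interface and fake class names are invented.
[SetUp]
public void Setup()
{
    // After these calls, any code that asks ObjectFactory for one of
    // these interfaces receives the fake instead of the real thing.
    ObjectFactory.InjectStub(typeof(ICurrentHttpContext), new FakeHttpContext());
    ObjectFactory.InjectStub(typeof(IConfigSettings), new FakeConfigSettings());
}
```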

These classes are simple enough. They merely return expected environmental settings so that I can test my code.
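Fakes of the kind described might look like this sketch; every name here is a hypothetical stand-in for HttpContext access and web.config settings.

```csharp
// Hypothetical fakes: they just hand back stable environmental values
// so the test controls its entire context.
public class FakeConfigSettings : IConfigSettings
{
    // Points the provider at the test's own data file.
    public string PageConfigPath
    {
        get { return "pageConfig.xml"; }
    }
}

public class FakeHttpContext : ICurrentHttpContext
{
    // A fixed value in place of what the ASP.NET runtime would supply.
    public string ApplicationPath
    {
        get { return "/"; }
    }
}
```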

Integration testing is hard at first

If your style of coding isn't loosely coupled, then it will be difficult to do automated integration testing. Seams must exist in the code where fake implementations can be inserted. In my case, I had a web.config file and the HttpContext of the web application that got in the way. I stub these out for my test. My tool list for integration testing is:

· NUnit for organizing and running the tests.

· StructureMap for resolving interfaces and for stubbing interfaces at hard boundaries.

My way or the highway

Just kidding. I'm not saying that this is the “golden” way of doing integration testing. There are many ways, and that's why software engineering is engineering and not assembly-line work. The above approach has worked well for me, and I put interfaces where they make sense (everywhere) in my code to maintain flexibility. I'm open to critique or questions.

Brian
4:56 am on May 13, 2006

One of the issues I have with integration testing is how long the tests take to run. The more encompassing the tests are, the longer they take to run.

I currently use NUnit to test my code, and find it tedious to break out my unit and integration tests.

For any given project I might have an additional project that is a “test project”. Within that project I'll have quick unit tests and long-running integration tests.

How would you go about distinguishing the two so that only the unit tests are run?

Brian,
I have one project for unit tests and another project for integration tests. This makes it easy to keep them separate. You _can_ put them in the same project, but it requires more discipline to keep your unit tests separate from your integration tests. With NUnit, you can label unit tests with one test category and the integration tests with another category. NUnit will allow you to run tests by category. The TDD.net add-in can also run unit tests by namespace.

It's easy to let an integration test run slow, but you must keep them running fast. If the tests are slow, then that may be indicative of a slow system.
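To make the category approach concrete, here is a sketch using NUnit's Category attribute. The fixture and test names are made up, but the attribute and the console switches are standard NUnit.

```csharp
// Label slow tests so they can be included or excluded at run time.
[TestFixture]
[Category("Integration")]
public class PageInfoSiteMapProviderTester
{
    [Test]
    public void ShouldFindSiteMapNode()
    {
        // long-running integration test body here
    }
}

// nunit-console MyTests.dll /exclude:Integration  -> fast unit tests only
// nunit-console MyTests.dll /include:Integration  -> integration tests only
```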

JBarbatos
2:08 pm on May 14, 2006

Thanks. Any chance of you showing an example of an acceptance test using Fit?

Your examples have been great; I have been getting a better understanding of how all this works together. Now I just need to come up with some samples of my own to show the “leaders”, so we can hopefully start using these methodologies on our new projects.