Well-known agilist and TDD expert J.B. Rainsberger has begun a series of posts to explain why his experience has led him to the thought-provoking conclusion that "integration tests are a scam".

J.B. kicks off Part 1 of the series by first explaining what exactly he means when he says "integration test":

I should clarify what I mean by integration tests, because, like any term in software, we probably don’t agree on a meaning for it.

I use the term integration test to mean any test whose result (pass or fail) depends on the correctness of the implementation of more than one piece of non-trivial behavior.

I, too, would prefer a more rigorous definition, but this one works well for most code bases most of the time. I have a simple point: I generally don’t want to rely on tests that might fail for a variety of reasons. Those tests create more problems than they solve.
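By that definition, the key question for any test is how many pieces of non-trivial behavior must be correct for it to pass. A minimal sketch in Python (the `discount` and `invoice_total` functions are invented purely for illustration, not taken from J.B.'s articles):

```python
# A focused test: its result depends on one piece of behavior.
def discount(rate, amount):
    """Apply a fractional discount to an amount."""
    return amount * (1 - rate)

def test_discount_applies_rate():
    assert discount(0.25, 100.0) == 75.0  # depends only on discount()

# An "integration test" by the definition above: its result depends on
# the correctness of BOTH discount() and invoice_total().
def invoice_total(line_amounts, rate):
    return sum(discount(rate, a) for a in line_amounts)

def test_invoice_total_with_discount():
    # Fails if either discount() or the summing logic is wrong,
    # so a failure does not point at a single culprit.
    assert invoice_total([100.0, 60.0], 0.25) == 120.0

test_discount_applies_rate()
test_invoice_total_with_discount()
```

The second test is not "bad" by itself; the point of the definition is that when it fails, more than one implementation is suspect.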

He then continues by presenting one explanation of why programmers often end up with [too] many integration tests. He paints the scenario of a programmer (or team) finding a particular defect that seems testable only via an integration test, and then concluding they "better write integration tests everywhere" to guard against other such defects.

Establishing his stance that acting on such a conclusion is a "really bad idea", J.B. goes on to take the reader through a relatively rigorous, mathematically-based examination of what it might take to thoroughly test a medium-sized web application using integration tests. Using these numbers ("At least 10,000 [tests]. Maybe a million. One million."), he describes how much of such a project's time is lost writing and running these tests, and how most teams ultimately react to this in a progressively destructive fashion.
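The combinatorial arithmetic behind numbers like those is easy to reproduce: if a request flows through several layers and each layer has a handful of independent branches, the number of end-to-end paths is the product of the branch counts. A sketch with illustrative branch counts (these are assumptions for demonstration, not J.B.'s exact figures):

```python
import math

# Hypothetical branch counts for each layer a request passes through:
# controller, validation, service, data access, rendering.
branches_per_layer = [5, 10, 4, 5, 10]

# Each end-to-end path picks one branch per layer, so paths multiply.
end_to_end_paths = math.prod(branches_per_layer)
print(end_to_end_paths)  # 10000 integration tests just to cover each path once
```

Doubling the branches in just one layer doubles the whole suite, which is why the count climbs toward "maybe a million" so quickly.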

Soon after, J.B. followed up with Part 2 in the series, Some Hidden Costs of Integration Tests, in which he tells an entertaining "Tale of Two Test Suites" to make his point. In this tale, one test suite (presumably composed primarily of focused object tests) takes 6 seconds to run and the other (presumably composed more heavily of integration tests) takes a full minute.

The programmer(s) working with the 6-second suite waste no time between hitting the "run" button and knowing the outcome of their changes:

What do you do for 6 seconds? You predict the outcome of the test run: they will all pass, or the new test will fail because you’ve just written it, or the new test might pass because you think you wrote too much code to pass a test 10 minutes ago. In that span of time, you have your result: the tests all pass, so now you refactor.

In contrast, the programmer(s) working with the 1-minute test suite embark on a fractal-like journey of ever-compounding distractions after hitting "run". In this case, a serious combination of costs is incurred:

I need to point out the dual cost here. The first, we can easily see and measure: the time we spend waiting for the tests plus the time the computer waits for us, because we find it hard to stare at the test runner for 60 seconds and react to it immediately after it finishes. I don’t care much about that cost. I care about the visible but highly unquantifiable cost of losing focus.
...
When I write a [TDD-driven focused object] test, I clarify my immediate goal, focus on making it pass, then focus on integrating that work more appropriately into the design. I get to do this in short cycles that demand sustained focus and allow brief recovery. This cycle of focus and recovery builds rhythm and this rhythm builds momentum. This helps lead to the commonly-cited and powerful state of flow. A 6-second test run provides a moment to recover from exertion; whereas a 1-minute test run disrupts flow.

Tying back to the conclusion of Part 1 of the series, where the hypothetical programmers with the integration-based test suite choose to worry only about "the most important [x]% of the tests", J.B. closes Part 2 by weighing the options available to those who now find themselves in the "1-minute test suite" world.

Both articles contain significantly more detail (in the posts themselves and the related comments) than this brief summary, and are well worth reading in full. Additionally, these appear to be only the start of an interesting ongoing series of posts by Rainsberger. Be encouraged to follow along and add your wooden nickel to the discussion.

That's the main difference between integration and unit tests right there, beyond the conceptual and ideological differences: unit tests are fast, and should be run as often as humanly possible, while integration tests are slow, and should be run rarely.

If you run integration tests after every change, you're doing it wrong. Unit tests are almost an extension of the compiler. They're your first line of defense. Integration tests are the LAST line of defense. You still want both.

It's not uncommon for good integration test suites that really hammer your application to take HOURS to run. That's fine. Run them at night with continuous integration jobs. They'll save QA time, not development time (the latter is what unit tests are for).

Exactly, you run your unit tests during a commit build and run your integration tests during the integration or nightly build.

Also, at least in some cases, the same battery of tests can serve as both unit tests and integration tests, based on configuration. For example, if you run your tests against mocks (a mock domain model, mock service, etc.), the run is fast and the test is limited to the concrete class or classes. However, if you run the same test against the real persistent domain or real service, it is now an integration test. And you save some of the time needed to write and maintain separate integration tests. Of course, this is not always possible, but my team and I often look for the opportunity.
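A sketch of that configuration trick in Python (the repository classes here are invented for illustration): the same test body runs once against an in-memory fake, making it a fast unit test, and once against a real persistence layer, making it an integration test.

```python
import sqlite3

class InMemoryUserRepository:
    """Fast fake: the suite is a unit test when run against this."""
    def __init__(self):
        self._users = {}
    def save(self, user_id, name):
        self._users[user_id] = name
    def find(self, user_id):
        return self._users.get(user_id)

class SqliteUserRepository:
    """Real persistence: the same suite becomes an integration test."""
    def __init__(self):
        self._db = sqlite3.connect(":memory:")
        self._db.execute("CREATE TABLE users (id TEXT PRIMARY KEY, name TEXT)")
    def save(self, user_id, name):
        self._db.execute("INSERT OR REPLACE INTO users VALUES (?, ?)",
                         (user_id, name))
    def find(self, user_id):
        row = self._db.execute("SELECT name FROM users WHERE id = ?",
                               (user_id,)).fetchone()
        return row[0] if row else None

def check_repository_contract(repo):
    """One test body, two configurations."""
    repo.save("42", "Ada")
    assert repo.find("42") == "Ada"
    assert repo.find("missing") is None

check_repository_contract(InMemoryUserRepository())  # fast unit run
check_repository_contract(SqliteUserRepository())    # slower integration run
```

In a real project the choice of implementation would come from configuration (a profile, environment variable, or dependency-injection wiring) rather than two explicit calls.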

Sure you could. The main point is that unit tests will run all the time, on each developer machine, sometimes even after every major recompilation (or after every compilation, period, if the dev machines are beefy enough), while integration tests will run only infrequently on dev machines, and usually run through continuous _integration_ on dedicated machines. You'll run them as often as possible, of course, but the point is that it's not going to be every 10 minutes.

How else would you test a system end-to-end? Unit tests can only cover the individual pieces, without any interaction with other parties. To have confidence that all of the pieces will work together, there have to be tests that exercise that. Of course you could do it manually, but that would take even more time and resources, and be more error-prone.

I thought the point of writing tests was to find and prevent bugs, not writing tests so that they run frequently? The latter is only great to have if the former is satisfied.

Nearly all the bugs I find in software I write originate in the integration between third-party libraries, services, and layers in the architecture; or they are performance or concurrency issues. To me, the 'didn't iterate a loop properly', 'forgot to update a counter', or 'I refactored (inlined/extracted/moved) this method and now it is broken' bugs are mythical beasts - I just don't see them. If I know the implementation is bound to be somewhat complex, I'll add test cases every which way. If it is simple, I won't - and save the effort for a better test. The deciding factors are Fear and Uncertainty.

What I always fear is Integration. There's practically always some unspecified, unclear aspect of someone else's code or service, or my own if it's a different component/layer/service/library. Exercising that reasonably often is more likely to uncover it.

Yeah.. ain't it typical. Yet another software "philosopher" telling ya how it is. Sadly, ending up looking like a clown. These self-proclaimed "architects" are a plague on our industry. If you are a prophet - make sure first.

While it's good to question certain approaches, like integration testing, I must say I have found them an excellent addition to my existing use of unit testing.

I also must say I found your initial use of the word, scam, as pretty unfair in your first article (www.jbrains.ca/permalink/239). While it's a great word to get some notice, it doesn't fairly reflect the possible benefits one can get from integration tests, as evidenced by the replies to your article.

I too have been in the position where integration tests can take excessive amounts of time, and so need review and change. However, the trade-off here (i.e. the time cost to run, and the cost to develop the integration tests in the first place) is fewer errors when running in QA, as mentioned by Francois Ward above.

To conclude, unit and integration tests serve two types of needs. Unit tests, for TDD: fast to run, quick feedback, a development practice (i.e. as described by Mendelt Siebenga in www.jbrains.ca/permalink/242). Integration tests, to test the whole stack (including the application container and db); I would recommend mainly using them to test the user-facing application (i.e. a fat GUI, web application, or web service). Yes, integration tests take longer, but anyone who has used integration tests knows that.

Let us use the tools and approaches that we feel are useful given our needs. And I would suggest not to use an inflammatory word like scam, as it is not a suitable word to be used in a 'balanced' debate on software testing. I worry that users that have not used integration tests up to now might read your article, and take your approach and criticisms as fact (i.e. not read the comments to see the arguments for integration testing).

To conclude, it's good to question things, but I'll agree to disagree with your take on this, J.B., and continue to use integration testing; for even if it adds time to the development/CI process, I have found time and time again that it results in fewer errors found when code goes to QA.

"How else would you test a system end-to-end? Unit tests can only cover the individual pieces, without any interaction with other parties. To have confidence that all of the pieces will work together, there have to be tests that exercise that."

Wrong. I have confidence that the pieces I write will work together, even without putting the pieces together. Stay tuned to the article series to learn how.

"I thought the point of writing tests was to find and prevent bugs, not writing tests so that they run frequently? The latter is only great to have if the former is satisfied."

You've set up a false dichotomy. I do both.

"What I always fear is Integration. There's practically always some unspecified, unclear aspect of someone else's code or service, or my own if it's a different component/layer/service/library. Exercising that reasonably often is more likely to uncover it."

I only fear integration when I don't understand the behavior of the thing with which I'm integrating. I can write tests to clarify that understanding and test my assumptions, then stop worrying about that, all without actually integrating with that thing.
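Those "tests to clarify understanding" are often called learning tests: small checks that pin down your assumptions about someone else's code without wiring it into your own system. A sketch against Python's standard library, standing in for any third-party dependency you might fear:

```python
import json

# Learning test: what happens to integer dict keys on a JSON round trip?
# Assumption under test: they come back as strings, because JSON object
# keys are always strings.
def test_json_stringifies_integer_keys():
    restored = json.loads(json.dumps({1: "one"}))
    assert restored == {"1": "one"}  # the key came back as a string

test_json_stringifies_integer_keys()
# With that behavior pinned down, code that integrates with json can
# normalize keys deliberately instead of discovering this in production.
```

If the library ever changes behavior under an upgrade, the learning test fails and points straight at the changed assumption.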

"Yeah.. ain't it typical. Yet another software "philosopher" telling ya how it is. Sadly, ending up looking like a clown. These self-proclaimed "architects" are a plague on our industry. If you are a prophet - make sure first."

I see. So the fact that I have done this on successful projects has no bearing? Strange.

Incidentally, if you can find one place where I proclaimed myself an architect, I'll e-mail you $500.

"While it's good to question certain approaches, like integration testing, I must say I have found them an excellent addition to my existing use of unit testing."

I have never claimed otherwise.

"I also must say I found your initial use of the word, scam, as pretty unfair in your first article (www.jbrains.ca/permalink/239). While it's a great word to get some notice, it doesn't fairly reflect the possible benefits one can get from integration tests, as evidenced by the replies to your article."

You are free to find the word unfair. I find it fair and apt, based on the number of teams I've observed drowning under the force of the integration tests they've written. I didn't choose "scam" for shock value. I believe integration tests are, in fact, a scam.

"I too have been in the position where integration tests can take excessive amounts of time, and so need review and change. However, the trade-off here (i.e. the time cost to run, and the cost to develop the integration tests in the first place) is fewer errors when running in QA, as mentioned by Francois Ward above."

Just because integration tests can help reduce time in testing doesn't mean that programmers should rely on integration tests to show the basic correctness of their code. They do that, and when they do, they waste valuable time and effort.

"To conclude, unit and integration tests serve two types of needs."

I agree and have never claimed otherwise.

"Integration tests, to test the whole stack (including the application container and db); I would recommend mainly using them to test the user-facing application (i.e. a fat GUI, web application, or web service). Yes, integration tests take longer, but anyone who has used integration tests knows that."

Just because we know that integration tests take longer doesn't mean programmers should write them to show basic correctness of their code.

"Let us use the tools and approaches that we feel are useful given our needs."

I do, and I imagine you do. I believe we are free to have different opinions on that.

"And I would suggest not to use an inflammatory word like scam, as it is not a suitable word to be used in a 'balanced' debate on software testing. I worry that users that have not used integration tests up to now might read your article, and take your approach and criticisms as fact (i.e. not read the comments to see the arguments for integration testing)."

I appreciate your suggestion, but I decline it. I find the word apt, and therefore suitable.

I hope that people who haven't already become addicted to integration tests read my articles and avoid the addiction in the first place.

"To conclude, it's good to question things, but I'll agree to disagree with your take on this, J.B., and continue to use integration testing; for even if it adds time to the development/CI process, I have found time and time again that it results in fewer errors found when code goes to QA."

Naturally, you have the right to do what makes you happy and productive, just as I have the right to claim that you're wasting valuable time. I believe I can justify my claim, and when I'm not busy traveling, I'll have more energy to write more on the subject.

"It is not just a matter of frequency, it is a matter of usefulness. It is easy to say "do not write integration tests, they are a scam" and then not offer an alternative solution."

Let me explain why I haven't divulged my solution yet.

In 2003-2004, I wrote "JUnit Recipes". I intended to write a 400-page book, and instead wrote over 700 pages (Scott Stirling wrote the rest, and did a great job, too). I know the reasons why I did this, and I swore I would write more concisely in the future.

I have set for myself the challenge of writing this article series in short bursts so as to keep my writing focused and concise. As a result, I avoided writing everything I have to say on the topic at once. That would have run to around 50k words, and no one would have taken the time to read that.

I offer an alternative solution in my tutorial "Integration Tests Are A Scam", running again at the Agile 2009 conference this summer, as well as in my public and private (in-house) courses on test-driven development and techniques for promoting modular design. I have started to try to offer my alternative solution in this series of articles, and since you're reading them free of charge, perhaps you could extend me the courtesy of some patience.
