How can I motivate coworkers to write unit tests?

Making a dreary but important task commonplace will require some creativity.

This Q&A is part of a weekly series of posts highlighting common questions encountered by technophiles and answered by users at Stack Exchange, a free, community-powered network of 90+ Q&A sites.

lurkerbelow is the only developer at his company writing unit tests. Management, developers, everyone says they want to write unit tests, but nobody does. To bring developers into line, lurkerbelow has introduced pre-commit code review (Gerrit) and continuous integration (Jenkins). Not working. "How do I motivate my fellow coworkers to write unit tests?" he asks.

Practical demonstrations help

I noticed that talking about TDD hardly works. People like to see raw results. Saying that "writing tests will reduce development time" is most likely true, but it might not be enough to convince anybody.

I was in a similar position (well, not as bad as yours), and it kind of resolved itself when people started working on my code (note: my code was unit tested; others', not so much). When something stopped working, the natural follow-up after local investigation was to ask me what the reason could be. Then we sat down, ran the unit tests, and saw what happened. If the tests were passing, most of the time the problem was in the new, untested code. If not, the tests were usually able to spot the problem (or at least point us in the right direction). We fixed the bug, the tests were green again, and everybody was happy.

Long story short, a few situations like this transpired and 2 more developers became TDD/testing enthusiasts (there are still a few more to go, but it looks promising).

As for advice, you can give a TDD kata a go: a simple task to implement using a test-first approach as opposed to no tests. Depending on how complex the task is, the no-test approach should usually be slower, especially with incremental required changes.
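
A common kata for this is FizzBuzz. Here is a minimal sketch in Python with the standard unittest module (the code is illustrative, not from the original question): the tests are written first to define the contract, then just enough implementation to make them pass.

```python
import unittest

# Step 1: write the tests first; they define the contract
# before any implementation exists.
class FizzBuzzTest(unittest.TestCase):
    def test_plain_numbers_pass_through(self):
        self.assertEqual(fizzbuzz(1), "1")
        self.assertEqual(fizzbuzz(2), "2")

    def test_multiples_of_three(self):
        self.assertEqual(fizzbuzz(3), "Fizz")

    def test_multiples_of_five(self):
        self.assertEqual(fizzbuzz(5), "Buzz")

    def test_multiples_of_both(self):
        self.assertEqual(fizzbuzz(15), "FizzBuzz")

# Step 2: write only as much implementation as the tests demand.
def fizzbuzz(n):
    if n % 15 == 0:
        return "FizzBuzz"
    if n % 3 == 0:
        return "Fizz"
    if n % 5 == 0:
        return "Buzz"
    return str(n)
```

Run it with `python -m unittest`; watching the suite go from red to green in small steps is the point of the exercise.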

Get to the bottom of it

Tight dev schedules are often the reason, but you say you don't have that.

Next reason is that with a large existing untested code base, writing tests probably seems overwhelming, a never ending job (like laundry and about as exciting). So human nature is to think, "that's too much to face. I'll skip it."

Another reason could be that while they think tests are a good idea, they are not confident about how to start writing them, especially if they have never worked anywhere that wrote tests.

Another strong possibility is that they really don't see any value in additional work, even though they are giving lip service to the idea.

So how do you handle the different reasons?

Reason 1 is simple: show them an example of how it saves development time.

Reason 2: show them how many tests you have written in a year and what percentage of the code base it covers. Do the math to show how many more tests they could have at this time next year if they do this. Once they see that little bits of progress on a daily basis really add up, the whole idea isn't so overwhelming. If you can, pull the data out of the system, show them how many bugs were repeat bugs in the untested parts of the code and how many repeat bugs appear in the code with unit tests.

Reason 3: training, not just showing. Make them write tests in a training class.

Reason 4: this is the crux of the matter. First, pick a pain point, one of those bugs that has returned multiple times. When that comes in, this is the time to suggest to management that if they had unit tests on this item, it wouldn't keep coming back like a bad penny.
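
A regression test for such a pain point can be tiny. A hedged Python sketch (the function and the recurring bug are hypothetical, purely for illustration):

```python
import unittest

# Hypothetical recurring bug: order_total() used to crash on an empty cart,
# and the bug kept reappearing after unrelated changes.
def order_total(prices, discount=0.0):
    if not prices:  # the fix that the regression test pins down
        return 0.0
    subtotal = sum(prices)
    return round(subtotal * (1 - discount), 2)

class OrderTotalRegressionTest(unittest.TestCase):
    def test_empty_cart_totals_to_zero(self):
        # The exact case that kept coming back; if anyone reintroduces
        # the bug, this fails at commit time instead of in production.
        self.assertEqual(order_total([]), 0.0)

    def test_discount_is_applied(self):
        self.assertEqual(order_total([10.0, 20.0], discount=0.1), 27.0)
```

Once the test exists, the bug can only come back by getting past the build, which is exactly the argument to put in front of management.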

Another way to address reason 4 is to have management make it part of the process and mandate that code doesn't pass code review unless the tests also pass code review. It's best to approach them about making this a policy right after one of those pain points, or preferably right after you have had several in a matter of days.

We all like to think that as developers we self-manage (LOL), but the ambitious will care about what management emphasizes. And the professionals who really do self-manage are already writing the tests. If you can't get management to care about best practices, then you will be fighting an uphill battle all the way, and you might want to assess whether this is the right corporate culture for you.

Something for (almost) nothing

I would start by demonstrating the benefits of TDD and showcasing what unit testing buys you.

As normal human beings, developers are motivated by benefits. They don't want to do things that just create more work. Unit testing means less work. It means going out with friends more. It means having more fun because you don't have to spend every night coding till 11pm. It means going on vacations more with a peace of mind.

One of the biggest benefits of TDD is that you can refactor your program to a better design or just change the name of something ... and as long as that design doesn't break the tests, you can have 100 percent confidence that your change didn't break anything.

Another great case for TDD is creating unit tests for legacy code. This would represent one of the best ways to start refactoring out the evil. In the long run, this will serve to improve your knowledge of the code base, understand its strengths and flaws, spot hard-coded business logic in the code and give you a good start to improve the quality moving forward!
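
A common way to begin on legacy code is with characterization tests: record what the code currently does, right or wrong, and pin that behavior down before touching anything. A minimal Python sketch (the legacy function is invented for illustration):

```python
import unittest

# Imaginary legacy pricing rule nobody fully understands anymore.
def legacy_price_code(quantity, category):
    if category == "B":
        return quantity * 9 + (3 if quantity > 10 else 0)
    return quantity * 10

class CharacterizationTest(unittest.TestCase):
    """Pin down observed behavior first; refactor only once these pass."""

    def test_recorded_outputs(self):
        # Expected values recorded from the running system, bugs and all.
        recorded = {(1, "A"): 10, (5, "B"): 45, (11, "B"): 102}
        for args, expected in recorded.items():
            self.assertEqual(legacy_price_code(*args), expected)
```

The point is not that the recorded numbers are correct, only that any refactoring that changes them is now visible immediately.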

60 Reader Comments

You would have to do a bit of educating, complete with real-world examples using your existing codebase to hit home.

In college we had to do unit tests, but none of them were ever actually good for anything. Two years of writing Java code, and not a single instructor ever spoke to me about how to use them effectively. Aside from performance, I don't know how to test for anything that an exception wouldn't already tell me.

I'm not saying this is OK - the first chance I get with a project, I'm going to figure this out for myself - but I am saying that the people who trained me showed me how to do it more as a formality than as something that has real purpose. I suspect that most people are in the same boat.

When I break your code at 2:00am on a Sunday trying to deploy my new feature, would you rather have a unit test that caught it before I checked in, or get woken up by Ops asking why your stuff isn't working all of a sudden?

Many programming problems in real life can be helped by breaking things down into simpler steps until the steps are understandable to the recipient -- either a coworker or the programming language of choice. However, I feel that most unit testing proponents seem to lack that breakdown. "Just do it" may be appropriate as an advertising slogan and sound fun to say. But, to a novice with little to no *practical* experience (as Robo_Leader says), "just do it" sounds like "here's a unicycle to ride on a tightrope over the Grand Canyon. While juggling chainsaws. Just do it." Even saying "it prevents bugs" makes the "no silver bullet" chapter in The Mythical Man-Month pop into my head.

A realistic approach for testing fanatics is to take a step back and help break down *all* of what they do into smaller steps until others can understand. If your audience can't understand what you're saying, then there is a failure to communicate. Merely pulling the "or I'll fire you" card doesn't actually communicate concrete programming steps -- which is what you want your developers to actually complete.

If you have the power to fire them, exercise it. Otherwise, report it to management and let them handle it.

Note: edited because this guy isn't actually the boss, like I assumed.

Start putting their heads on spikes, that'll teach 'em.

So it's wrong of me to expect them to do their jobs?

"Their jobs" are what management incentivizes them to do.

If management tells them to do unit tests, but doesn't recognize unit test writing as productivity during performance reviews, they're not going to write unit tests.

Management must buy into quality being part of productivity. That's easier said than done. Non-technical management can only judge by what they see, and that tends to mean visible features get rewarded and prioritized.

A few questions you should probably ask: Does everyone who wants unit tests understand that they will cause projects to run longer, especially at the start?

It's all very well saying management wants unit tests, but if they are also pushing developers to get things finished and give no extra time for unit testing, then it is pointless to try to introduce it.

Why do you want to unit test?

It seems like a simple question, but if you want to introduce it because it is what all the cool kids are doing and you have a book that tells you all about it then it is a bad idea. Maybe just introduce it gradually until you fully understand the implications. Trying to get everyone to do it at once can lead to disaster.

Management must buy into quality being part of productivity. That's easier said than done. Non-technical management can only judge by what they see, and that tends to mean visible features get rewarded and prioritized.

It's still easier said than done, but if management is on board, they should tie department budgets to QA time for a specific application or set of code. Most code, if written well with good unit tests, should have fewer defects. This will give the developers and managers incentive to ensure they are written.

But the biggest factor I think needs to be addressed is the willpower of management. Even if they agree with it, they are the ones with the authority to require it. If they aren't saying it's a requirement, it's not getting done.

Like anything else, unit testing and TDD are techniques at your disposal. They aren't things to shove down your coworkers' throats. If they help you be more productive than the others, the ones that care will start asking you how you do it and then mimic your success.

Some programmers aren't that great at test writing. Even if you get them to write them, they'll probably be crappy tests that test all the obvious things, but completely fail at testing edge cases and special conditions, until after those are exposed by a bug. Unit tests are only a tiny fraction of all "necessary" tests for production software anyway.
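
The gap between obvious tests and useful tests is easy to show. A small Python sketch (the `chunk` helper is invented for illustration):

```python
import unittest

def chunk(items, size):
    """Split a list into consecutive chunks of at most `size` items."""
    if size <= 0:
        raise ValueError("size must be positive")
    return [items[i:i + size] for i in range(0, len(items), size)]

class ObviousTests(unittest.TestCase):
    # The kind of test a reluctant developer writes to tick the box.
    def test_even_split(self):
        self.assertEqual(chunk([1, 2, 3, 4], 2), [[1, 2], [3, 4]])

class EdgeCaseTests(unittest.TestCase):
    # The cases weak suites skip - and where the bugs actually live.
    def test_empty_input(self):
        self.assertEqual(chunk([], 3), [])

    def test_uneven_final_chunk(self):
        self.assertEqual(chunk([1, 2, 3], 2), [[1, 2], [3]])

    def test_rejects_nonpositive_size(self):
        with self.assertRaises(ValueError):
            chunk([1], 0)
```

A suite containing only the first class passes just as green as one containing both; the difference only shows up when the edge cases hit production.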

Larger software companies hire developers specifically geared toward test. They write and run unit tests, set up build systems with proper reporting, construct testing tools for scenarios where simple unit tests don't work, and generally both enjoy and are good at devising adequate tests. You need some of these people. Microsoft (and a number of companies following their lead) calls these people SDETs (Software Development Engineers in Test), which is a term to consider putting on any job postings you make to increase visibility to the right people.

Finding someone who is good at that kind of thorough testing, designing maintainable architecture, optimizing for end-user needs, devising algorithms, fixing complicated bugs without breaking everything else, having a good knowledge base and set of resources to avoid wasting time and money, getting this all done on schedule, and is a pleasure to work with is certainly possible, but not easy, or cheap; these devs are worth a lot and know it. You're not going to have a team of people like this. Instead of trying to pretend you'll have a team of perfect devs, assume you won't and aim for people with the necessary specialized skills. A varied team is easier to work with than a group of so-called "super devs" anyway. Especially once you get the process down.

You can certainly require unit tests for new modules during code review, but that's about it. Again, they'll likely be crappy tests. You're better off requiring documentation on new modules (or at least functional specifications) so the SDETs have it for devising proper tests than requiring incomplete, fallible tests on otherwise undocumented modules.

I didn't even read the article. I just wanted to immediately put in my 8 cents. Here's a good way to motivate people to write unit tests: challenge them all to a contest to see who can break the software. Award 1st through 3rd place. Black and gray hats on the team will probably love this idea. Seriously, who doesn't like to break shit? "Hey, buddy, here's our internal API. Do whatever you can to break it using only the API methods." Then you'll probably have the entire engineering team sign up.

- Management buy-in on unit testing.
- Scheduling takes into account time to write tests.
- Code is structured to allow use of mocks.
- Code is broken up to allow testing of just one object.

Without this it is just pointless.

It's still worth it if you're not set up to use mocks. 50% code coverage is infinitely better than 0% code coverage. Picking up legacy code, there's never going to be time (or buy-in from management) to bring it from 0 to 100% code coverage by mocking out dependencies and such. Unit testing up to 50% code coverage will probably catch 90% of the bugs. Then you just keep adding unit tests for regressions.

For new code, definitely keep mocks in mind at the design phase.
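
"Keeping mocks in mind at the design phase" mostly means injecting dependencies instead of hard-wiring them. A minimal Python sketch with the standard unittest.mock module (all names are illustrative):

```python
import unittest
from unittest.mock import Mock

class OrderService:
    def __init__(self, notifier):
        # The notifier is injected rather than constructed here, so a test
        # can substitute a mock instead of a real mail or SMS gateway.
        self.notifier = notifier

    def place(self, order_id):
        # ... persist the order here ...
        self.notifier.send(f"order {order_id} placed")
        return order_id

class OrderServiceTest(unittest.TestCase):
    def test_notifies_when_order_is_placed(self):
        notifier = Mock()
        service = OrderService(notifier)
        service.place(42)
        notifier.send.assert_called_once_with("order 42 placed")
```

The same design that makes the class mockable also keeps it decoupled, which is why testability and good structure tend to arrive together.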

People who don't write unit tests usually have some Perl script or *shudder* .bat script to drive their code anyway. Just tell them to use the unit test tools instead and gradually ratchet it up to higher and higher test coverage.

When I break your code at 2:00am on a Sunday trying to deploy my new feature, would you rather have a unit test that caught it before I checked in, or get woken up by Ops asking why your stuff isn't working all of a sudden?

Why would you break someone's code by adding your feature?

Aren't YOU supposed to write good code and to test your changes locally before checking in?

Furthermore, isn't there any check-in policy and code review process in your company?

Finally, don't you know that potentially breaking changes are not done at 2:00am on Sunday or on a Friday afternoon?

Your people refuse to do the job they were hired to do. The answer is simple: blisteringly bad reviews all around, followed by terminations. Keep firing staff until they bother to do their job or until all non-performers have been replaced.

I do find the proponents of unit testing are incredibly immature, both in their arguments and in the manner in which they're put.

"your code breaks when I deploy my feature at 1sm sunday"... unbelievable! Code gets deployed at a more sensible time, to a QA or CT or Staging area well before deploy to Live, and if your feature breaks anything, its your fault - you do not work in isolation from the rest of the team (or if you do, no wonder "everyone else's" code breaks....)

"give bad reviews then fire them".... amazing. first, if its not part of the job as laid down by management and properly communicated to the staff, and training provided, then firing will only be a massive fine at the employment tribunal for unfair dismissal. That's the best case, worst case, management fires everyone and outsources all development to Elbonia.

It all adds up to the fact that unit testing is just the latest thing the "cool kids" are doing/want to do. Professional developers have been testing their code for decades - or did you think code written years ago could never work? - and unit testing in its current form was never used. Sure, coders of old used test rigs, tested chunks of code in isolation with their own mini programs, or just embedded test assertions in the code when built in debug. These methods worked fine, making unit testing as we know it today a bit redundant - i.e., there are better ways of testing that do not require a unit test framework and an artificial approach to designing code that requires patterns such as IoC and mocking-friendly interfaces to work.

Personally, I find that unit testing today is generally corrupted to fit with a framework's view of how to make it work. All tests seem to be geared around auto-generated stubs on a per-method basis (you see this in some questions that ask whether you need to unit test private methods too). This isn't the original intention of unit testing, which tested units of code, i.e., anything that can be considered in isolation, like a class or a whole component of the overall program. This approach allows you to write your tests first and then use them to verify that what you write fits the specification - you cannot do that if you're testing each method; that's far too tied to the implementation. I find that what we should be doing is ignoring the TDD tooling and going straight to BDD tools, which are much more in keeping with what unit testing was all about.

Put it this way: if you can use your "unit tests" to create documentation on your code, then you're doing it right. If you have a bunch of methods that test a tiny specific feature without any relation to the rest of the component, then you're doing it wrong.
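
For what it's worth, a small Python sketch of that style: test names written so the test list reads as documentation of the component's behavior (the cart class is invented for illustration):

```python
import unittest

class ShoppingCart:
    def __init__(self):
        self._items = []

    def add(self, name, price):
        if price < 0:
            raise ValueError("price cannot be negative")
        self._items.append((name, price))

    def total(self):
        return sum(price for _, price in self._items)

# Reading the method names top to bottom describes the component.
class ShoppingCartBehavior(unittest.TestCase):
    def test_a_new_cart_totals_to_zero(self):
        self.assertEqual(ShoppingCart().total(), 0)

    def test_totals_accumulate_across_added_items(self):
        cart = ShoppingCart()
        cart.add("book", 12)
        cart.add("pen", 3)
        self.assertEqual(cart.total(), 15)

    def test_negative_prices_are_rejected(self):
        with self.assertRaises(ValueError):
            ShoppingCart().add("refund", -5)
```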

The problem I have seen with test driven development is that it needs the thinking to be done beforehand, which is vastly different from how many people actually program.

In theory, you should have an abstract idea of what a class does. Then you decide how the interface looks and write a test for the not yet existing code to make sure the class performs as it should from the outside view. Then you actually implement the class.

In practice, people develop the structure of their code largely while they write and try, not separating the actual implementation from the more abstract semantics of their code. If they write a unit test beforehand, they will have to alter it anyway after finishing their implementation, because they didn't think of many things beforehand. So they will write the test second in the future. When they first implement and then write tests, they tend to write tests that reinforce the assumptions made in the code, overlooking the same edge cases and maybe additional semantics they also forgot to implement.

Then TDD doesn't work, and they're dissatisfied, seeing it as a waste of time.

I would mandate, for this and to motivate them, that they not write their own unit tests, but instead write unit tests for someone else's code. Give them the documentation for the foreign classes, not the code, and tell them to first use it normally and then try edge cases. If they have to do it so someone else can finish their work, because unit tests are now mandatory, they will have a stronger incentive, and you will have code that is far better tested.

I don't mind writing unit tests. But management views it as a waste of my time. So I don't write as many tests as I should, and the ones I do write are rarely adequate.

Management believes there's more value in launching earlier with broken features than launching later with extensive tests and a more reliable product.

I wonder what ivory-tower person gave the down vote. What was said is completely true. I've worked on two projects that used unit tests, and one of them stopped halfway through. Why? Because in the eyes of a "you have 6 weeks to do 6 months of work" business, testing and QA are profit drains.

But how much does it cost to fix those bugs later, you ask. Surely it ends up costing much more than getting it right the first time. To the logical mind, this is true. But to the MBA moron, who lives in Bizarro land, it's not. And the even stranger thing is, once reality proves the immense value of unit tests, among other sane practices, they *still* make the same short-sighted decisions time and time again.

Back on topic: if you work somewhere testing is actively valued, then some education is in order for the developers who don't see the value.

Dear lurkerbelow, bad news son. You. Are. Not. The. Boss. Get the hell over yourself, STFU and do your job already. Ultimately, yours is not a technical issue but a question of being socially/emotionally maladjusted. You feel personal dissatisfaction performing within the proper role on a team and an inability to deal with not having the desired level of authority in your workplace. You need a therapist, not advice from a tech forum.

Your approach is not working because your idea is viewed by everyone else in your company as, ultimately, stupid. Journalists at places like Ars Technica obsess over things like unit testing as a way to appear hip in a technical field they genuinely know little about - it's like something they remember from that one class they took with that one professor who seemed really, really smart. As such, everyone has to pretend it is oh-so serious business (and in some instances the approach genuinely does improve efficiency and generally kick ass) regardless of the reality of applicability in a given development situation.

So, your first mistake is confusing bowing to a social pressure to avoid the spittle-flecked tirades of a Unit-Testing fanboy with actually agreeing with you about the best approach. Back in the day, we'd have to do the same thing when the shop would get a new Linux fanboy jabbering on about how whatever it was was better on Linux; "Yes, yes ... you are so totally right. Linux rocks the universe. Go Grub! Now, can you get back to working on your .net component, thanks".

In a nutshell, nobody else in your company with a primary concern of creating software - most significantly, your bosses - genuinely sees enough value in doing what you want them to do that they'll do it. As an alternative to spinning your wheels, one could consider that when an entire building full of equally experienced - daresay in some cases perhaps moreso - professionals refuse to buy in to your approach, they very well could be on to something. But even if by some magic roll of the dice you are a fucking genius in a sea of incompetent fools ... being right doesn't matter one whit.

Here's the truth. You've been bitching at your coworkers and anyone else who comes within earshot about this bullshit for at least 9 months now. Any imagined productivity gain that may have been realized from unit testing was eclipsed long ago by the overhead of having to deal with your whining and trying to implement a who's who of increasingly asinine academic hoops to force coworkers to jump through - all in the name of efficiency. What started out as a great idea is now a clusterfuck and a disaster. If you don't shut the hell up and just let the team write some flipping code in peace for a bit, you won't have to ponder the idea of quitting from a terrible boss - the terrible boss is going to fire your ass.

I'm currently dealing with this issue. I'm not the QA manager; that is a different department. But I am QA liaison for the software engineering team. Part of the problem we have is that we use a variety of languages, C and Java, which is a problem because the unit test libraries are nothing alike, so they need to learn both.

What I am trying to do is convince them that the unit tests they write will be of immense value in the future. There is little to no value while the code is being developed unless you're pretty inept. The true value of a unit test is realized six months or more down the road, when someone else is modifying the code you wrote. Our developers pride themselves on bug-free-ness, and the presence of a unit test puts the blame on the other guy. Unit tests are defensive in nature. TDD/unit testing is taught as a way for the organization to gain a defense against developing quality issues in the future. We do it for the business entity. But that's why unit testing initiatives fail - it's never made personal to the programmer.

So what I would do is use a quality source control system, and when issues are found, fault is assigned. At salary/bonus/performance review time, have that as an item. Code breakages by others in code that you developed is the key thing to measure here.

As someone pushing TDD, you've got to develop a simple unit test system. I'm currently arguing for a Lua interpreter to be enabled in debug mode so we can code test scripts on our system. Of course, the Lua test code will eventually be automatically generated by the SCM, at least to some degree. It'll generate the prototypes for the functions modified, and the developer will just have to elaborate on what legal returns are for permutations of input.

My QA manager has a hard-on for this idea, but I think we'll have to hire someone to enumerate the unit test cases and work with the developers. We'll probably get an Indian to do it.

One of the biggest benefits of TDD is that you can refactor your program to a better design or just change the name of something ... and as long as that design doesn't break the tests, you can have 100 percent confidence that your change didn't break anything.

This is one of those statements that makes people wary of everything else a testing advocate says.

Tests give 100% confidence in exactly one thing: that the tests pass. That's it. They're a strong *indication* that things are working as intended, especially when the tests have been around a while, but there's no way, outside of a truly simplistic system, that your tests are 100% exhaustive.

I've definitely become a testing advocate over the last couple years (although I still generally avoid unit tests in favor of larger integration tests), but these sorts of demonstrably wrong statements meant that I avoided listening to other test advocates for quite a long while.
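
The gap between "the tests pass" and "the code is correct" is easy to demonstrate with a contrived Python example:

```python
# A passing suite only proves the cases it covers. This function passes
# both of its tests, yet still fails on input the author never tried.
def average(values):
    return sum(values) / len(values)  # crashes on an empty list

# Both tests pass...
assert average([2, 4]) == 3
assert average([5]) == 5

# ...but average([]) raises ZeroDivisionError: green tests, broken code.
```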

Don't expect developers who don't write unit tests to start writing unit tests. If you want test-driven development, then you need to hire test-driven developers. Some developers are able and willing to assimilate when they join an existing team of test-driven developers, but one developer is not going to transform the rest of the team. That doesn't happen.

It all adds up to the fact that unit testing is just the latest thing the "cool kids" are doing/want to do. Professional developers have been testing their code for decades - or did you think code written years ago could never work? - and unit testing in its current form was never used. Sure, coders of old used test rigs, tested chunks of code in isolation with their own mini programs, or just embedded test assertions in the code when built in debug. These methods worked fine, making unit testing as we know it today a bit redundant - i.e., there are better ways of testing that do not require a unit test framework and an artificial approach to designing code that requires patterns such as IoC and mocking-friendly interfaces to work.

Personally, I find that unit testing today is generally corrupted to fit with a framework's view of how to make it work. All tests seem to be geared around auto-generated stubs on a per-method basis (you see this in some questions that ask whether you need to unit test private methods too). This isn't the original intention of unit testing, which tested units of code, i.e., anything that can be considered in isolation, like a class or a whole component of the overall program. This approach allows you to write your tests first and then use them to verify that what you write fits the specification - you cannot do that if you're testing each method; that's far too tied to the implementation. I find that what we should be doing is ignoring the TDD tooling and going straight to BDD tools, which are much more in keeping with what unit testing was all about.

Put it this way: if you can use your "unit tests" to create documentation on your code, then you're doing it right. If you have a bunch of methods that test a tiny specific feature without any relation to the rest of the component, then you're doing it wrong.

I just can't agree with this. Any methodology can be misused and produce bad results. If you force a methodology on a developer who doesn't want to follow it, I can see such results arising. However, when a developer devises a strong set of tests because s/he *cares* about maintaining it over the long haul in the face of major code changes, there's nothing but good things that come from that.

I had adopted the anti-unit-test mentality for the same reasons for years, and in the end I found that writing code that way simply doesn't scale for the kind of projects I work on. I expect a certain behavior of N lines of code written. Having to reverify that behavior takes too much work. The number of times I've inadvertently broken tests tells me there's tremendous value in the practice.

That said, I don't think anyone would claim that even a strong set of tests is a substitute for other forms of testing. But to say unit tests are useless is to say that a developer shouldn't write them, which I deem kind of silly to suggest.

What changed my perception about unit testing was realizing it's all about designing good code. But it requires you to write tests as you work on the design, not later. It's easy to skip writing unit tests if you believe you can postpone them, as if the benefits only arise when code breaks in the future. This year I've started to apply TDD (Test-Driven Development) and TFD (Test-First Development). Because I find immediate benefits in that my own code becomes better designed, I am not dependent on everyone on the team writing unit tests and having perfect code coverage at all times. I can just go ahead as an example and hope other people also see the short-term benefits I've learned to appreciate.

The "Fire them", or the "they are incompetent" comments are a bit over-board. If people aren't writing unit tests, it's probably because they are not being considered as part of the yearly review metrics. If management don't value tdd, and if senior developers don't value tdd - there isn't much to do.

There also are a lot of businesses where the timeline for releasing projects is so tight that taking even four to eight hours a week will cause the project to fall behind. If you held up the release because your code wasn't complete (but was in very good shape) versus other developers who have a finished product that's buggy - who ends up looking bad? Your manager will blame you - and that person usually is getting blamed for a poor release by the people above them. When a 200-hour project is pushed into 100-150 hours, you know what gets tossed out the window first? QA. Business/managers usually just want to have SOMETHING to hand off as done, regardless of whether it's done well. Showing nothing just looks worse than a crappy product.

I've been on both sides of this issue in several different industries.

The answer: you can't make or persuade your co-workers to do unit testing, especially in a hostile or disinterested corporate environment. It's simply against their interest to do so. If your company is dumb enough to think that testing is counterproductive, they'll find out in the worst possible way.

For example: most developers think that code commenting is a waste of their time; after all, they could be writing more code in that time, and besides, their code is fine and they understand it completely... Companies tend to go along with this (or aren't aware it is happening), and things are fine until the boy genius moves on to his next dream job and six months later someone else is trying to figure out how his code works because it's just crashed the entire system after the latest upgrade...

Unless and until your company management decides that unit testing is worthwhile, it simply won't happen.

If and when it does, it will be very simple: some manager with the power to enforce compliance will send an email, on some Monday morning, saying: "Drop what you are doing and create a testing program for your unit, have it on my desk by Friday."

Then the next Monday another email will appear: "Show me the results of your unit test, have it on my desk by Friday."

The "fire them" or "they are incompetent" comments are a bit over the top. If people aren't writing unit tests, it's probably because tests aren't considered part of the yearly review metrics. If management doesn't value TDD, and if senior developers don't value TDD, there isn't much you can do.

There are also a lot of businesses where the timeline for releasing projects is so tight that losing even four to eight hours a week will cause the project to fall behind. If you held up the release because your code wasn't complete (but was in very good shape) while other developers have a finished product that's buggy, who ends up looking bad? Your manager will blame you, and that person is usually the one getting blamed for a poor release by the people above them. When a 200-hour project is pushed into 100-150 hours, you know what gets tossed out the window first? QA. Businesses and managers usually just want to have SOMETHING to hand off as done, regardless of whether it's done well. Showing nothing looks worse than a crappy product.

A factor you aren't mentioning is that when someone has invested the money to fund development, typically there is a real-world need for whatever the funded software accomplishes. Typically, this real-world need directly relates to producing the revenue required for maintaining a viable business enterprise.

Software which is able to accomplish the needed revenue-producing functions of a company can be the difference between viability and bankruptcy. With revenue, a company can *always* hire a developer to improve code ... without revenue, the prettiest code in the universe is worthless. For anyone who steps outside the bubble, it seems like a no-brainer to get the money rolling. Sorry if a site tech has to get up at odd hours because their idiot coworker decided to push a broken update at 2 am on a Sunday ... but that's why the tech is getting paid. At the same time, the system being online and producing revenue is likely the exclusive reason there is the *money* to pay this tech at all.

Generally speaking, the person who sets up a contract isn't an idiot. Often, this person actually does know the business quite well. And when an account is handled by a professional who knows the business, the client will be well aware that they are making trade-offs. This is true both in private contracting and when the "client" is a department within a bigger organization that a development team services.

Figuring out this balance is WHY there are managers. It's a business question as much as it is a technology one.

I think the most annoying thing about the two-dozen or so times Ars has presented this exact question - with pretty much an identical premise - is that not once have they ever approached the question from a perspective that the interests of the business paying for development might be *the* primary consideration in how development should be carried out.

The way Ars has been approaching this (and similar) questions always adopts a position of inherent and deep disrespect for the needs of the client and for the decisions of management staff whose focus is, correctly, on servicing the client's needs.

Quality automated testing (with continuous builds) is the single best "silver bullet" I've learned that has improved the development process in 24 years of programming.

It sounds like you're going for a grassroots approach to adopting automated testing, instead of an edict from on high. Cool. Human psychology being what it is, if you make testing easy and routine, developers will start to use it all the time, learn to love it, and it will become their default "what do I do next" process for building code.

My suggestions:

- Make the test system super easy to run and get results out of. For example, if a test failure means logging into a different system to get logs, and then yet another system to view outputs, you should automate that so everything just arrives on the developer's machine (as stdout, a web page, or wizards within your IDE).

- If possible, make tests run fast enough so that the test run is part of the build command itself. Also, if some of your tests do run slow, I'd suggest creating test categories. This way you can have the "smoke test" set of tests that all run quickly enough and then you also have the "deep" test suite that takes longer to run but is more thorough.

- Provide one to three small "starter" tests that cover common execution flows. You obviously don't want people to cut and paste code (ever!), but small, concise example snippets mean someone new to the team can read them, use them as a starting template/idiom for how to test, and then the code becomes the "documentation truth" for the testing system.

- If you do stuff that has a web component, check out Selenium for automated driving of your browser.

- A note of caution: automated tests do take active work to maintain. Twice I've seen testing get a start in a group and then fall away because of the "gotta ship this now" effect, after some portion of the tests had fallen out of date (features always evolve and change!). Tests started failing, but the system itself was good, and no one wanted to figure out and fix the root of the failure. Actively tending to your tests and moving them forward is a pain, but it becomes soooo incredibly worth it IMO.

- I don't currently do this, because my own team has the religion, but ideally you'd also record stats. Numbers always make compelling arguments (especially to higher-ups and other teams). Record how many times a test failure illuminated a real error in the system (and at which stage: writing new code, merging code together, or due to system or library changes). These kinds of stats are very powerful. Now that I'm thinking about this, I'm going to build some addons to do just this, and I'll make them part of our TeamCity setup.
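To make the "starter test" suggestion above concrete, here is a minimal sketch using Python's standard unittest module; the function and all names are hypothetical, purely for illustration:

```python
import unittest

def normalize_username(raw):
    """Hypothetical app function: trim whitespace and lowercase a username."""
    return raw.strip().lower()

class TestNormalizeUsername(unittest.TestCase):
    """A small "starter" test a new team member can copy as an idiom."""

    def test_strips_and_lowercases(self):
        self.assertEqual(normalize_username("  Alice "), "alice")

    def test_empty_input_stays_empty(self):
        self.assertEqual(normalize_username(""), "")
```

A file like this runs with `python -m unittest` and needs no setup, which is the point: a fast "smoke" subset can run as part of the build itself, while a slower "deep" suite runs on the CI server (runners such as pytest also let you tag tests into categories with markers).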

Here's the truth. You've been bitching at your coworkers and anyone else who comes within earshot about this bullshit for at least 9 months now. Any imagined productivity gain that may have been realized from unit testing was eclipsed long ago by the overhead of having to deal with your whining and trying to implement a who's who of increasingly asinine academic hoops to force coworkers to jump through - all in the name of efficiency. What started out as a great idea is now a clusterfuck and a disaster. If you don't shut the hell up and just let the team write some flipping code in peace for a bit, you won't have to ponder the idea of quitting from a terrible boss - the terrible boss is going to fire your ass.

Quoted for truth.

I am really starting to hate testing evangelists with a passion -- some of them come across as religious zealots, trying to force their views and methods on everyone.

I am pretty sure the Microsoft Skype team uses TDD and Agile, if for no other reason than that they seem to be "in" at the moment. That didn't stop them from shipping broken Skype releases like 6.1, which kept running and using CPU after you selected Quit, or the one that crashed on Mac.

We are at the point where the skill of writing good, reliable code is made obsolete by batteries of tests of dubious value, run by a bunch of monkeys against the extremely bad and unreliable code they wrote, without even realising that if they cannot catch an error while writing the code, they won't be able to design a test that catches it either.

To me, that doesn't make any sense, especially if you take into account how code quality has actually deteriorated over the last decade instead of being improved by testing. Based on that, I dare say there is something wrong with the approach we have been taking.

One of the biggest benefits of TDD is that you can refactor your program to a better design, or just change the name of something ... and as long as that change doesn't break the tests, you can have 100 percent confidence that it didn't break anything.

This is one of those statements that makes people wary of everything else a testing advocate says.

Tests give 100% confidence in exactly one thing: that the tests pass. That's it. They're a strong *indication* that things are working as intended, especially when the tests have been around a while, but there's no way, outside of a truly simplistic system, that your tests are 100% exhaustive.

100% confidence that the tests pass is still better than 0% confidence that the code does anything other than compile.

Hell, even if the tests are little more than automated markers as to how each component is used by other components, they're still worth it. I don't know how many times I've seen a developer change some component functionality for perfectly sound reasons only to run the tests and realise that some other component depended on the existing functionality.

The other component might be *wrong* to depend on that, but at least now you know about it and can fix it.
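A tiny sketch of that situation (all names here are hypothetical): component B quietly depends on a behavior of component A, and the test is the only marker of that dependency.

```python
def parse_price(text):
    # Component A: by convention, returns the price as a float in dollars.
    return float(text.replace("$", ""))

def total_cents(prices):
    # Component B: relies on parse_price returning dollars, not cents.
    return round(sum(parse_price(p) for p in prices) * 100)

def test_total_cents_assumes_dollars():
    # If someone changes parse_price to return cents for "perfectly sound
    # reasons", this test fails and surfaces the hidden dependency.
    assert total_cents(["$1.50", "$2.25"]) == 375

test_total_cents_assumes_dollars()
```

The change to `parse_price` might even be the right one, but now it fails a test at build time instead of silently producing hundred-fold prices in production.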

Quite frankly, this is a solved problem. Unit testing is a Good Thing. If your colleagues and management can't see that, then move on and don't waste your time.

Tests give 100% confidence in exactly one thing: that the tests pass. That's it. They're a strong *indication* that things are working as intended, especially when the tests have been around a while, but there's no way, outside of a truly simplistic system, that your tests are 100% exhaustive.

100% confidence that the tests pass is still better than 0% confidence that the code does anything other than compile.

[...]

Quite frankly, this is a solved problem. Unit testing is a Good Thing. If your colleagues and management can't see that, then move on and don't waste your time.

I disagree.

If you have a team of beta testers who go through your product hunting bugs before it gets deployed, then you do not need unit tests. They will catch all of the bugs your unit tests would have caught *and* all the bugs your unit tests would have missed.

Unit tests are great for catching bugs earlier in the process but they also have a high development cost. Since time is a precious resource in the development world, you need to consider whether writing tests is worth it for your project. Personally, I like writing tests when they can be written simply, but don't bother with them for anything complex.