We're a Scrum team of 3 developers, 1 designer, a Scrum Master, and a Product Owner. However, we don't have an official tester on our team. A persistent problem for us is that testing the application, passing those tests, and removing bugs has been defined as one of the criteria for considering a PBI (Product Backlog Item) done.

But no matter how much we (3 developers and 1 designer) test the application and the implemented use cases, some bugs still go unseen and ruin our presentation to stakeholders (Murphy's law).

As a remedy, we recommended that the company hire a new tester: someone whose job would be testing, and testing only. An official, professional tester.

However, the Scrum Master and the stakeholders believe that a developer (or a designer) should also be a tester.

8 Answers

Ex ante: There seems to be a lot of confusion about what is regarded as testing and what is not. Sure, every developer needs to test their code as they create it; they need to verify it works. They can't hand it to a tester before they think it's done and good enough. But developers don't see everything; they might not recognize bugs. These bugs can only be found later in the development cycle, when thorough testing is conducted. The question is whether developers should conduct that kind of testing or not, and in my humble opinion this needs to be looked at from a project manager's point of view:

Developers can be testers, but they shouldn't be testers. Developers tend to unintentionally/unconsciously avoid using the application in a way that might break it. That's because they wrote it and mostly test it in the way it should be used.

A good tester, on the other hand, tries to torture the application. His/her primary intention is to break it. They often use the application in ways developers wouldn't have imagined. They're closer to the users than the developers and often have a different approach to testing a workflow.

Also, using developers as testers increases development costs and does not benefit the quality of the product as much as having a dedicated tester. I wouldn't let developers cross-test each other's work when I can have it done better by a tester for cheap. Only if the feedback loop between developers and testers became too expensive would I have developers cross-test each other's code, but in my experience that is rarely the case, and it highly depends on the process.

That does not mean a developer should be sloppy and leave everything to the tester. The software should be backed by unit tests, and technical errors should be reduced to a minimum before handing the software to the tester. Still, sometimes you have fix-here-break-there problems or other bugs that no developer could foresee; that's OK. Also, integration testing should be done mostly by the developers. The tester's main objective is to verify that the requirements are met.
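To make "backed by unit tests" concrete, here is a minimal sketch in Python; the function and its values are hypothetical, purely for illustration of the developer-side checks that should pass before a build reaches the tester:

```python
def invoice_total(amounts, discount=0.0):
    """Sum invoice line amounts and apply an optional fractional discount."""
    if not 0.0 <= discount <= 1.0:
        raise ValueError("discount must be between 0 and 1")
    return round(sum(amounts) * (1.0 - discount), 2)

def test_plain_sum():
    assert invoice_total([10.0, 5.5]) == 15.5

def test_discount_applied():
    assert invoice_total([100.0], discount=0.25) == 75.0

def test_rejects_bad_discount():
    # Technical error paths, at least, should be verified by the developer.
    try:
        invoice_total([10.0], discount=1.5)
    except ValueError:
        return
    raise AssertionError("expected ValueError for discount > 1")

if __name__ == "__main__":
    # Run the checks directly; a runner such as pytest would also
    # discover these test_* functions automatically.
    test_plain_sum()
    test_discount_applied()
    test_rejects_bad_discount()
```

Tests like these catch technical regressions cheaply; what they don't catch is whether the feature actually meets the requirements, which is the tester's territory.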

In such a small team (and also depending on the size of the application), I can also see the tester in a hybrid role, writing unit tests and UI tests. You should definitely hire one.

But more important than the tester are regular freezes/branches. Don't present anything that hasn't been properly tested. When you've added a feature or changed something, everything surrounding it has to be verified again. Your company will only get a bad reputation if you don't. Don't release something unstable. When the customer wants the software by a certain date, stop developing early enough and test it properly, so you have enough time for bug fixes. It's often better to decline last-minute feature requests than to implement them poorly or release without proper testing.

Strongly and vehemently disagree... Developers can be highly effective testers, but the developer of a feature should NOT also be the tester of that same feature. Many small teams play both roles: three people work on three different features, then each hands off testing to one of the other developers. It works extremely well when a team does not have a QA tester.
– maple_shaft♦ Aug 20 '11 at 12:11


@maple_shaft: IMHO there's no excuse for not having a tester. Any project will deliver higher quality with a dedicated tester, and developers can focus on, well, developing if there is one. Having developers test each other's code is a makeshift solution, even for small teams, IMHO. You should read Joel's article on it, too.
– Falcon Aug 20 '11 at 12:30


Developers can be testers - and a good developer actually knows many places where code can be weak and subject to breakage. Just never have people test the code they designed or wrote - that's useless. Other people's code may be OK.
– StasM Aug 21 '11 at 0:05


-1 because I can't agree with the "the time of the developer is too valuable..." statement. If a developer is spending time testing their code in an effort to improve it, they are most definitely not wasting their time.
– Bryan Oakley Aug 21 '11 at 15:42

But testing your own code is not a good move - developers tend to have mental blocks about their own code, and so have difficulty designing comprehensive or appropriate tests.

There will always be developers who think they do this well, but usually they don't (and I know for sure I have many blind spots).

If you REALLY CAN'T hire a tester, then get developers to cross-test each other's work - that is, if A writes the code and the unit tests, then get B to look over those unit tests and see if there are other things that could be added. And get B to try to test the code (as a user) and report defects.
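A tiny sketch of what that cross-test hand-off might look like in practice (the function and test values here are hypothetical, just to illustrate the idea): A ships the code with a happy-path test, and B adds the cases A never thought to try.

```python
def split_name(full_name):
    """Split a full name into (first, last) - hypothetical example function."""
    parts = full_name.split()
    if not parts:
        raise ValueError("empty name")
    if len(parts) == 1:
        return parts[0], ""
    return parts[0], " ".join(parts[1:])

# Developer A's original happy-path test:
assert split_name("John Smith") == ("John", "Smith")

# Cases developer B, testing as a user, might add when cross-testing:
assert split_name("Madonna") == ("Madonna", "")  # single-word name
assert split_name("  Anna  Maria  Jones ") == ("Anna", "Maria Jones")  # stray spaces
try:
    split_name("   ")
except ValueError:
    pass  # B exercises the error path A never tried
```

B isn't duplicating A's work; B is probing the inputs that A's mental model of "how the function is used" quietly excluded.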

This is not perfect but it is better than a single developer trying to do everything.

Sometimes your colleagues can be really good at breaking your software, because they get enjoyment from it and don't care so much - because it is not THEIR code.

Oh yes, sure. Completely agree. It's just that when you can't get 100% of what you want, you might have to settle for less. You know that less is not so good but it is better than nothing.
– quickly_now Aug 20 '11 at 8:23


I generally agree with cross-testing, but on some teams it will introduce conflicts. Some people enjoy blaming others ("my stuff works, yours doesn't, lol, I'm so much better than you"), and that is unacceptable. I've witnessed it numerous times. Cross-testing should only be done between colleagues who respect each other. On my team I've introduced the nameless developer, who is blamed for every bug, so that no one loses face. Bugs are nameless; they happen.
– Falcon Aug 20 '11 at 10:47


+1 - it is impossible to properly test your own code. It is amazing which tricks the mind can play on you: you'll be 100% sure you coded and tested some function, and it will take somebody else to show you that it actually doesn't work except in a very narrow case - and it'll be obvious to you once shown, but you would never see it yourself. The mind uses cognitive shortcuts, and in testing those make it impossible for the person who designed and developed the code to test it properly.
– StasM Aug 21 '11 at 0:02


@StasM - agreed, with one small qualification: I have found that coming back to my own code months later, I can see the faults and can do a better job of testing it objectively. But testing your own code right after writing it is very hard indeed.
– quickly_now Aug 21 '11 at 1:34


@Ben Aston: A developer should still be doing unit tests, integration tests, etc. - just not exclusively. The blind spots won't go away just because you want them to.
– quickly_now Aug 21 '11 at 23:15

Should a journalist try to write correctly? I mean, it is the proofreaders' and editors' job to find all the grammatical errors.

Nevertheless, journalists do some spellchecking themselves.
Nevertheless, proofreading is a separate and important profession.

The same goes for developers and testers, except that QA is an even more important part of development. Even if you are a good developer, you simply don't have time to test all the test cases thoroughly, or to cover all the environments, browsers, and OSes your product supports.

If someone, besides developing, is constantly doing that job as well, it simply means one thing: he is a part-time tester.

Well, we had two developers cross-test after the first one made some changes to an entry screen. This was while our regular tester was off on maternity leave.

He basically changed an Invoice Listing screen that the users used to select invoices before zooming in to do some editing via an "Edit" button. The original list was thrown out and a new gridview inserted, with filtering, grouping, sorting and all sorts of cool functionality.

Testing went great and they uploaded the changes to the customer the next day. Two weeks later, the customer calls up and says "We really like the new thingy you put in, we can see all sorts of information now. But... er..... where do we go to edit the invoices now???"

Turns out the developer took out the check box (for selection) and the edit button, and since the developers always double-clicked to select an item anyway, none of them found anything wrong with it...

Developers and users live in different worlds. Cross-testing is better than having the developer test his own work, but it's still not quite the same thing.

I'd agree with their point that the developers/designers should test their code, with the caveat that the designer/developer who wrote a section of code not be the only set of "eyes" on that code before it's committed to live. While that's not going to catch everything, it'll at least help avoid the blindness that creeps in from testing and retesting your own code while developing.

From the mention of use cases, I'll assume you're also using code coverage tools? If not, they could help you see what code might not be tested and could be producing unexpected bugs under certain conditions.
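As a sketch of the kind of gap a coverage tool (e.g. coverage.py) surfaces - the function and numbers here are hypothetical:

```python
def shipping_cost(weight_kg, express=False):
    """Hypothetical example: the express branch is easy to leave untested."""
    base = 5.0 + 1.25 * weight_kg
    if express:                # a coverage report would flag this line as unhit
        return base * 2 + 10.0  # ...if tests only ever call with express=False
    return base

# The only test the team wrote: it passes, yet line coverage shows
# the express branch was never executed.
assert shipping_cost(2.0) == 7.5

# The coverage report draws attention to the gap, prompting a test like:
assert shipping_cost(2.0, express=True) == 25.0
```

The point isn't the numbers; it's that "all tests pass" says nothing about the lines no test ever reached, and a coverage tool makes those lines visible.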

That being said, if there's enough work or your organization is of decent size, I'd agree that a professional QA person is needed. It would help focus everyone's roles a bit more, and they could also see whether there are any patterns in what is being missed and, more importantly, how to fix it.

You should design with testability in mind, but if you don't have a dedicated tester then some things will simply slip through the cracks, because there are not enough hours in the day to design, implement, and test software.

no matter how much we (3 developers and 1 designer) try to test the application and implemented use cases, still some bugs are not seen and ruin our presentation... to stakeholders.

Consider performing a "controlled run" for a sprint or two, tracking development and testing efforts separately. At the end of such a run, analyze the collected data to find out how much effort you spend on testing.

If you find that testing takes a lot of effort, pass that data to management - it will be compelling evidence supporting your request (much more compelling than what you have now).

Otherwise (if you find your testing takes little time), consider investing additional effort into doing it better (or learning how to do it better). Negotiate the additional effort you plan with your management - because they may prefer to hire a tester instead. :)

...we recommended that the company hire a new tester. Someone whose job would be testing, and testing only. An official, professional tester.

However, the Scrum Master and the stakeholders believe that a developer (or a designer) should also be a tester.

I have to admit, the management of your company looks pretty lame to me. I mean - OK, it may be really difficult to figure out how many testers would be best for the project, fine.

But having at least one tester is just a safe bet - it's really funny that they hesitate to give it a try while calling themselves scrum/agile.

As others here have said, the developers can cross-test each other's code (unit or functional testing), and perhaps your Scrum Master and Product Owner can help with integration testing. But for user acceptance testing, you should make sure you're getting lots of feedback from the customer's own testing - which means frequent deployments they can work with the way real users do, and a really open communication channel.