5 Answers

It's part of a tester's job to ask questions about the product that nobody else has even thought of yet. Waiting until after significant programming time has been spent to even start asking those questions - well, does that sound sensible to you?

Some of my best defects (except we called them review comments) have been raised before a single line of code was written. When working in a waterfall environment, if a project came to us for testing where static testing of the requirements hadn't been conducted (or had been conducted poorly), we considered it high risk (as such projects normally overran horribly in cost and time - if they weren't cancelled first). When working in an Agile environment, I am expected to be able to ask questions that reveal scenarios that nobody else has thought of yet.

Yes, it does mean testers need analysis skills. But then, they aren't going to be able to design good tests if they don't have those skills. If they're merely designing confirmatory tests that just check the requirements are satisfied, rather than test them - then they're not doing testing, just checking.

It is possible for a good tester to cope with stuff that just gets thrown over the wall - there are strategies to deal with that. But if you have a choice, then why wouldn't you want feedback earlier, rather than later?

"Some of my best defects (except we called them review comments) have been raised before a single line of code was written." Soooo true. I can't tell you how many times I've asked a question to find myself, the developer & the BA all had different interpritations of the spec. Much better to catch that on paper than after the code is written!
– CKlein May 11 '11 at 13:09

I performed some BA work in addition to QA with my previous employer, and there is no doubt in my mind that I tested much more thoroughly as a result.

One thing I did not expect was this: the developers were plumb crazy about the requirements documents that I wrote because I integrated the technical details with the customer's "wants" in a way they could relate to.

Although it was at least six years ago, I remember them stopping by my desk and thanking me for those requirements. How often does that happen?

In my shop Testers (and Developers as well) formally review Requirements. We read the documents ahead of time, then get together with the entire project team and go through them.

We find that Testers, with their domain knowledge and differing viewpoint, can provide insight in several areas:

- checking the Requirements for testability
- checking for clarity, consistency, and completeness
- trying to ensure a meeting of the minds between the Business and Development before the design begins

We have found that this up-front attention helps prevent bugs from occurring in the first place. And it's always better to prevent bugs, than to find them.
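Some of the checks above can even be partially automated. Here is a minimal sketch of a static lint over requirement statements that flags wording a reviewer would likely question; the list of vague terms is purely illustrative, not any standard.

```python
# Toy static check over a requirement statement: flags vague wording that
# tends to make a requirement hard to test. The term list is illustrative.
import re

VAGUE_TERMS = ["fast", "user-friendly", "as appropriate", "etc",
               "tbd", "easy", "flexible", "if possible"]

def review_requirement(text: str) -> list[str]:
    """Return review comments for one requirement statement."""
    comments = []
    lowered = text.lower()
    for term in VAGUE_TERMS:
        if re.search(r"\b" + re.escape(term) + r"\b", lowered):
            comments.append(
                f"vague term '{term}': how would a tester verify this?")
    # Many shops require an imperative keyword in each requirement.
    if not re.search(r"\b(shall|must|will)\b", lowered):
        comments.append(
            "no 'shall'/'must'/'will': is this actually a requirement?")
    return comments

for comment in review_requirement(
        "The search page should be fast and user-friendly."):
    print(comment)
```

A script like this is no substitute for the team review itself, of course; it just surfaces the easy questions so the meeting can spend its time on the hard ones.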

So if you consider this "testing", then my answer to your question "Should the reverse also be true with testers being (more) involved and performing static testing of requirements?" is "Yes".

You might like to read the book Effective Methods for Software Testing. It has some checklists for checking things like requirements. While the book is big enough to be a door stop, it also has lots of checklists for every stage of testing and QA.

I've always found immense benefit in requirements being tested before being handed to the developer. In many cases, I prefer the developer to be testing them with me. Not only do we get to learn about the application before we develop it, but we usually end up cutting down the time lost by sending back requirements after they've already been partially implemented.

Also, don't limit yourself to static analysis of the requirements. Acting them out, with team members taking on the role of the user, the data, and the application(s), can often help clarify or find problems with a requirement.