
Tuesday, 5 February 2013

I didn't call this "the" trap, as I think there are several. This is one aspect.

In June last year I had a little twitter rant/exploration (or transpective investigation, [1]) about regression testing. It was triggered when I saw a LinkedIn skill labelled "Regression Testing". I wanted to know which skills people considered specific to regression testing.

Now, one can take some of the LinkedIn labels with a pinch of salt, but I wanted to test the water about this one.

Note, I really enjoy these "bursts of activity" I have on twitter. There are many great minds who immediately question, probe, and ask for explanation or example. I love it and am very appreciative of anyone who asks me to explain myself. It makes me stronger and clarifies my thoughts!

The trap is to think of the skills associated with good regression testing as being separate from good testing.

In my context and experience, I consider good testing to take into account the ways new features interact with legacy (existing) features or systems. This implies that "regression testing" is included. But if I think of the specific skills needed for regression testing, they are just as applicable to "non-regression testing":

Investigative skills about the domain (system)

Discussing development and the existing system with developers/BAs/stakeholders

Test Idea / Environment Analysis & Maintenance

Reporting and discussing the status of the system

etc, etc
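To make the point concrete, here is a minimal sketch (the `discount` function and its pricing rule are hypothetical, invented for illustration): the very same test serves as a "new feature" check when the rule is first built and as a "regression" check when it is re-run against a changed system. Nothing about the test, or the skill of designing it, changes between the two uses.

```python
import unittest

def discount(total):
    """Hypothetical legacy pricing rule: 10% off orders of 100 or more."""
    return total * 0.9 if total >= 100 else total

class PricingTests(unittest.TestCase):
    # Written when the feature was new; re-running it later, after the
    # system has changed, is what we label "regression testing" -- yet
    # the test itself is identical in both roles.
    def test_discount_applies_at_threshold(self):
        self.assertEqual(discount(100), 90.0)

    def test_no_discount_below_threshold(self):
        self.assertEqual(discount(99), 99)

if __name__ == "__main__":
    unittest.main()
```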

So, is a "regression testing" skill separate from, or unique to, regression testing? I assert not: the same skills are just as common in "non-regression testing", or in any good testing.