Month: December 2013

On October 23, DEWT and James Bach met in Dordrecht to talk about testing. The subject was “What is context-driven testing?” and, more specifically, “How do you recognize a context-driven presentation?”.

They came together to prepare for the TestNet Autumn Event, which was fully dedicated to context-driven testing and had the theme “Exploring context-driven testing – a new hype or here to stay?”.

Earlier this year, DEWT decided to write an article about the event to reflect on context-driven testing in the Netherlands. What does the TestNet community have to say about context-driven testing? What can we learn from the event? What can we do to help the community learn? How context-driven is the Dutch testing community? While discussing this, the DEWTs found it hard to come up with heuristics to “measure” (maybe “recognize” is a better word here) the presentations at the TestNet event.

What is context-driven?

Context-driven testing is often seen as “only” an approach, but it can be more. There are actually three different things that are called context-driven, and testers can be part of one without necessarily being part of the others:

Paradigm (world view)

Community

Approach

For example: you can use a context-driven approach without being part of the context-driven community. To be in the paradigm, you need to have a worldview of testing.

Testing – people evaluating a product by learning about it through experimentation –

driven by – in a manner organized and motivated by a systematic consideration of –

context – all the factors that significantly influence the problems and solutions that lie within the scope of their mission.

Attitude: context-driven testing allows you to change your approach. An example is “Huib’s Rapid Software Testing”: this is Huib’s way of doing testing, inspired by Rapid Software Testing, but adapted to his context, changing anything he sees fit. Factory-school testers often do not want to change their approach and apply it the same way in every situation.

To be truly context-driven, you have studied and tried the practices you say you do not like.

Discuss causes and effects assuming open systems. Reject the belief that a project is a well-defined, predictable game. Also reject the assumption that a process is unhackable.

Acknowledge that there are people with different opinions; do not speak from authority.

Social science

Humanism

Practitioner responsibility

Craftsmanship

Refuse to do bad work

Problem solving

Skill based work

Be aware of people who:

talk about ‘success’ and ‘failure’ without explaining why and describing the context

talk about ‘structure’ and ‘chaos’ (often meaning: I am out of control)

show no evidence (or use sentences like “research shows that…”)

rely on folklore

use numbers without context

are ignorant of social science

uncritically apply the manufacturing metaphor (IT is like a factory)

apply premature automation

assume other people read and follow what is written

show contempt for humanism

try stuff once and overgeneralize that to the whole company/world

fear variation

oversimplify (approach the world as if it were linear)

misuse statistical analysis

use averages without variance

do narrow research

say or do stuff because their clients want it

demonize tacit knowledge and idealize explicit knowledge

This is the summary I made of what we discussed. You could consider it a heuristic for recognizing context-driven presentations. I used this heuristic in my presentation “What is context-driven testing?” at the TestNet event.

Heuristics

Not context-driven                   Context-driven
Success / Failure                    Learning
Chaos / Structure                    Compare alternative methods
Lack of evidence / Narrow research   Cause and effect
Folklore                             Open systems
Numbers without context              Social science
Ignorance of social science          Humanist view
Contempt of humanism                 Craftsmanship
IT & testing is like manufacturing   Explain context
No context mentioned                 Worldview of testing
Generalizing after one try           Different opinions
Use of averages without variance     Allows to change approach
Linearity
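The heuristic is a judgment aid, not an algorithm, but for readers who like things concrete it can be sketched as a simple tally of signals noticed during a presentation. This is purely my own illustration, not something from the DEWT discussion; all names below are invented for the sketch.

```python
# Hypothetical sketch: the heuristic signals above encoded as two sets,
# with a trivial tally of which side the observed signals fall on.

CONTEXT_DRIVEN = {
    "learning", "compare alternative methods", "cause and effect",
    "open systems", "social science", "humanist view", "craftsmanship",
    "explain context", "worldview of testing", "different opinions",
    "allows to change approach",
}

NOT_CONTEXT_DRIVEN = {
    "success/failure", "chaos/structure", "lack of evidence",
    "folklore", "numbers without context", "ignorance of social science",
    "contempt of humanism", "manufacturing metaphor", "no context mentioned",
    "generalizing after one try", "averages without variance", "linearity",
}

def tally(observed_signals):
    """Count how many observed signals fall on each side of the heuristic."""
    pro = sum(1 for s in observed_signals if s in CONTEXT_DRIVEN)
    con = sum(1 for s in observed_signals if s in NOT_CONTEXT_DRIVEN)
    return pro, con

# Example: a talk that explains its context and focuses on learning,
# but also leans on folklore.
print(tally({"learning", "explain context", "folklore"}))  # → (2, 1)
```

Of course, a real assessment weighs the signals in context rather than counting them; the tally only makes the two-column structure of the heuristic explicit.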
