Being skeptical of ourselves and of what the majority believes keeps us on our toes and forces our mind to work harder. Doubting our own - and other people's - feelings of certainty is a healthy practice that helps us solve problems and avoid bigger ones in the long run, and it can make us better testers.

Zeger Van Hese, an independent test consultant, gave a keynote titled "The Power of Doubt - Becoming a Software Skeptic" at the European Testing Conference 2018. InfoQ is covering this conference with interviews and articles.

In the booklet The Power of Doubt, Van Hese described how he embraced skepticism and became a proud and reasonable doubter, and explained how this can influence testing.

InfoQ interviewed Van Hese about what being skeptical and doubtful looks like in daily life, examples of heuristics for skepticism and how to apply them in testing, and what being doubtful has brought him as a tester.

InfoQ: What made you decide to become a software skeptic?

Zeger van Hese: There was no specific moment I can pinpoint, or incident that triggered it - rather a feeling that had been lingering for a number of years already and that was only getting stronger. The more experienced I got, the more I realized how much I didn’t know, how I was only scratching the surface. People around me seemed so convinced and sure of themselves, while I was full of doubt about my knowledge, my capacities, and the decisions I had to take. I had learned over the years that software teams are complex entities in which things are usually not black or white but mostly gray. Where I was able to state things with conviction and certainty in the past, experience seemed to make me doubt more than it helped me move ahead. In an area where ‘providing assurance’ is often part of the job description, this is an awkward position to be in.

Instead of burying my doubts, though, I decided to confront them. I wanted to get to the bottom of this and, for a year, decided to submerge myself in all things skeptic in hope of finding clues to help me with my testing, and with my struggles with doubt. It was a fascinating journey that I'm describing in my European Testing Conference keynote and my accompanying paper.

InfoQ: What does being skeptical and doubtful look like in daily life?

Van Hese: It starts with being skeptical of ourselves, with knowing our own biases. We can't trust our eyes, ears or even our memories. Realise that we get fooled easily, on a daily basis. Knowing that we are easy to fool keeps us on our toes, and forces our mind to work harder.

I also try to be skeptical of what the majority believes. When you share your opinion with others, it becomes difficult to change your mind. It’s hard to argue against what everyone else believes, conventional wisdom and "accepted truths”. When one person posits a belief, there can be disagreement or debate. But when more than one person agrees that something is the truth, this often shuts down our own inquiry.

My adventures in skepticism taught me that we should also be skeptical of certainty. The feeling of certainty is a tricky thing. Scientific studies have shown that, despite how certainty feels, it is neither a conscious choice nor even a thought process. Certainty arises out of brain mechanisms that, like love or anger, function independently of reason. This has important implications: we ultimately cannot trust ourselves when we believe we know something to be true. Maybe sometimes we should, instead, recognise we don’t know the answer.

InfoQ: Can you give examples of heuristics for skepticism?

Van Hese: Sure. There are many heuristics to help us be more skeptical. Carl Sagan offered his own set of practical heuristics called the “Baloney Detection Kit” in his book “The Demon-Haunted World: Science as a Candle in the Dark”. Many others have also shared their rules of thumb on how to become an effective skeptic. Here is a small selection of the ones that I found useful:

Occam's razor

This is a very convenient rule of thumb. When faced with two hypotheses that explain something equally well, choose the simpler one, the one that requires the fewest assumptions.

The backfire effect

This is what happens when you get overly attached to a hypothesis just because it’s yours. Try not to do that - a hypothesis is only a way station in the pursuit of knowledge. Ask yourself why you like the idea so much, and try to compare it fairly with the alternatives.

Don't be impressed with arguments from authority

Arguments from authority usually look like this: "I’m right because I’m an expert”. Arguments from authority carry little weight, since “authorities” have made mistakes in the past - they will do so again in the future.

“What if I’m Wrong?”

Try to ask yourself this question whenever you make important assumptions. Examining both potential sides leaves you an exit route if the information turns out to be false.

Know the Unknowns

Figure out the unknowns in any project or situation. You can’t account for every missing variable, but being aware of them will help you react if new information comes in.

Falsify

Always look for ways in which a hypothesis can be falsified. Propositions that are untestable or not falsifiable are not worth much.

Beware of the bias blind spot

The bias blind spot is the cognitive bias of recognising the impact of biases on the judgement of others, while failing to see the impact of biases on one's own judgment.

Pretend it’s April Fools’ Day, every day

For me this is the one skeptic trick to rule them all. April Fools’ Day is that one day of the year where everyone switches to "super cautious mode" about everything they see, read or hear, only to abandon that mode the day after. I do my best to remind myself it's April Fools’ Day every day.

InfoQ: How do you apply these heuristics in testing?

Van Hese: The classic testing book “Lessons Learned in Software Testing” already mentioned this in 2002: "You’re harder to fool if you know you’re a fool”. The behaviour of software and our own senses can fool us; that’s where it pays to apply these skeptical heuristics in testing.

Whenever you see graphs, charts or reports, ask yourself "What do they mean? What do they show? And especially: what don’t they show?" A graph’s purpose is usually to help you interpret data, but sometimes it just misleads us. This can affect the way we test. Graphs or reports can distort data, while test strategies and decisions are often based on that data. Looking with a critical eye helps to avoid problems further down the road.

There is a simple way to incorporate reasonable doubt in our daily testing practice: by using safety language. When you use absolutes like “It works” or “I’m sure this is the behavior”, you better be right or you risk losing credibility. I encourage everyone to start using expressions like “might”, “could be”, “so far” or “it appears”, to preserve some uncertainty.

Skeptics advise us to reject certainty and suspend belief until we have at least been able to do some fact-checking. This is a powerful way of working and something that testers can do on an almost daily basis. Never assume that the information you’ve received is the whole truth or even correct. I’m not saying that people are lying to you, but they probably don’t know the whole truth. They are telling you the truth as they see it. I could have used this advice myself when our developers assured us “you don’t need to retest that whole part of the application, the fix was elsewhere and this area wasn’t impacted.” We relied on their advice, only to find out way after the fact that important areas weren’t even working anymore. Do we really know whether something will be impacted or not? Developers - and by extension their code - move in mysterious ways sometimes.

The "Pretend it's April Fools’ Day" heuristic works wonders in testing as well. When you're analysing requirements, ask yourself "What's the catch? What am I missing? Surely, this can't be all there is?" When you're testing something, and everything looks perfectly fine, ask yourself “What’s the catch? What am I not seeing?” When you see software behaving weirdly and you're about to file a bug, say to yourself "Wait a minute. Slow down. Is this really a bug? Is my system configured correctly? Could my test data be corrupt? Is the behavior I'm seeing really the bug? Or is something else causing all this?" If you read or hear something that doesn't sound right, check up on it. Find multiple sources that cite where their information comes from.

InfoQ: What has being doubtful brought you as a tester?

Van Hese: I am aware that doubt has a bit of a doubtful reputation. After all, it doesn’t seem to get you very far. It doesn’t win you prizes, it doesn't get you promoted. As a consequence, certainty and overconfidence rule the world. But I have seen doubt correlate with wisdom, knowledge and skill as well, which makes me think that doubt surely deserves better than this. That's why I am advocating for a revaluation of doubt. We could start viewing doubt differently: not as a sign of weakness, but rather as a sign of competence and experience.

I mentioned earlier that feelings of certainty are much like anger or love: not necessarily based on correct judgment. The biggest problem is that the moment we think we know something, our brain goes into autopilot mode. From the moment we think we know the solution or the answer, we stop thinking critically about it. Doubting our own - and other people's - feelings is a healthy practice that helps us solve problems and avoid bigger ones in the long run.

Whenever you arrive on a new team or get assigned to a new project, there is a lot of information you need to digest quickly. You have to get up to speed fast, but no one expects you to know everything from the start. “Fake it till you make it” has been a popular motto among consultants. I have known it to work, and I admit I have even done it myself in the past, but I now prefer to no longer pretend. My default stance is to say that I don’t know - yet - and to abstain from conclusions until I understand things better. I encourage everyone to do the same.

I believe doubting, in a reasonable way, can make us better testers. The key to skepticism is to continuously and vigorously apply the methods of science. The biggest challenge with this is to find a balance between two seemingly contradictory attitudes: an openness to new ideas and at the same time a skeptic scrutiny of all ideas, old and new.

My final words of advice would be: doubt, but doubt reasonably. Be skeptical of everything, including yourself. Think. But don't lose your sense of wonder.

InfoQ is covering the European Testing Conference 2018 with Q&As, summaries and articles.

About the Interviewee

Zeger Van Hese considers himself a lifelong student of the software testing craft. He was program chair of Eurostar 2012 and co-founder of the Dutch Exploratory Workshop on Testing (DEWT) and its Belgian counterpart, BREWT. He muses about testing on his TestSideStory blog, tweets as @testsidestory and is a regular speaker at national and international conferences. In 2013, he founded his own company, Z-sharp.

Re: wondering


Hi Jeff, thank you for commenting. You are right. I didn't mean to imply an opposition. Thinking and wondering are not mutually exclusive. What I meant was - keep an open mind while at the same time applying skeptic scrutiny to all ideas, also the new ones. This is a balance that is not always easy to obtain.

Re: wondering


Yes, it's not easy, but you don't always have to think :)

When thinking starts to get hard, it's maybe time to look for another way, like simplifying things, or arranging them so that you don't have to rely on your intuition or on what others say etc.

For example, when testing non-trivial logic, covering all possible cases by hand is often not feasible, and we are easily fooled into feeling that we covered them all after testing just a few. Instead, a much easier way is to delegate test-case creation to randomness and run the test thousands or even millions of times in a loop.
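The randomized loop the commenter describes can be sketched in a few lines of Python. The function under test (`normalize_whitespace`) is hypothetical, invented here purely for illustration; the point is the pattern of checking a property against many randomly generated inputs instead of a handful of hand-picked cases:

```python
import random

def normalize_whitespace(s):
    """Hypothetical function under test: collapse runs of whitespace."""
    return " ".join(s.split())

# Property checks: whatever the input, the result should contain no
# double spaces and no leading/trailing whitespace. Rather than
# hand-picking a handful of cases, delegate case creation to
# randomness and run the check many times in a loop.
random.seed(42)  # fixed seed so any failure is reproducible
ALPHABET = "ab \t\n"
for _ in range(10_000):
    s = "".join(random.choice(ALPHABET) for _ in range(random.randint(0, 20)))
    result = normalize_whitespace(s)
    assert "  " not in result, f"double space for input {s!r}"
    assert result == result.strip(), f"untrimmed result for input {s!r}"

print("10,000 random cases passed")
```

Libraries such as Hypothesis automate this idea (and add shrinking of failing cases), but even a plain loop like the one above catches edge cases a human tester would rarely think to try.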