Being skeptical of ourselves and of what the majority believes keeps us on our toes and forces our mind to work harder. Doubting our own – and other people’s – feelings of certainty is a healthy practice that helps us solve problems and avoid bigger problems in the longer run, and it can make us better testers.
Zeger Van Hese, an independent test consultant, gave a keynote titled "The Power of Doubt – Becoming a Software Skeptic" at the European Testing Conference 2018. InfoQ is covering this conference with interviews and articles.
In the booklet The Power of Doubt, Van Hese described how he embraced skepticism and became a proud and reasonable doubter, and explained how this can influence testing.
InfoQ interviewed Van Hese about how being skeptical and doubtful looks in daily life, examples of heuristics for skepticism and how to apply them in testing, and what being doubtful has brought him as a tester.
InfoQ: What made you decide to become a software skeptic?
Zeger Van Hese: There was no specific moment I can pinpoint, or incident that triggered it – rather a feeling that had been lingering for a number of years already and that was only getting stronger. The more experienced I got, the more I realized how much I didn’t know, how I was only scratching the surface. People around me seemed so convinced and sure of themselves, while I was full of doubt about my knowledge, my capabilities and the decisions I had to take. I had learned over the years that software teams are complex entities in which things are usually not black or white but mostly gray. Where I was able to state things with conviction and certainty in the past, experience seemed to make me doubt more than it helped me move ahead. In an area where ‘providing assurance’ is often part of the job description, this is an awkward position to be in.
Instead of burying my doubts, though, I decided to confront them. I wanted to get to the bottom of this and, for a year, decided to immerse myself in all things skeptic in the hope of finding clues to help me with my testing, and with my struggles with doubt. It was a fascinating journey that I’m describing in my European Testing Conference keynote and my accompanying paper.
InfoQ: How does being skeptical and doubtful look in daily life?
Van Hese: It starts with being skeptical of ourselves, with knowing our own biases. We can’t trust our eyes, ears or even our memories. We have to realise that we get fooled easily, on a daily basis. Knowing that we are easy to fool keeps us on our toes, and forces our mind to work harder.
I also try to be skeptical of what the majority believes. When you share your opinion with others, it becomes difficult to change your mind. It’s hard to argue against what everyone else believes, conventional wisdom and “accepted truths”. When one person posits a belief, there can be disagreement or debate. But when more than one person agrees that something is the truth, this often shuts down our own inquiry.
My adventures in skepticism taught me that we should also be skeptical of certainty. The feeling of certainty is a tricky thing. Scientific studies have shown that, despite how certainty feels, it is neither a conscious choice nor even a thought process. Certainty arises out of brain mechanisms that, like love or anger, function independently of reason. This has important implications: we ultimately cannot trust ourselves when we believe we know something to be true. Maybe sometimes we should, instead, recognise we don’t know the answer.
InfoQ: Can you give examples of heuristics for skepticism?
Van Hese: Sure. There are many heuristics to help us be more skeptical. Carl Sagan offered his own set of practical heuristics called the “Baloney Detection Kit” in his book “The Demon-Haunted World: Science as a Candle in the Dark”. Many others have also shared their rules of thumb on how to become an effective skeptic. Here is a small selection of the ones that I found useful:
Occam’s razor
This is a very convenient rule-of-thumb. When faced with two hypotheses that explain something equally well, choose the simpler one, the one that requires the fewest assumptions.
The backfire effect
This is what happens when you get overly attached to a hypothesis just because it’s yours. Try not to do that – a hypothesis is only a way station in the pursuit of knowledge. Ask yourself why you like the idea so much, and try to compare it fairly with the alternatives.
Don’t be impressed with arguments from authority
Arguments from authority usually look like this: “I’m right because I’m an expert”. Arguments from authority carry little weight, since “authorities” have made mistakes in the past – they will do so again in the future.
“What if I’m Wrong?”
Try to ask yourself this question whenever you make important assumptions. Examining both potential sides gives you an exit route if the information turns out to be false.
Know the Unknowns
Figure out the unknowns in any project or situation. You can’t account for every missing variable, but being aware of them will help you react if new information comes in.
Falsify
Always look for ways in which a hypothesis can be falsified. Propositions that are untestable or not falsifiable are not worth much.
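A minimal sketch of what this might look like in test code – the claim, the function name and the values below are illustrative assumptions, not something from the keynote:

```python
# Hypothetical sketch: the claim under test is "a discount never produces a
# negative total". apply_discount() is a stand-in implementation, invented
# for this example so that it runs on its own.
def apply_discount(total, percent):
    return max(total - total * (percent / 100), 0.0)

def test_discount_total_never_goes_negative():
    # Probe the boundaries where the claim is most likely to break
    # (0%, 100%, even an out-of-range 150%) instead of only the happy path.
    # If the claim were false, one of these probes should surface a counterexample.
    for total in (0, 0.01, 1, 100, 10_000):
        for percent in (0, 50, 100, 150):
            result = apply_discount(total, percent)
            assert result >= 0, f"falsified with total={total}, percent={percent}: {result}"
```

The point is the direction of the test: it tries to refute the proposition rather than confirm it, which is what makes the proposition worth something in the first place.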
Beware of the bias blind spot
The bias blind spot is the cognitive bias of recognising the impact of biases on the judgement of others, while failing to see the impact of biases on one’s own judgment.
Pretend it’s April fools’ day, every day
For me this is the one skeptic trick to rule them all. April fools’ day is that one day of the year where everyone switches to “super cautious mode” about everything they see, read or hear, only to abandon that mode the day after. I do my best to remind myself it’s April fool’s day every day.
InfoQ: How do you apply these heuristics in testing?
Van Hese: The classic testing book “Lessons Learned in Software Testing” already mentioned this in 2002: “You’re harder to fool if you know you’re a fool”. The behaviour of software and our senses can fool us; that’s where it pays to apply these skeptical heuristics in testing.
Whenever you see graphs, charts or reports, ask yourself “What do they mean? What do they show? And especially: what don’t they show?” A graph’s purpose is usually to help us interpret data, but sometimes it just misleads us. This can affect the way we test. Graphs or reports can distort data, while test strategies and decisions are often based on that data. Looking with a critical eye helps to avoid problems further down the road.
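To make that concrete with a hypothetical example (the module names and numbers below are invented, not from the interview): an aggregated pass rate can look healthy while a single area is failing badly, and asking what the report doesn’t show means breaking the number down again.

```python
# Hypothetical numbers: a single headline metric can look reassuring while
# hiding the one area a test strategy should worry about.
results_per_module = {
    "checkout": {"passed": 40, "failed": 60},    # clearly in trouble
    "catalog": {"passed": 480, "failed": 5},
    "accounts": {"passed": 430, "failed": 10},
}

total_passed = sum(m["passed"] for m in results_per_module.values())
total_run = sum(m["passed"] + m["failed"] for m in results_per_module.values())
print(f"overall pass rate: {total_passed / total_run:.0%}")  # ~93%, looks fine

# Asking "what doesn't the report show?" means breaking the number down again.
for name, m in results_per_module.items():
    rate = m["passed"] / (m["passed"] + m["failed"])
    print(f"{name:<9} pass rate: {rate:.0%}")  # checkout turns out to be at 40%
```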
There is a simple way to incorporate reasonable doubt into our daily testing practice: by using safety language. When you use absolutes like “It works” or “I’m sure this is the behavior”, you had better be right or you risk losing credibility. I encourage everyone to start using expressions like “might”, “could be”, “so far” or “it appears”, to preserve some uncertainty.
Skeptics advise us to reject certainty and suspend belief until we have at least been able to do some fact-checking. This is a powerful way of working and something that testers can do on an almost daily basis. Never assume that the information you’ve received is the whole truth or even correct. I’m not saying that people are lying to you, but they probably don’t know the whole truth. They are telling you the truth as they see it.