
"Some birds aren't meant to be caged; their feathers are just too bright." - Morgan Freeman, The Shawshank Redemption. This blog is from one such bird who couldn't be caged by organizations that mandate scripted software testing. Pradeep Soundararajan welcomes you to this blog and wishes you a good time here and even otherwise.

"I know how to do certain types of testing, but if someone asks me to explain what I did, I struggle to explain it" was a problem statement I heard from a tester today. I replied in a confident tone, "I know why that happens. It isn't just your problem; I know a lot of testers who appear to have the same one."

I continued, "You should consciously practice explaining what you did, first to yourself and then to your colleagues although they probably know how and why you did it. Why? I am going to explain how your brain works (or how I think it does). It has two nodes. One that contains what you know and the other that controls your explanation of what you know. When you make a conscious effort, you are forcing a connection between these nodes in your brain. When you force your brain to connect those two nodes too often, at some point it will judge the need for a permanent connection and create it for you. After that you have a free flow of what you know and your explanation of what you know."

After I said the above, I could see a smile on their face that reflected, "Yes, I now know how to solve this problem." I read a person's understanding not by their head nodding or when I hear "I get it", but by the emotions and expressions on their face and body.

James Bach identified that I was a metacog. I didn't know I was one. Since then, it has been very helpful for me to understand why I do things the way I do. I guess I turned myself into a metacog because I thought it suited the kind of testing I wanted to do. It's not a special status, it is just a way of life.

I don't even know if the brain works the way I explained it, but I guess you can see why the explanation makes sense. I make sure I tell people that I am not trying to misinform them about anything when I use such examples. I am just helping the tester imagine why there is a problem and how to solve it. A lot of my coaching is consulting.

Examples like the one you read above govern a major part of my coaching. I observe a lot. I have consciously practiced making connections between what I have seen, heard, thought, experienced and know, to being able to explain it when the context demands.

Analogies and examples are powerful approaches to teaching. I need to know very well what connects with my audience. Although all of my audience are testers, I can't use the same analogies and examples everywhere; it just doesn't work. Bangalore testers need a different example than those in Pune. In Pune, I would talk about Raj Thackeray; in Bangalore, about Vattal Nagraj; in Chennai, about Goundamani and not about Vattal Nagraj. Now, for those of my readers from the United States or Europe, you wouldn't know Goundamani or Vattal Nagraj, and hence I would use examples of Chuck Norris, Sarah Palin, or Julian Assange. If you are an F1 fan, I'd talk about testing through specific GP incidents. That's how the examples need to adapt to the audience. I also use a lot of examples from what I think the world connects to. For instance, I closely read about and watch Air Crash Investigation on the Nat Geo channel. OMG! There is so much to learn from the way the NTSB (National Transportation Safety Board) deals with investigations. So, basically, as a coach, I have to keep connecting to various thoughts and happenings to be able to connect with my audience.

Similes, metaphors, or analogies if you choose to call them that, are an interesting but tough piece of cake if you are the one providing it. People tend to interpret them in their own way rather than as you intend. It is good in a way: I get to learn when not to use which type of analogy with which kind of people :)

I use fishing to explain test coverage. I know a few other people have already used fishing as an example for teaching testing, so I didn't invent it, but I know how to explain different concepts of testing with the fishing example. I think that most learning happens when you are interacting with the audience, not when you are explaining. That's where I do better with the fishing analogy.

I start drawing different types of fish, from guppies to clownfish and from starfish to sharks. I then draw a net of a specific size and shape for catching fish. I give them a goal of catching as many different fish as possible in a limited amount of time. Someone in the audience comes back and tells me, "Ah, with this net the guppies will escape"... What's your immediate thought? Would you tell them to build a net with smaller squares to catch guppies? (Why say it? They know it.) I would probably be Pradeep and say, "Fantastic. A lot of time spent celebrating the success of finding one good bug steals away the opportunity to find more good bugs. How do you want to celebrate? 3 minutes left."

So, after they come out with strategies and different nets, I tell people that just because they have nets (tools) to catch a specific type of fish, it doesn't mean they can really catch plenty of them. I then tell stories and examples from my life. One of them: I consulted for an organization that had bought an expensive automation tool hoping they would now be able to find more bugs. They spent all their energy setting up tests on it and found fewer bugs over the year. I drive home points like: tools don't help you unless you know how to help the tools do what you think they can.

For a while, people think it is possible to catch a shark with a net. It is possible, maybe. I explain how a powerful shark can bring their boat down if they try to catch it with a net or a fishing rod, and how a harpoon is better than a net. Sometimes people try to apply similar approaches to many different problems. The example of sharks, nets and harpoons is a good way for people to relate to something they have done in the past that shouldn't have been done the way they did it. I then equate different types of fish to different quality criteria. I ask my audience what type of fish they are mostly catching, and I hear a shout: "Functionality".

After a lot of back and forth between me and my audience, we all discover what good test coverage could mean. I then start probing into their projects, figure out how many fish they are missing that they shouldn't be, and try to help them understand why they shouldn't be catching too much of the same fish. Sometimes it upsets the food chain :)

While there is fishing in my class, there is also plenty of room for Sine and Cosine for test techniques, Brian Marick's minefield analogy for regression test strategy, Tom and Jerry examples for how scripted testing is dangerous, what lessons we can learn from Sourav Ganguly's comeback, how to read Sachin's and Kambli's career graphs, farming... It must be fun to sit in my class. I don't know; I have never been able to.



Copyrights

Tester Tested! by Pradeep Soundararajan is licensed under Creative Commons. You must give credit to Pradeep Soundararajan when you copy-paste anything from here, by mentioning the name and linking properly to the post. You are not allowed to edit any of the posts without permission. For permissions, write to pradeep.srajan@gmail.com