Friday, January 29, 2016

I've previously talked about the disaster itself. It was an odd moment in history - like the Kennedy assassination or September 11th, everyone who was alive seems to know where they were on that night, and their thoughts and feelings.

It's my own tale I want to spend some time going through, as it links to some recent posts of mine. I was 15, and my head was full of revision for an upcoming physics test on radiation, when I saw the fateful footage of the Challenger.

I seemed to spend the whole night channel hopping for news and theories. I'd known about the plan to send the first teacher into space ... we'd heard how safe and reliable the shuttle was ... were there any survivors? (For a few moments a parachute was spotted, which confused everyone and gave people false hope.)

It was memorable for me, because it was the penultimate time that I did something from a childhood tradition we've probably all had. I couldn't sleep, so I went into my parents' room to get into bed with them.

I told my parents that I just could not get them out of my head. It upset me, but also confused me a bit. The news was always full of terrible things - children being abducted and murdered, natural disasters, planes shot down. But this really got to me.

The disaster is memorable for the words my mother used to explain why I was feeling that way. I was a kid with my head filled with space and science fiction. Those astronauts were everything I wanted and sought to be, and that made me empathise closely with them. They were people very much like me, doing something I desperately wanted to do, in whose footsteps I wanted to follow. Though I didn't know them that well, it felt incredibly personal and scary because I saw so much of myself in them.

It's a comment I use mentally all the time to explain the world around me. We see terrible things in the news all the time - and often it's not the number of people affected or killed which gets to us, but the personal tales which remind us how close the victims are to us. That is why we change the colour of our Facebook photo after a terrorist attack in Paris, and yet seem blasé about attacks elsewhere in the world where many more people die. We see more of ourselves in the Parisian victims.

On the 30th anniversary, I'm reminded that the challenge with empathy is sometimes we need to spread it a bit wider. To not just put ourselves in the shoes of people like us, but wherever we can, to put ourselves in the shoes of people a little different.

Tuesday, January 19, 2016

Previously we looked at the then-to-be-released new Star Wars movie as a way to explore some psychological phenomena which we'd unconsciously be exposed to in December 2015.

Today I want to focus on one of those key factors (group delusion), and see how it affects us at work as testers.

Group Delusion

The term "group delusion" feels like an awfully derogatory one - but as I discussed last time, it's subtle and can affect us all. I personally see it as akin to peer pressure. To me there's somewhat of a grey blur between them, and they add up to the same kind of thing - an unconscious drift in our thinking.

I want you to imagine this scenario ...

You've just met up with a couple of testers, Bev and Steve, who you worked with two years ago. Although it was meant to be more of a social catch-up, it doesn't take too long before you start talking shop!

Bev mentions how they've recently started implementing a MacGuffin strategy in their testing, and she's finding it really interesting, but a bit challenging. Steve snorts a bit - his project has been using MacGuffin for over a year, and would never go back!

That evening, you get some email spam from a recruiter (you remind yourself you really should unsubscribe one of these days), when you notice the first job calls for "experience applying MacGuffin is a must!". You scroll down, and it's not the only role which mentions it.

It only gets worse the next day when one of the senior managers calls you and several other testers into an office asking why you've not started implementing a MacGuffin policy yet ...

So we need to get down and start implementing MacGuffin, right?

Hopefully your first reaction really is "what is MacGuffin, and why will it help?". Sadly not everyone will.

A MacGuffin is a great term for this kind of effect in testing. It's a word often associated with Alfred Hitchcock: an object used as a plot device, and often kept vague and mysterious.

The mysterious MacGuffin used in Pulp Fiction

Over the last few years I've seen a few MacGuffins being traded around social media in association with testing. I notice the MacGuffin effect when I start to see people wanting to build up their strategy not because a MacGuffin will help them with a specific problem, but because everyone else seems to be working with MacGuffins, and they're worried about being left behind.

It doesn't matter what they are; there is a peer pressure which creates what's called a "keeping up with the Joneses" effect. If your neighbours, the Joneses, go out and get a new BMW and a 50 inch plasma TV, you end up asking yourself why you don't have these things, and thinking that really you ought to go out and get them yourself. Even if it's something that previously you felt you never needed.

Let me give you a few examples of MacGuffins, and I'll break down the pitfalls I've seen ...

Test automation

Agile

Test management tools

Test automation

I'm actually a huge fan of automation (though I agree, it's more checking than testing). Used well it can help my job as a tester, because it means a developer can check that the software works to at least a level where I can interact a little with it. If it's so fundamentally broken that a user can't even log into the system, far better that's found with a little automation than for me to find it on Monday, then be told I can't get a new build for another week.
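That kind of gating check can be sketched in a few lines. This is purely illustrative - the `login` function below is a hypothetical stand-in for whatever your application actually exposes (a real check would drive the login screen or an API):

```python
# A minimal "smoke check" sketch: is the build even worth handing to a tester?
# login() is a hypothetical placeholder, not a real API.

def login(username, password):
    # Placeholder: a real check would drive the app's login screen or API.
    return username == "tester" and password == "s3cret"

def smoke_check():
    """Return True if a user can at least get into the system."""
    return login("tester", "s3cret")

if __name__ == "__main__":
    assert smoke_check(), "Build too broken to test - don't hand it to the testers"
    print("smoke check passed")
```

If a check this basic fails, the build goes straight back to the developers, and no tester's Monday is wasted on it.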

The problem is automation is somewhat over-sold. This really goes back to the 90s, when the sales blurb for the expensive software talked in terms of there being no need for testers. All in the order of "costs half a tester's wage, tests 100 times faster!".

The problem is you need a really good tester to come up with the ideas for scenarios to check. Then you need someone to program it all up. Then to run it. And to check the results if they show issues. Then to maintain it when you change your code. You still need people, and they're kind of doing "testing-esque" roles (especially determining scenarios, and investigation). [Though you might need fewer of them]

If one of your tests "shows a red" (i.e. a fail), I guess you'll need to manually rerun the scenario to see why you get that result ... isn't that most likely to be a tester?

So even in a perfect scenario, you're still going to need people working testing (sorry if you bought the sales pitch).

However when you're sitting with Bev and Steve and they're going "automation is amazing ... it helps us to test really fast", everyone would want a piece of that action! Especially if they scoff and go, "what .... you're STILL testing manually?".

I myself have experienced this kind of pressure - I had a manager at a previous company want to know why we weren't using automation for our current project. This project was an integrated piece of hardware similar in many respects to a supermarket self-service kiosk. They'd heard you could get some free software called Selenium, and a friend was using it on another project nearby.

Well, that was an unexpected item in my bagging area!

I was lucky to be able to go through the details with this manager, and get them to understand why it wasn't suitable, and I was glad we were able to go through it together.

My points were,

For automation to work well, we really needed the product in our hands, in as finished a state as possible, quite early on. The reality was we'd scripted up some scenarios, but we would not get so much as a demo of the product until it was delivered for testing (and believe me, I'd tried). We were on a tight schedule, and automating would jeopardise it.

It takes a long time to automate a test, but it pays off if you expect to run that test dozens of times. We had a 6-8 week window, expecting to rerun tests at most about 3-4 times. Once released, we'd not touch this product again for years (if ever).

Primarily though, Selenium works on web-based applications. Our application was not even remotely web-based. The technology just wasn't suitable. I had to go through some Selenium websites to show this to him, but it was again worth doing so he'd understand.

All the same, it's easy to see how he got MacGuffined.
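The cost-benefit point above can be put as a quick back-of-the-envelope calculation. All the figures here are illustrative assumptions, not data from any real project:

```python
# Rough break-even sketch for automating a single test scenario.
# All numbers below are illustrative assumptions.

def break_even_runs(automation_cost_hours, manual_run_hours,
                    automated_run_hours=0.1):
    """How many runs before automating a test pays for itself?"""
    saving_per_run = manual_run_hours - automated_run_hours
    if saving_per_run <= 0:
        return None  # automation never pays off for this scenario
    return automation_cost_hours / saving_per_run

# Say a scenario takes 8 hours to automate, 1 hour to run manually,
# and a few minutes to run automated:
runs = break_even_runs(automation_cost_hours=8, manual_run_hours=1)
print(runs)  # roughly 9 runs needed to break even
```

With a project expecting only 3-4 reruns before being shelved for years, the arithmetic simply never gets to the break-even point - exactly the situation described above.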

Agile

Again - I'm a huge fan of agile. Just not when it's mis-sold.

I see and hear this a little - companies who've felt compelled to "jump on the agile bandwagon" because they've heard about everyone else's huge successes. But without really "getting what agile is".

Agile though is a hard sell - when you tell customers the reality that "agile isn't about guaranteeing success, it's about surfacing failure earlier, so it causes less damage, costs less, and is more easily rectified", some look a little shocked and terrified.

I thought you promised us success!

Agile has become such a big MacGuffin that no IT company can afford to say they don't do it. But unfortunately there's a little bit of "just call it agile" out there, where the principles aren't really being followed or understood - it's just a rebranding of what was done in waterfall. As I've talked about when addressing our agile transformation, there's an awful cargo-cult trap of trying to keep doing what you've always done, but "whack an agile label" on it.

Warning signs for me tend to be,

"Well ... we do stand ups" [That is - that's the only difference which has been noticed]

"We have a 2 week development sprint ... which is followed by a 2 week testing sprint" [Mini-waterfall]

"We then run our retrospective ideas past management" [Sounds awfully like the agile team is not empowered]

Test management tools

I'm often told by their advocates that "the best thing about this MacGuffin Test Management Tool is that you get all your reporting for free".

Actually, let me repeat this and add emphasis - "the best thing about this MacGuffin Test Management Tool is that you get all your reporting FOR FREE!!!".

Just like agile and just like automation, test management tools can be very useful. For very complex projects with huge numbers of testers, they allow you to break the testing down into different areas, and track which tests touch which requirements. And that's useful.

When advocates of these tools scoff with "well, you can't just use Excel, can you", I usually squirm a little, and reply "well, often Excel is my tool of choice". Am I mad?

Here's my problem with test management tools. Whilst many of them do allow you an overview, it comes with a price, which is not free. And I don't mean the "per seat licence dollar cost".

Test management tools often constrain testers to work and test in a particular fashion - one which often is not the natural/logical/comfortable/most-efficient way some testers operate - especially exploratory testers. If a tool isn't really suitable for how you work, it feels clunky, slows you down, and creates a drag on your work velocity.

They can also give a false perspective - because the kind of high-up managers who love them rarely dig down to the detail. A couple of years ago, I had one such manager mention how nice it was to be able to see 500 scripts scoped out for the testing effort, and like a commander they could watch their status when run. When I looked into some of those tests, they had a title, but the majority of them had no steps ...

It's not an isolated problem. I know of some projects around Wellington where the test management software was mandated, and the testers found using the system for testing so difficult and clunky that they ended up only using it to track bugs.

This is how some people think the charts in test management tools work ...

I like to use Excel when practical, because I can use it in any manner which makes sense to me for keeping testing notes and breaking down the effort. If a team has fewer than 5 testers, you probably don't need a test management tool at all.

I also know that looking at graphs or output from these tools does not show me where we have problems (and I'm a trained scientist, good at reading trends in graphs, but also at noticing noise). Often those problems lie under the surface of the pretty graphs. Far better to have a daily stand-up with other testers, and get them to talk about any problems they're encountering. That avoids me having to stare at numbers or statuses on a graph or dashboard trying to "read the signs" like some fortune teller. I trust my testers to tell me about problems more than I trust a tool. If I don't trust them, I shouldn't have them working for me.

Likewise when it comes to scaling up a group, I'd rather have 15 testers split into 3-4 groups with a team lead, and all those team leads doing a stand up with me daily to pass down problems and pain points. Even if we're using a test tool, it's important to keep talking to the people using it.

[I'm going to avoid a discussion on test metrics and reporting here, as I've something upcoming on just that topic]

So that's the MacGuffin effect leading us into sometimes costly mistakes! Does any of that feel at all familiar? Any other MacGuffins you've come across?

Next time we'll look at the phenomenon of denialism, and the sting in the tale there...

Tuesday, January 12, 2016

I think that describes a lot of people's reaction to yesterday's announcement of the death of David Bowie at 69 from cancer. It's poignant because I think everyone takes a moment not just to remember him, but their own personal family heroes who were similarly lost to one cancer or another. I, like many, feel huge empathy. Not only was he a musical hero, but we've known something of the struggle and the loss to cancer personally (for me, my beloved father-in-law especially comes to mind).

On social media, many of my friends are posting tributes to Bowie, and links to their favourite songs. There are so many good songs, and no two friends seem to have chosen the same one - probably a testament to his back-catalogue.

David Bowie was often referred to as the "chameleon of pop". He reinvented himself and his music regularly. There are many bands and singers out there who find a sound and a style, and stick to it their whole life. He didn't.

He likewise was unafraid to dabble in acting - and was particularly superb as the haunted alien of The Man Who Fell To Earth.

Bowie was never afraid to experiment, to collaborate and try new things. Consequently, among so many great songs, there are a few that just don't work for me. I'm thinking of his cover of Alabama Song (Moon Of Alabama), the Brecht/Weill number also famously covered by The Doors - however even there he didn't just copy and paste, but dared to try something very different.

He sung with Freddie Mercury. And Mick Jagger. And Bing Crosby. Bing Crosby!

What remains with us is almost 50 years of music. But his memory should propel us to embrace what he did. To embrace working with others, daring to cross the floor to collaborate with others, to not let a box confine and define us. To avoid forging a career that repeats the same playlist of actions our whole life, but to be daring, try new things and reinvent ourselves.

Do everything with passion - although try and avoid the cocaine habit. And if occasionally being daring doesn't work, don't be afraid to try again.

Thursday, January 7, 2016

Today was my first day back at work, and on the long walk in, I was typically reflective about what I'd really like to see out of this year.

I was reminded this morning of something the late Leonard Nimoy said to explain the popularity of Star Trek: "I think people enjoy watching this group of characters solve problems together".

It applies to both the original Star Trek and The Next Generation. But, as the flip side to Bob Marshall's comments this morning (which really annoyed me, in case you hadn't guessed), it is also the sign of a good team: being within a group of people you enjoy solving problems with.

Last year I worked with a really good agile team - although to start with, like many teams, we had our dramas before we got sorted. Once we got there and built up that trust, it was the kind of group you wanted to remain with - which made it all the more frustrating that, just as we felt we were running well, we ran out of work for the project.

It's the kind of environment we'd all like to find ourselves in, and indeed as Leonard Nimoy stated, a good reason why people are drawn to the positivity within Star Trek. Although there were a few egos in play behind the scenes, on screen everyone works together well.

For example, you'd never find Kirk sitting in Sulu's seat going, "piloting this starship is easy Sulu - you just flip a few switches ... anyone could do it".

Likewise - although Chekov could operate the science station, it was acknowledged that Spock was better at it, and that Chekov was the better navigator.

But no-one could really stand in for Uhura.

What's interesting is that during the series there was very little politicking or in-fighting between these guys (unless possessed by an alien parasite). This is a team which was well and truly at the performing end of the forming-storming-norming-performing spectrum.

And of course when they retold the story in the 2009 Star Trek reboot, they took the whole team as newbies, and put them through trials to form, storm, norm, then finally perform. The key to performing is having respect, and typically in such movies it's a journey to earning that respect.

It sometimes seems that there is nothing more turbulent and egotistical than the music industry. I know this from having friends who've been in bands, and indeed from following the troubled life of some of my favourite bands, like The House Of Love.

The drummer gets the dirty looks and the blame if someone messes up a song - and occasionally somebody suggests it would be cheaper to replace them with a drum machine.

Egos galore - everyone thinking that it's their instrument and talent which is (or should be) dominant, and the cause of success. The hard truth is that a band's sound is a contribution of all the talent at work, not just one element ...

A Freddie Mercury song doesn't sound quite the same as a Queen song (and they certainly don't sound the same without him).

The Smiths last album didn't sound quite the same without the guitar of Johnny Marr. Likewise for The House Of Love when Terry Bickers left.

Much like for agile teams, there is no "magic line-up" for a good band. The one thing that seems to be agreed, though, is that you need some kind of variety - even an a cappella band, which has only singers, needs singers with different vocal ranges!

Rivalling the music industry for turbulence though has to be the IT industry, especially with agile groups. This really shouldn't be a surprise - the same human dynamics and egos are in play, the fact that it's a different field changes the context a little, but there are surprising similarities. As was reinforced to me this morning.

Bob Marshall generally tends to talk a lot about his values of non-violent communication when it comes to dealing with developers such as himself. However, I've found his attitude a bit frustrating, summed up best by Lisa Crispin ...

I have to say Lisa's responses really summed up a lot of frustration out there. I've noticed behaviour similar to this on Bob's Twitter before, where I've felt he's been playing the role of prima donna developer a bit too much, and has shown behaviour which has felt borderline trolling at times. This is especially jarring as he believes in non-violent communication ... but apparently only when it applies to managers telling developers what to do. Ultimately, I'm reminded of the pig Napoleon from Animal Farm, who wants to do away with manager-led dictatorships, only to replace them with a developer-centric one.

I want to be fair to Bob, because he did do a very similar Tweet about "funny how few devs/managers question the basic premise of their trade", however this was very after-the-fact, and did not include the tirade against testers which I've found a little too frequent on his Twitter of late.

I'm sorry, but I side with Lisa in complete frustration on this one. I have come up against similar attitudes before, often from developers, about how little testers contribute. And I'm getting a bit tired of having to have this argument - ultimately the best way to judge the value of testers is to try delivering to production without one, and let me know how that works out for you. Most of the cautionary tales around the IT industry tend to revolve around "we didn't test enough ... but got it into production on time". But we still seem to occasionally have someone who thinks it won't happen to them, or that "testing costs too much, and gives very little value".

What we're left with then is very much the rock star band model above. We have a small group, but big egos. We're used to the idea of the lead singer / "software manager" who thinks they run the group, and their word is law. Sadly Bob is not alone amongst the developers I've met (thankfully far from all of them): he's a guitarist who thinks the best kind of band would ONLY have people on guitars, because guitars are cool.

Sadly I feel sometimes we testers are considered to be the Spinal Tap drummer of the group, who can be easily replaced, perhaps with a machine (sound familiar?).

One problem with perceptions of testing is something I've been talking about a lot on Twitter. When I learned software engineering from a book by Michael J. Pont (I am a programmer, remember), I still remember the opening, which talked about increasing complexity in software (and this was back in the 90s). The core of being a software engineer is managing that complexity and breaking it down, further and further, until you're left with something simple.

Now - and this might be a shock - but the same is true of testing. Testing is a big, complex, amorphous heap of unknown. As testers we use experience, inquiry and investigation of every possible aspect of a piece of software and form a strategy to break this down. I was pleased to read Sami Soderblom's post which repeated this theme.

You might use a matrix or a mind map or even a test planning tool. But sooner or later you've broken it down to the simplest unit you can - a test scenario. And sadly, on seeing this simple, individual test scenario, the reaction is all too often "anyone could do that" (execute the test scenario) - without appreciating that the skill in testing is being able to apply strategy to come up with those scenarios.

Working alongside good agile teams, I'm often hit with the same comment from developers - that we test scenarios and find problems they just wouldn't have thought of. This is because knowing the strategy and application of heuristics is as skilled as mastering any subtle programming language. And many of us have learned this by using software over decades. And like any tradesman, what we do looks incredibly simple, until you have someone inexperienced try it.

But for some reason we don't see this in reverse. If a developer broke her hand, and I had to help her meet a deadline by typing the last line of code,

obj.sum = value_1 + value_2;

It could be argued that the line is very simple, and easy to understand. So maybe development isn't so hard. Heck - since I could type that line, maybe I should say we don't need developers any more if it's so simple?

Fundamentally any team like any band needs some kind of diversity of skill within it. Coding is an obvious must. Testing not so obvious - but believe me, if you have no-one skilled in testing, eventually you'll know about it, and it probably won't be pleasant.

Finally ...

I got the following response from Bob this evening ... obviously he has some super-secret recipe for success if you'd like to try it ...

Or else he's just leading a lot of his followers into dangerous waters ...

Additional reading

Since writing this article, a number of people have pointed me to similar articles they've blogged about, and I'm going to include them here ...
