Thursday, July 26, 2012

You get a lot of woo-woo hand-waving when you ask a typical large-scale organizational change manager what they would put on the executive dashboard to show that their program is effective.

"That's going to vary a lot from one company to the next," they may hedge. Or "it depends" is a popular answer, also provided in long form: "it depends on what your executives want to see." Right. Like, executives who like Baroque Art want to see more Rubens. That's helpful. Not.

A Rubens work from wikipedia. You were expecting maybe an unclothed zaftig Venus?

The dashboard is often "delayed," sometimes delayed until after the change management is over! Why?

One reason a change manager may have for hedging on the topic of the dashboard is that change is difficult. A truthful dashboard is likely to show the standard change curve, reflecting the morale of the change-ees:

http://rule-of-thumb.net/2008/09/26/the-change-curve/

Change itself is difficult, prompting the Rule of Thumb blogger to compare organizational change to Elisabeth Kübler-Ross's Five Stages of Grief. Okay, fine, a respectful moment of silence. But now get over yourself. You can and should dashboard, and just make a pinky pact before you start that you won't lose heart until you're 3-6 months into the experience. Plus "morale" should be only one of the dimensions you track.

Here's my suggestion. In addition to tracking your feelings, track how things are going in the dimensions of your business which caused you to be willing to change in the first place. Here's what your dashboard could look like, with no special tools beyond coaches who are paying attention, plus a spreadsheet. (I don't have Excel on my laptop, so this is a Google spreadsheet, as it happens). As is so often the case, "red" is bad, and "green" is good. I think if you click on the chart you can see it bigger:

A dashboard from a program that had several projects deliver to production after 6 weeks.

The columns from left to right are:

Week number

Agility (as measured by you and your team)

Quality (ditto)

Speed to market (ditto)

Transparency (ditto)

Customer morale (likely measured by survey of some kind)

Team morale (ditto)

Speed to Risk Mitigation (how quickly does the team identify risks, escalate them, and have them mitigated by executives? as measured by you and your team)

There are books written about how to measure each of these things. "What is agile," alone, is a matter for endless debate if you insist on having the whole world agree. Indeed, some agile coaches never get beyond measuring agile adoption itself. Their dashboard to show "progress of the agile transformation" ends up being columns like "how many teams have a backlog?", "how many teams do standups?", "how many teams have automated testing?", and so on.

Please--do your executives care? No, they do not. Normal executives do not want to be handing out t-shirts with "Hi, I'm AGILE!" on them after a team has done its eighth backlog grooming. What executives actually care about is delivery. Did you deliver software on time? Was it early? Was it earlier than before? What was the quality? What will be the total cost of ownership? Your little agile fad means nothing if it doesn't give you back something that your executives value.

But Elena, you say, how are you going to measure these things then? If it's not a number, it's "anecdotal," and that's not real to a numbers-oriented executive!

The answer is simple, and it can give you and your program a useful executive dashboard by tomorrow morning at Start of Business (SOB). Two steps:

1. Short term, aggregate facts. Don't wait until you can get numeric measures. For the first six months of your agile program, define agile in a way that makes sense in your context, and then collect facts about the running program. Not measures--but also not mere anecdotes. Find verifiable facts as you go and report them.

If your customers said at the end of your planning game that this was way better than waterfall requirements, and agree you can quote them on it, that's a fact you record in your "fact spreadsheet," which is the detail that underlies the executive dashboard.

If you are able to deploy new features into production after 6 weeks, when normally it would have taken 18 months to roll those features into a giant release, that's a fact.

After you have three or more substantive facts supporting the hypothesis that things are better in a certain area, you mark the dashboard "Better" for that measure, and direct your executives to the fact spreadsheet for substantiation. If you have three or more facts in the other direction, like the whole team spending an entire week arguing about the test strategy, then you move down one level, from "Better" to "Same" or from "Same" to "Worse." Again, the fact spreadsheet serves as the evidence, and it should accompany the dashboard as a tab or an attachment.

Excerpt from Executive Dashboard "supporting fact sheet"
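In fact, the "three or more facts" rule is mechanical enough to sketch in a few lines of Python. This is a toy illustration, not a real tool; the dimension names, the facts, and the threshold of three are all invented for the example:

```python
# Roll verifiable facts up into dashboard statuses using the rule above:
# every three facts in one direction moves a dimension one level.
# Dimensions and facts here are illustrative, not from a real program.

LEVELS = ["Worse", "Same", "Better"]

def dashboard_status(facts, baseline="Same"):
    """facts: list of (dimension, direction) pairs, direction +1 or -1."""
    counts = {}
    for dimension, direction in facts:
        counts.setdefault(dimension, {"up": 0, "down": 0})
        counts[dimension]["up" if direction > 0 else "down"] += 1
    result = {}
    for dimension, c in counts.items():
        level = LEVELS.index(baseline)
        level += c["up"] // 3    # every 3 supporting facts: up one level
        level -= c["down"] // 3  # every 3 contrary facts: down one level
        result[dimension] = LEVELS[max(0, min(len(LEVELS) - 1, level))]
    return result

facts = [
    ("Speed to market", +1),  # deployed to production in 6 weeks, not 18 months
    ("Speed to market", +1),  # second team matched the 6-week cadence
    ("Speed to market", +1),  # customer on record: "way better than waterfall"
    ("Quality", -1),          # team spent a week arguing about test strategy
]
print(dashboard_status(facts))
# Speed to market reaches "Better"; Quality stays "Same" (only one contrary fact)
```

The facts themselves stay in the fact spreadsheet; only the rolled-up level reaches the executive dashboard.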

2. Longer term, introduce measures. As you roll your program forward, I recommend that your definition of each column become more rigorous, and the values in the column become more specific than "Better," "Same," or "Worse." You want to be able to show how quickly the money your company spent on coaching, training, books, and brightly colored Sharpies paid for itself.

Use your agile project management software to measure how many projects are keeping track of themselves on a virtual card wall.

Analyze your code base for better quality using tools like Sonar.

Factually track how frequently you are now moving new software into production, or into a pre-production environment.

Survey your customers--are they happier? What percentage?

Survey your team--how are things going there? Using what numbers? Maybe you can start measuring morale in terms of reduced numbers of people leaving the team, if you have enough data to prove it.
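A couple of these measures fall straight out of facts you are already collecting. Here's a minimal Python sketch of two of them, release cadence and team attrition; the dates and headcounts are made up for illustration:

```python
# Two of the measures above, computed from raw facts.
# Deploy dates and team headcount are invented example data.
from datetime import date

# Release cadence: average days between production deploys.
deploys = [date(2012, 3, 2), date(2012, 4, 13), date(2012, 5, 25)]
gaps = [(later - earlier).days for earlier, later in zip(deploys, deploys[1:])]
avg_cycle = sum(gaps) / len(gaps)
print(f"Average days between production deploys: {avg_cycle:.0f}")  # 42, i.e. 6 weeks

# Team morale proxy: attrition, if you have enough data to make it meaningful.
left, headcount = 1, 12  # one departure from a team of twelve this year
attrition = 100 * left / headcount
print(f"Team attrition: {attrition:.0f}%")
```

Once numbers like these exist, they replace "Better"/"Same"/"Worse" in their columns, and the trend over time becomes the story.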

Measures are your friend. But don't let the lure of perfectly measured numeric progress absolve you of your responsibility to report to your executives with a dashboard from the first day you hit the ground.

Dashboarding is simple. The choice to make the data under the dashboard more complex is a separate business decision which requires its own costs and justifications. Don't wait, and if you're a company getting help with your transformation, don't settle for consultants who wave you away and say it's going to take years for you to see results, "once we have all the widgets up and running." Keep your program results factually charted in a Big Visible Way right from the beginning. Your executives will be grateful, and you will retain your job and log another triumph! Tally ho!

Wednesday, July 25, 2012

From the first, I had been a pretty avid Android person, partly just because I wanted to be nonconformist. But after starting out strong with the original Motorola "Droid," I fell prey to two Samsung Galaxy II phones in a row that had poor call quality. (Yes, I sailed right from "shame on you" to "shame on me" in that little transaction). I returned the first one FIVE times to AT&T before they decided it was a "lemon" and let me upgrade to the next phone up.

From att.com!

Imagine my horror when I discovered that the Samsung Galaxy S II Skyrocket had the same problem. I spent hours getting software upgrades in various AT&T stores across America, and after a replacement or two, I finally checked out the interweb and found numerous geek-oriented sites that documented something I would have loved to have known a while ago:

Samsung phones are not known for call quality, and some Galaxy IIs have a specific problem in which the speaker's voice is muffled on all calls. You can hear, but you can't be heard.

This apparently doesn't matter to 98% of the Samsung Galaxy II owners out there, probably because, like my teenage daughter, these people don't actually use their phone to "talk to people." They text and email and whatnot. I was astounded to find that a best-selling phone didn't need to be an operable, well, phone.

So in a fit of rebellion, I bought an iPhone 4S. My Apple friends smirked and welcomed me to the cult, er, fold. But right away I was severely unhappy, even though I will freely admit that:

The iPhone has perfectly good call quality.

So did my original "clamshell" phone, and it was pink. I was hoping that I could have a "smartphone" that was as smart as my Samsung Galaxy IIs had been, only also with the ability to carry a clear voice signal. So if you're about to cravenly cave in and buy an iPhone just like your neighbors on the train, here's what I learned in my roughly four weeks on the iPhone:

The iPhone screen is much smaller. If you are old, and I am, you will miss the extra real estate. I like to read electronic books when I'm on the bus or the plane, and it's handy to do so on my phone. The tiny screen made me sad. Which brings me to:

iPhone does not let you buy Kindle books directly from the iPhone Kindle application. Apple wants you to switch from Kindle to iBooks, so they have purposely hobbled the iPhone Kindle app--AND the Amazon app--so that they don't let you buy Kindle books. You have to literally bring up the amazon.com web site on your tiny, tiny screen, and buy the Kindle book from the web. It is annoying, and it's meant to be.

iPhone does not give you spoken turn-by-turn navigation from Google maps. You can buy an app to do it, but the app I found, which was the cheapest one, I admit, had a tiny 1x1 inch map and a horrible user interface. Android gives you spoken turn-by-turn navigation for free. I gather that Apple has recently announced better navigation in a future phone, but you know what, the future was 2 years ago, if you were on Android. And while we're on the topic of turn-by-turn navigation in the car,

The iPhone operating system doesn't expect you to play music while navigating. What I like to do is jump into the rental car, plug my phone into the audio system, fire up a playlist, turn on navigation, and drive. On Android, the music automatically mutes so I can hear the driving instructions yelling at me to turn left in 1000 feet. You can get this on the iPhone from a paid app, but it's weird and kind of funky.

There is no good Google mail client app for iPhone. I am not sure who is more eager to have gmail not work on iPhone--Google or Apple--but I'm not switching email addresses right now, any more than I'm going to drop my investment in Kindle books in favor of re-buying the same books from iBooks.

The much-vaunted Apple interface is not intuitive. I know Apple must know best, but really. If I get a call on an Android phone, the screen lets me choose from "Accept Call" and "Reject Call." Apple only lets me "Accept." At first I thought Apple was naively trying to control my life by making me accept all calls, but a developer friend pointed out that all you have to do is press the on-off switch to reject the call. Oh, right, that makes sense. Or how about this--I want to sort my applications by name, so they will be alphabetical. Sorry, you can't do that on the iPhone. Or maybe you can, by turning it upside-down three times and baying at the moon. How would you know? Oh wait--shipping the phone with "Instructions" would be too much packaging.

Anyway, if you are a gmail user who travels and reads Kindle books, the iPhone may not be for you. Under other circumstances it might be. There are no value judgements here. Just observations. Last week I gave my iPhone to my partner and bought a new Samsung Galaxy S3, which does every smart thing you could ever want on a huge screen AND has call quality. The end.

Monday, July 23, 2012

One of the grim proving grounds for the would-be agile business analyst (henceforth "WBABA") is the "traceability conversation." Eventually, you will have to have one. You may have seen one already. If you haven't, you may want to half-avert your eyes as you read further. It gets a little brutal. But if you close them all the way, you can't read.

WBABA: ...so in summary, we complete analysis on each story card, and then we support the developers as they build it that same iteration!

Corporate Standards Guy: but how do you do traceability in agile? You have to have traceability. It's broadly recognized as an important factor in building rigorous software systems. These software systems permeate our society and we must entrust them with lives of everyday people on a daily basis. [The last two sentences are an actual quotation from the Center of Excellence for Software Traceability website!]

WBABA: [cowed silence]

Corporate Standards Guy: right. Well, go ahead and do your little story cards, but I expect that before you start papering the war room wall in Post-It note chartreuse, you will present me with a full Business Requirements Document and Functional Requirements Document, so you can trace everything through to the Systems Architectural Diagram and on into the test cases. Show me the traceability matrix, and we're all good.

It appears as though someone has thrown a wrench into the speedy agile SDLC which will kill it altogether, or at least blow out its kneecap.

But have no fear! Unlike "pair programming," a sadly controversial practice in which the debate could go either way on any given day, "traceability" is ground on which you can beat Corporate Standards Guy down ANY day, hoisting him with his own petard! How? Why? Because...agile is actually better at traceability than waterfall. Embrace the dark side and let's take a look at that.

What is Traceability and Why Does it Matter?

All Center of Excellence hyperbole aside, "traceability" could be factually described as "the use of tracking and tracing systems and processes to match the incoming product requirements to outgoing product attributes." Why would you want to do that? Typically three positive reasons, and one negative one.

Positively:

"Forward" traceability: so you can ensure at any point that everything you require is in the plan somewhere, or, better yet, in the actual software.

"Backward" traceability: so you can ensure that everything in the software was developed for some identifiable reason, and it wasn't just developers running amok.

Keeping things tidy when requirements change, both during the project and during ongoing maintenance afterwards: if requirements change, you would like to know what the impact of the change will be--which pieces of the system will be impacted? How many of them are there?

Negatively:

So you can know who to blame: if something is missing in the product attributes. Was that missing attribute somewhere in the requirements? If not, blame the business! (Or every team's favorite culprits, the BAs). If it was there in the requirements, then figure out where along the chain the ball was dropped, and bingo! there's your culprit. Did the designer forget it? Did the developer disregard the design? Did the tester fail to notice it was working wrong? And so on.

The best way to put this is that where waterfall is a translation process, agile is a refinement process.

Translation: As suggested in the illustration above, a waterfall software development process takes a list of requirements from the customer, which may or may not be well explained, translates them into "official business requirements" language, then into "official functional requirements language," then into "design elements," then into "software components," then into "documentation chapters." This process is exactly analogous to passing a requirement through a succession of online translations:

Original: "I need a drop-down to select toppings on the web site for my pizza store."

Suddenly, the pepperoni has a seat at the monitor. In which language did the customer get throttled by his own pizza? We don't know now, but give us a traceability matrix showing word for word translations at each step, and we'll figure it out.

"That's absurd!" you say. But is it absurd? How does a customer follow the requirements inventory as it progresses through the development life cycle? Would elements written in Haitian Creole be any harder for her to understand than page 37 of the Functional Requirements Document?

Refinement: In an agile process, requirements are refined in place, not translated. In fact, for purposes of forward and backward traceability, you literally build out a requirements matrix from the beginning, and develop in a way that keeps traceability intact. What is this matrix? In agile, it is called "the backlog." Each backlog item (called "a story") is a description of a piece of business functionality which the system will provide, described from the point of view of a person doing something in particular with the software. So these steps look more like this:

Planning: the whole team gets together and builds a high-level model of the drop-down for the pizza store page, including a frank discussion about the power struggle between people and food, and the need for people to win.

Story splitting: since adding a drop-down to the page is a complex task that would take more than 2 weeks, the story would be split into smaller stories. Let's say in this case that story 1 is "add the drop-down control to the screen with one choice, pepperoni", story 2 is "get everyone in the store to agree on what the rest of the toppings are, and put those choices in as well," and story 3 is "subdue the rebelling garnishes."

In iteration 1, we might tackle the garnish rebellion, defining success as "pepperoni does not wield weapons observably in the kitchen." We might also have a developer add a drop-down control on the web page showing pepperoni to customers as a thing they can order on their pizza, defining success as getting calls into the pizza store from 10 people who confirmed they had visited the web site and identified pepperoni pizza as something they could buy.

In iteration 2, we might spend the whole time getting a gigantic multi-lingual kitchen staff to agree to add pineapple, canned corn, and Spam as additional ingredients, and refine the system test criteria to say that customers should identify all four of these as choices from which they were picking.

Note that as we go, traceability doesn't need to be maintained as a separate activity, because each system feature is being built out from general description into detailed implementation, and okayed as "complete" when it meets pre-agreed acceptance criteria, with the customer herself nodding in agreement. If requirements change, new stories are introduced to override behavior no longer needed on the page (for example, sales of Spam and pineapple might be lower than expected), and through grouping in the requirements repository, stories related to the drop-down can be viewed in sequential order, to show that first one thing was implemented, then another.
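To make the claim concrete, here is a toy Python sketch of the pizza backlog. The story IDs and fields are invented for illustration; any real backlog tool stores richer versions of the same information, which is exactly why forward and backward tracing come for free:

```python
# A toy backlog with traceability built in: each story carries its feature
# grouping, its status, and its iteration, so tracing is just a query.
# Story IDs and fields are invented for this example.
backlog = [
    {"id": 1, "feature": "toppings drop-down",
     "story": "add drop-down with one choice, pepperoni",
     "accepted": True, "iteration": 1},
    {"id": 2, "feature": "toppings drop-down",
     "story": "agree on remaining toppings and add them",
     "accepted": True, "iteration": 2},
    {"id": 3, "feature": "toppings drop-down",
     "story": "subdue the rebelling garnishes",
     "accepted": False, "iteration": None},
]

def forward_trace(feature):
    """Forward traceability: everything required is in the plan or the software."""
    return [(s["id"], s["accepted"]) for s in backlog if s["feature"] == feature]

def impact_of_change(feature):
    """Change impact: which delivered stories a requirements change would touch."""
    return [s["id"] for s in backlog if s["feature"] == feature and s["accepted"]]

print(forward_trace("toppings drop-down"))    # [(1, True), (2, True), (3, False)]
print(impact_of_change("toppings drop-down"))  # [1, 2]
```

Backward traceability is the same query read the other way: every line of delivered software hangs off a story, and every story names the business need it serves.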

Fans of the classic Frances Hodgson Burnett story for girls, A Little Princess, will remember the point in that book where Sara Crewe tries to explain to the evil Miss Minchin that she doesn't need to learn French. Miss Minchin overrides her and gives Sara a basic vocabulary book so she can get started. When the French master arrives at the school, Sara conducts a lengthy conversation with him--in French--and he exclaims, "Madame, Sara does not need to learn French because she IS French!"

Agile is like that. You don't need to expend effort in agile developing a traceability matrix, because the agile requirements repository IS a traceability matrix at its core.

Sunday, July 15, 2012

Many agilists do not deal well with corporate hierarchy. "In the agile world," they say smugly, "we will all be equal." Quick quiz: although the venerable "pigs and chickens" metaphor is now sliding out of favor among the agile cool kids, which of the following are agile transformation coaches still very likely to do:

Characterize the direct line managers of team members as "people to keep out of the room at all costs, and if they must be there, to be tolerated only if they maintain a respectful silence."

Describe the entire group between CEO and the team doing the work as "middle managers--the people I wish didn't exist."

Make cruel and dismissive comments about "command and control."

Wear jeans to work every day, not just on Fridays, in violation of the corporate summer dress code.

Yes, sadly, the correct answer is "all of the above," although item 4 is just a passive-aggressive manifestation of the first three.

But hostility toward management is not a good idea in a corporate agile transformation. In fact, if your goal is to succeed, and not merely to seem "agile cool," the entire hierarchy can and should be your staunchest friend and ally. Pragmatic coaches need to start every coaching engagement by establishing a sensible escalation plan for good news and for bad.

Goofily, this comes from a D&D site: http://rolesrules.blogspot.com/2012/03/high-level-d-combat-general-escalation.html

What do you need?

Demonstrated executive cover. The executive at the very top of the organization you wish to transform should have publicly stated that she supports the change, and she should be willing to push the concept down through her chain of command, even visiting people's offices to say things like "how's the agile transformation going, Gary?"

Demonstrated incentives for line management to support your change, and disincentives for them to thwart the change. In a large company, you should expect management at every level to be skeptical of change. They've been through TQM, Six Sigma, and any number of other fads. They don't care how cool you are. They care about results they can use to advance their careers. What is in it for them? The answer may be as simple as "top executive management has called this a priority." I hope the answer is also that you are measuring your project's success in terms of improvements to the business bottom line. Every manager benefits from being able to brag that their projects are faster, higher quality, more predictable, better for the team, and better for the customer.

Executive attendance at project kickoffs and other important events. Executives from the IT and business organizations should attend your project kickoff (you do have a project kickoff, don't you?). This may not be executives at the very top, but it could be the "Garys," the people at the top of your line of business or vertical, if not at the top of the whole tree. The Garys should explain the strategic importance of the project from their respective points of view, and they should explain why it is important for the team to be tackling the project in an agile manner, if agile is new to the environment.

An escalation protocol for how the team should react to bad experiences with their coach. Because you are the consultant, you should start by offering the protocol for the team's escalations regarding the coach or coaches. Team members with issues should always start by bringing the issue to the coach's attention directly. If the problem can't be resolved there, the team should know who the coach's boss is, and be prepared to let the coach know that the boss is going to be pulled in. If there is a hierarchy of some kind over the coach (manager of coaches, reporting to director of transformation program, reporting to transformation executive sponsor, reporting to the CEO, or what-have-you), those lines should be established right from the start. At every stage, you should agree that if the team member does not get satisfaction, they should let the person at the current level know that they will be escalating to the next level.

Same protocol for coaches with team issues. Inevitably in a corporate change program, the change-ees may be unwilling or unable to move forward on a new philosophy or technique that the change-er (the coach) thinks is important. You must establish before you start the coaching that when this happens, chain of command will be observed. The coach will work first directly with the problematic party, then with their manager, and then all the way up the chain to the executive sponsor. The executive who decided that they want an agile organization must ultimately determine whether the coach's viewpoint should be enforced on teams or not.

Or to put it another way, precedent-setting decisions about the philosophy and practice belonging to your organizational change should not be made locally by the team. This is not "Lord of the Flies." Someone is investing heavily in paying coach salaries, or you wouldn't be there. The person controlling that budget needs to ensure that the coaches are behaving, and that problems arising on teams can be resolved appropriately for the whole program, not just at the preference of the local team members.

I recommend that you understand what you need for an escalation path, and that you put together, yes, that anti-agile concept, a written statement of understanding of how you will escalate as needed, for every team being coached. Name names and put time frames into this document.

But Elena, you say, this is pretty freaking negative. Okay, yes, in a way, it is. That's why I started by saying that you need your escalation path not just for bad news, but for good. Establish metrics showing success in a way your management chain can brag about, and make sure you pump those positive messages up your escalation chain at every appropriate opportunity (without spamming people. Management hates spam).

Robert Frost's wall, from http://en.wikipedia.org/wiki/File:Mendingwall.JPG

In Robert Frost's poem "Mending Wall," the narrator's stodgy neighbor intones, "Good fences make good neighbors." Like Frost, of course, we all know that good neighbors make good neighbors, and your agile transformation will either succeed or fail primarily due to the good will you show and the good work your teams perform. But let's build up the fences as an enabling step to breaking them down. There's a paradox here: this contract for escalation actually enables and empowers communication within your team, across the organization, and up the hierarchy.

About Me

Elena is a Principal Business Architect for ThoughtWorks, London. In this capacity, she focuses on transforming business architecture to better support digitally enabled retail clients. Prior to ThoughtWorks, Elena was a Program Manager and Chief Agilist for the Treasury Services vertical at JPMorgan Chase, followed by projects which measurably improved scalability and productivity in IT processes for the Corporate and Investment Bank (CIB) and the Consumer and Community Bank (CCB). In addition to business architecture, Elena’s areas of professional interest are value chain mapping, change management, and non-annoying IT productivity strategy and measurement tactics.