Have you ever worked (or are you currently working) in an organisation with any Positive Discrimination policies? Where, for example, there is a stated aim to have 25% of the board be female, or 30% of the workforce be from ethnic groups that are not the majority ethnic group in your geographic location? How do you feel about that? Is positive discrimination a good thing or a bad thing? I can’t decide.

{Big Caveat! Before anyone wants to give me the same sort of hassle as a tiny few did recently over a related post, note that I am just wondering aloud and whilst I encourage comments and feedback, I reserve the right to block or delete any comments that I feel are abusive or discriminatory or simply from the unhinged. Just saying. Also I am mostly going to reference women as the aim for positive discrimination, as the blog got really untidy when I swapped between different types of discrimination. I apologise if anyone is offended by that – it is not intended.}

I don’t think I’ve ever been comfortable with the concept of positive discrimination and if I wind back the clock to my early 20s, back then I was quite angrily dead set against it – on the grounds that it is still discrimination. It seemed to me then that it was a simple yin/yang concept. If discrimination is wrong, it’s wrong, and “positive” discrimination is in fact just discrimination against the majority. Wrong is wrong. Stealing is wrong, be it from the poor or the rich or from organisations. All those post-it notes I’ve stolen over the years? Bad Martin.

So what has changed about my opinion? Well, I think that as we all get older we tend to consider the wider picture more and become less black/white about most of our philosophies {my personal opinion is that those who don’t modify their opinions in light of more experience and greater thought are, well, not maturing}. I can’t help but accept that the business/IT workplace as a whole is male-dominated and riddled with sexism. This does not mean *at all* that all or even most men in business/IT are sexist, but the statistics, studies and countless personal experiences make it clear that the pay, success and respect of women are adversely impacted.
A way to counteract that is to encourage more women to work in IT (or science, or whichever area they are under-represented in) and show that they are just as effective in senior positions, by tipping the balance in their favour. Positive discrimination is one way of doing that. Is the small evil of this type of discrimination acceptable if it first counteracts, and then helps overturn, the large evil of the massive inequalities we currently have? Once equality is there (or you are at least approaching it) you drop the little evil of positive discrimination? But how else do you balance the books until the issue has been addressed? My own perception is that sexism and racism at least are reduced from what they were when I first started working; maybe positive discrimination is a significant factor in that? Maybe it is more that society has shifted?

Part of me likes the Women In Technology {try searching the hashtag #WIT, but you get loads of things labelled as “witty” as well} events and discussions, such as those supported in the Oracle sphere by Kellyn PotVin-Gorman and Debra Lilley amongst others. I much prefer to have a balanced workforce. But when I’ve been to a talk about it or seen online discussions, there often seems to be an element of “we hate men” or “all men are out to put us down” that, frankly, insults me. In fairness, I’ve also seen that element questioned or stopped by the female moderators, so I know they are aware of the problem of Men Bashing. After all, for reasons I have gone into in a prior post, as a small man I empathise with some of their issues – so to be told all men are the problem is both personally an affront and also… yes, discrimination. I should not have to feel I need to justify my own non-sexism, but I do – my hiring and promotion history demonstrates I treat both sexes as equal. If I think you are rubbish at your job, it has nothing to do with how many X chromosomes you have.

I mentioned above “the little evil of positive discrimination” and that is certainly how I see it. I think of it as wrong not just because of the simplistic yin/yang take on right and wrong, but because positive discrimination can have negative effects. Forcing a percentage of the workforce or management to be from a specified group means you are potentially not hiring the best candidates, or putting the less capable into those positions. If your workforce is 10% female, not at all unusual in IT, then it is unlikely the best candidates for management are 25% female. They might be; it might even be that 40% of them are female, as they have managed to demonstrate their capabilities and stick with the industry despite any extra challenges faced. But to enforce an arbitrary percentage strikes me as problematic. Another issue is that of perceived unfair advantage or protection. How would any of us feel if we did not get a job or position because someone else got it on the basis of their sex, colour or disability, to fulfil a quota? People are often bad-tempered enough when they fail to get what they want. Overall, I think positive discrimination leads to a level of unease or resentment in the larger group not being aided. NOTE – I mean on average. I do not mean that everyone (or even most) feels resentment, and those who do vary in how much it upsets them.

I know a few people, including myself, who have hit big problems when disciplining or even sacking someone who is not a white male. I’ve had HR say to me “we are going to have to be very careful with this as they are {not-white-male}”. I asked the direct question: would this be easier if the person was a white male? And they said, frankly, yes. It’s hard not to let that get your back up. I’ve seen this make someone I felt was pretty liberal and balanced become quite bigoted. That is positive discrimination being a little evil and having exactly the opposite effect to the one intended. That HR department was, in my opinion, getting it wrong – but I’ve heard so many similar stories that I suspect it is the same in most HR departments across the UK, US and maybe Europe too. I can’t speak about other places.

I know a few women who are also very uncomfortable with positive discrimination, as it makes them feel that either they got something not on the basis of their own abilities, or that others looking in see it that way.

I’ve occasionally seen the disparity in numbers treated as a positive – I knew a lady at college who loved the fact she was one of only 3 women out of just over a hundred people in her year doing a degree in Computer Science. I was chatting to her {at a sci-fi society evening, where she was also markedly out-numbered by the opposite sex} about how it must be daunting. She laughed at me in scorn – it was great! She said she stuck out and so got better responses when she asked questions in lectures; she had no trouble getting help from the over-worked tutors, as they were keen to be seen to not be discriminatory; and, as you mostly “met people” via your course or your societies, she pretty much had her pick of a hundred-plus men. That told me.

So all in all, I still do not know if I am for or against positive discrimination. I guess I just wish it was not necessary. If there really was no discrimination, we would not question how many female, black, Asian, disabled, short, fat, ginger or Protestant people there were doing whatever we do.

{sorry for the lack of humour this week, I just struggled to squeeze it into such a delicate topic}

How do I know if Dave is doing his job properly? If I am his (or her*) manager, what techniques can I use to ensure I am getting my pound of flesh out of this worker drone in return for the exorbitant salary my company puts into said drone’s bank account each month?

Well, as a start there is my last Friday Philosophy, all about deducing someone’s work profile via auditory analysis of input devices (i.e. how fast Dave is typing) :-) I have to say, the response to that topic has been very good; I’ve had a few chats with people about it and got some interesting comments on the blog article itself. My blog hits went Ping :-)

However, I have a confession to make. I have a “history” in respect of keyboards and management of staff. Maybe one of my old colleagues will comment to confirm this, but I used to regularly walk into an office full of “my people” and bark “Type faster you B*****ds! I don’t care what it is you are doing, I just want to see those fingers flying over the keyboard!”. They all knew to ignore me, this was just one example of my pathetic sense of humour. In some ways, I was never a very good manager as I was just a bit too juvenile, irreverent and non-managerial.

I was being ironic and they knew it. I had no time for many of the Management Easy Options you so often come across in organisations, which are apparently used to help ensure the staff are working hard. What do I mean by Management Easy Options? I’ll cover a few.

You have to be at your desk for at least 8 hours.

At Your Desk. Because if you are at your desk you are working, of course. And if you are not at your desk, you are not working. Hours at the desk apparently equate to productivity. So a Management Easy Option is to insist all your staff are seen to be in the office and at their desk for as long as, and preferably longer than, the average time across all staff. And that is partly why, in dysfunctional companies, staff are in the office so long. As lots of managers want to demonstrate that they are “good managers” by having their staff “productive” at their desks, their staff will be there longer than average… which pushes up the average… so they keep the staff there longer… *sigh*

I could spend a few pages on the academic and psychological studies that disprove the above nonsense about 8 hours of productive work – but we all know it is nonsense anyway. We talk about it at lunch or in the pub. If you are stuck at your desk longer than you can concentrate, you do other stuff that is hard to distinguish from work. Or you do poor work. WE ALL KNOW THIS, so why does this myth about hours-at-desk continue? What happens to some managers’ brains such that they start managing and soon stop knowing this?!?

As a self-employed worker in the London IT market, I often get given a contract to sign that specifies I must do a professional working day that “consists of 8 hours minimum each day”. For the last 5 or 6 years I have always crossed out that clause, or altered it to say “8 hours maximum”, or replaced it with what I feel should be the real clause, which is:

A professional working day, which is to, on average across a week, match or exceed the requirements of my manager for a day’s productivity.

If I am being asked to work a Professional Working Day then to me that means I have to achieve a day’s worth of benefit to the company for each day paid to me. Whether that takes me 8 hours or 6 or 9 is immaterial. As a Professional I will, on average, each day, keep my manager happy that I am worth employing. If that involves 6 hours of extra work one day between 8pm and 2am, fine. But do not expect 8 hours the next day. If my manager is not happy, then you ask me to go and I will go. It really is as simple as that.

{honesty forces me to admit that at present, for the first time in years, I have that 40 hour clause in place. Because I am doing a role for a friend, and I did not want to cause a fuss by objecting to the clause. But if management ever refer to the clause, my friend knows I will simply thank management for their time to date – and I’ll be going now}.

I drifted into my own world there, but the point I really wanted to make is that hours spent at the desk in no way indicate if the job is being done. We all know that, all the managers know that (well, they will if they are any good). Some people can be at their desk 10 hours a day and, frankly, it would help the company if they were not! Other people are at their desk but spend a huge slice of the time on the web or Instant Messaging or *cough* writing blogs.

You have to be in the office.

If you are at home, you will be goofing off.
So what does the above say about the manager, if that is their opinion? If *they* were at home they would goof off, so therefore their staff will too? Of course working from home has other considerations, such as it is only possible if your role allows you to spend some days not physically doing things in the office (pressing reset buttons on boxes? Making tea for the team?) and you are in the office enough to build and maintain proper bridges with your colleagues. I also think working from home is a privilege to earn and not a right, as some people really are incapable of working from home. I had a role a while back where, when one chap was “working from home”, he was actually doing all sorts of things – but his smartphone was set up to fake an online presence. He was incapable of working from home.

But in IT there really is not a need for many of us to spend all that time and unpleasantness commuting and some tasks really are done more efficiently if people can’t keep coming up to your desk and demanding their personal priorities really are your priorities too (which usually equates to they are in it up to their necks and you can dig them out).

Enforce a Clean Desk policy.

Now, there are things that should never be left on your desk: financial information, personal information (like people’s CVs or annual reviews), management information (salary reviews, plans to axe 22% of the workforce, stuff like that). But I have no time at all for the argument that a clean desk looks more professional. It does not look more professional; that is just weaselly, lying balls. It looks more like someone has implemented a draconian clean-desk policy, and any sign of the desk occupants being human is of no consideration.

If you walk into an office with 300 utterly clean desks, it looks like a soul-less, bitter and degrading place to slave away.

You walk into an office and you see pictures of offspring & partners, little toys (not my thing but some people like to have the gonk their boy/girlfriend gave them) and that’s just fine.

Yeah, if Malcolm has a pile of 237 Diet Coke cans in a pyramid on his desk, that is not so hot, but as a manager it is your job to go and tell Malcolm to recycle those damn cans. And those of us who work in Clean Desk environments all know we spend a few minutes each morning pulling stuff out of our pedestals and a few minutes each evening chucking it all back in there. Great use of time, oh management clean-desk police. So the Management Easy Option is to make everyone remove all signs of humanity and *also* waste time moving all useful things off their desks each evening and dragging them out each morning, rather than occasionally checking what people leave on their desks and, when Cherry has left details of the latest dodgy plan to hide things from the FDA on her desk, giving her a seriously hard talking-to.

In one job I did not have a desk pedestal; I had a locker – “Over There”, on the other side of the office, where my first allotted desk was. It took two or three trips each morning, and again at the end of each day, to sort out my stuff and keep my desk “clean”. At least I docked it off the 8-hour day…

So, having moaned about a few of these Management Easy Options that, in my opinion, are detrimental – how do you ensure Dave is productive? Now, this is a complex and challenging idea and I am not sure some managers will understand it. But the way you can tell if Dave is productive is that…

He Does His Job.

He completes the tasks assigned to him in a reasonable time frame, or informs you of the reasons why the tasks are taking longer. If Dave’s role includes scooping up issues and solving them autonomously, you know Dave is doing his job because the end users are not screaming at you. In fact, if as a manager you are barely aware of Dave existing, either he is doing his job exceedingly well or you employed him to do a non-existent job (so more fool you). The bottom line is that, as Dave’s manager, your job is to aid Dave in doing his job, help him overcome obstacles and track that his tasks are done, i.e. be a proper manager, not rule by Management Easy Options.

Bottom line, to get back to my first paragraph or two: it matters not one jot how fast Dave types. If (s)he is in the office for the meetings and any core hours needed, fine. So long as a member of staff is not doing things that negatively impact their ability to do their job, or the ability of those around them to do theirs, there are few blanket rules that help. All those Management Easy Options simply exist to cover the backsides of poor managers and satisfy the desire for control that comes from HR and upper management. Neither of whom *ever* abides by the rules they lay down on others.

Break free! Type slowly! Put a picture of Debbie Harry on your desk. Work from home and Go Crazy spending an hour in the afternoon combing the dog. Just make sure you do your job. In my book, that makes you worth your pay. Is it really so hard to manage people in that way?!?

(*) I have yet to meet a lady called Dave, but Dave is simply my generic name for someone working in IT. No real Dave is implied. But both sexes are.

You know how it goes. You get a call/mail/text with something along the lines of “I need to know all the details of customer orders placed on Tuesday 7th by customers based in Botswana – and I need it ASAP, by end of play today at the latest”. So you skip lunch, drop that task you have been trying to get around to doing all week and work out how to resolve the issue that has just been dropped on you. It takes a lot of effort and you finally get it sorted out around an hour after you told your girlfriend/boyfriend/cat you would be leaving the office that day – and mail it off to the requestor. You might even call them to let them know it is done, but oddly they don’t answer.

Next day, you see the guy who wanted this urgent request and ask if it was what they wanted “Oh, I have not looked at it yet – but thanks for doing it.”

NO! “Thanks” does not work in this situation. I’d have more respect for this guy if he laughed at me and said “got you again, sucker”. Many of you know what I mean, don’t you – if you are in a support-type role, this can be a big part of your life.

I had a job years back that seemed to consist of 90% such tasks. I was the development DBA team leader, responsible for testing, validating and promoting code to production. Everyone’s changes were Urgency Level 1, to be done as an emergency release, and many could not be put in place until after 5pm. I’d be sat there at 18:30 in a massive but virtually empty office, applying changes along with one or two of my guys. Everyone else had gone home. This was not once or twice a month; it was 4 or 5 times a week. What are you to do?

Well, I came up with one tactic that seemed to work pretty well.

Anyone who asked for an emergency change had to be there, on site, available when the change was done.
There were of course cries of protest, and people stated it was ridiculous that they had to be there, they were not needed, the change had been tested thoroughly {oh, how I laughed at that – a thoroughly tested “emergency” change, huh?}. No, I replied, you have to be there in case it goes wrong, as it’s your system, your data and, frankly, your emergency. If it is not urgent enough for you – the person wanting it done – to be inconvenienced, well, it sure as hell is not urgent enough to inconvenience me. “You can call if there are problems” – what, after you have escaped the locality? Maybe turned off your phone? And if I do get hold of you, I have to wait for you to come back in? No no no. Urgent emergency now equates to presence in the office. After all, I’ll be there.

I stuck to my rule. If the requester could not be bothered to stay, I downgraded the request to “Planned” and put it through the CAB process. If the requester dumped on one of their team and made them stay, I mentally marked them half a point down and factored it in for the next emergency.

The change was remarkable. I was no longer in the office on my own every evening. I was not there with someone else either. I was simply not there, because once you make the emergency a little bit inconvenient to the requester, it magically stops being an emergency.

There was another change: fewer cock-ups. Seeing as these changes now went through the CAB process and got slightly more testing {like, some testing}, the duff changes were more likely to be detected before they caused damage. My bosses went from regarding me as “not a team player” to “not a team player – but we kind of get your point now”.

So my advice is, if someone wants to try and make something your emergency, find some way of making sure it remains inconvenient to them. If they are willing to put up with the inconvenience, then it is a real emergency and you need to crack on with it.
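Purely as an illustration, the triage rule above could be sketched as a tiny function. None of this is from any real change-management tool; the function and field names are my own invention:

```python
# Hypothetical sketch of the "emergency change" triage rule described above:
# if the requester won't stay on site, it isn't really an emergency, so it
# gets downgraded to a planned change that goes through the normal CAB process.

def triage_change(request):
    """Downgrade an 'emergency' change unless the requester will be present."""
    if request["urgency"] == "emergency" and not request["requester_on_site"]:
        request["urgency"] = "planned"  # off to the CAB queue it goes
    return request

change = {"urgency": "emergency", "requester_on_site": False}
print(triage_change(change)["urgency"])  # planned
```

The point of the rule is in that single `if`: the cost of the emergency is shifted back onto the person declaring it.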

If you manage people, it helps if they don’t dislike you. Sadly, this can be the default starting opinion for some people who have never been managers (we all know someone who “has never had a decent manager, they are all bloody idiots”). Frozen dairy products might be a route to easing this situation.

I mention this as we in the UK are having an unusually warm start to autumn, an Indian Summer as we call it. I used to work in a place that had an on-site cafe and a nice area outside to sit. If the weather was warm and I knew my team was not facing some crisis, I would occasionally pop my head around the door and announce “Team Ice-Cream!”. Anyone who wished could come down with me and I would buy them an ice-cream of their choice and we would sit out in the sun for 15 minutes and talk rubbish.

I’ve done similar in other situations. Taking the guys to the pub is the obvious one and it usually is appreciated, but in some ways it is less successful. I think this is because people will come to the pub because they want a pint, and will put up with any idiot willing to provide a pint of Fosters {why is it that so many of the “all managers are idiots” brigade drink some brand of nasty lager?}. People will come for a tea/coffee or an ice-cream only if they are at least ambivalent towards the provider. If you really dislike someone, who cares about an ice-cream? The serious malcontents will stay away, and this helps identify people who really are not happy with you {so you can beat them mercilessly, of course – or, if you’ve progressed beyond the school-yard, put some thought into why they are unhappy and what to do about it}.

By the way, this is very different to everyone going to the pub/restaurant in the evening and spending hours telling people what “you really think” and trying to impress Jessica the new trainee/intern. Such team building events generally need much more planning.

Buying people an ice-cream (or coffee or whatever) is a cheap bribe – should you resort to such shallow tactics to make people like you? Well, it’s only a cheap bribe, as I said above. The trick is that it has to be {almost} spontaneous, such that the team are not expecting it, and not all the time. I’m not sure the teams I have done this for have always appreciated that I made special efforts to do this either after a hard period of work or when there had been some discontent within the team (people fall out, they argue, and it impacts the rest of the team). The way I look at it, it also has to be a team thing and not an individual thing, as the sitting around talking rubbish is a key part of the team being a team. Even if it is just over a cup of nasty coffee in the basement – that particular company’s canteen was not the best.

Oh, I should mention that I have access to a wife who makes wonderful cakes. Left-over cake is a brilliant “team ice-cream” substitute; it is both “cheap”, so not a bribe, and appreciated, as someone put effort in. My wife, in this case. I never claim I made the cake. Well, not often.

TeddyBear Picnic Cake

So, that’s the carrot. What about the stick?

When it comes down to it, as a manager you are there to guide the team and the individuals in it and get the best you can out of them. Not being disliked is important but you are not there to be their friend either. If someone transgresses, you need to correct them.

In my opinion, one of the very worst things a manager can do is dress down a member of their staff in public. That is not correcting them; that is either an attempt to humiliate them or an attempt by the boss to shift the blame onto a subordinate. Neither is morally correct, and both are highly likely to engender considerable dislike or even hatred.

I distinctly remember one situation where I was in a team meeting and the boss’s managers came in and wanted to know why a recent change had gone so badly wrong. The manager’s response was immediate, he picked one of the team and said something like “It was him, he didn’t test the change properly”. It was so obvious that the sub-text was “it was not my fault”. In reality the sacrificed staff member was not at fault – but at that point the boss sure as heck was. A manager gets paid more as a boss and part of the reason is that you take both the credit and the blame for your team’s efforts. This action by that boss did not make us scared of failing and thus work harder, it made us distrust the man and demoralised us.

Sadly it is something I’ve seen a lot over the years and never by what I would call a good manager. I just don’t understand why these people think a public dressing down is going to inspire the target or the audience to work more effectively.

If I’m in the situation where, in a meeting or discussion, it becomes obvious one of my guys has screwed up, we discuss how to sort it out as a team. Then after the meeting the transgressor and I have a private conversation. This has several benefits:

I am not publicly humiliating them or scoring points in front of a crowd.

Neither of us is playing to the crowd, so we are both more likely to be honest.

Things can be said that stay private. I’ve had team members mess things up because they have more important issues on their mind that they are uncomfortable with the team knowing about. At the other end of the spectrum I’ve had to tell a guy this is chance #last and the next step is disciplinary.

This never happens, but there is a very small theoretical chance I could have misunderstood and, in fact, it’s my fault. I’m lying of course, this has happened several times. You look a right idiot if you attempt to dress someone down in public and it turns out to be you.

As I said, that last point has happened to me as a Boss occasionally. I’ve also experienced that last point from the other side as well. In a large meeting I had a board member pushing me as to why we had not finished a project on the date I promised. I kept giving vague answers about “other things coming up” and it would be done by a new, given date. She would not let it go though so eventually I had to say “It is late because you told me to do other stuff as top priority, I raised this project and you told me to delay it. So it is late because you changed the priorities. That would make it your responsibility.”

She was very, very angry but it had been her choice to do this publicly. At least she wrapped up the meeting fairly calmly before dragging me into her office to shout at me and I had the chance to really tell her what I thought of her bullying style in meetings. We got on a lot better after that. I think I bought her an ice-cream.

All this boils down to – Reward the team in public. Chastise the individual in private.

In my previous post I asked the question “why doesn’t Agile work?”. I’m not sure the nuance of the question came over correctly.

I’d just like to highlight that the question I asked was “Why does Agile not work?”. It was not “Why is Agile rubbish?”. I’ve said a few times in the past couple of weeks that I like the ideology of Agile and I am (and have been for years and years) a strong proponent of prototyping, cyclic development, test-driven design and many other things that are part of the Agile or XP methodologies.

That distinction in the title is a really important one, and one I’d hoped I’d made clear in my post. Looking back at my post, though, I think it is clear I failed :-(. I highlighted reasons why I think Agile does not work and in my head I was thinking “if we avoid these, Agile could work” – but when you write something down, it does not matter what is in your head if it does not reach the paper.

I’m actually frustrated that in the last few years I have not seen Agile really succeed, and that this seems to be the normal situation, judging by the responses you get when the topic of Agile comes up with fellow technicians and by the comments on my own blog.

However, on that post about Agile, two people whose opinions I deeply respect came back at me to say “Agile does work!”: Cary Millsap, who many of you will have heard of as the “Method R” guy and the person behind Oracle’s Optimal Flexible Architecture, and Mike Cox, who most of you won’t have heard of, but Mike taught me a lot about sensible development back in the 90s. He’s one of the best developers I have ever had the pleasure of working with and I know he has had great success with Agile and RAD. I’m not sure if they read my post as “Agile is Rubbish” or if they are, like me, simply frustrated that it can work but so often does not.

So I’ve been thinking about this a lot this weekend, and I was helped by Cary’s paper on the topic that he mentioned in his comment. I’d highly recommend downloading it, as it is an excellent description not only of why Agile can help but also of how, and of some of the pitfalls {I’d started my own post on that, but go read Cary’s}. I should add, you can see Cary present his case for Agile at the UKOUG conference this year.

So where does this bring me to? Well, I think “Is Agile good or bad” has become almost an “IT religion” topic; people love it or loathe it, based on what they have seen of the methodology in real life. No, that’s wrong: based on what they have seen *labelled* with that methodology in real life. Or worse, based on the anecdotal opinions of those around them. The thing is, if you look at what XP is supposed to consist of, or what Agile programming is supposed to consist of, most of us would agree that a great deal of it makes sense in many situations. I’d disagree with some of the details in Cary’s paper, but overall I’m in strong agreement. Sadly, what Agile and XP are supposed to be is not well matched by what you see on the ground in most cases. So even if these methodologies are right for the situation, what has been implemented is probably not the methodology but more a slap-dash process that simply jettisons documentation, design and proper testing. This whole thread sprung from my lamenting the demise of database design, and several of the comments highlighted that the introduction of Agile seemed to equate, at least in part, with the demise of design. As Mike and Cary say, and as I think anyone who has successfully utilised Agile would say, design is an integral part of Agile and XP methodology.

Agile can and does work. But many things can and do work, such as taking regular exercise to keep healthy or regularly maintaining your house to keep it weathertight. Like Agile, both take effort but the overall benefit is greater than the cost. And like Agile, do it wrong and you can make things worse. If your window frames are starting to rot and you just slap a new layer of top-coat on them all you will do is seal in the damp and rot and hide the problem – until the glass falls out. Going for a regular 5 mile run is good for you – but not if you are 10 stone (roughly 64kg) overweight and have not run in years. A 5 mile run is also not a good idea if you want to be a long-jumper. Right training (methodology) for the right aim. Also, just like keeping healthy, house maintenance or anything that takes effort but works, proponents tend towards extremism – probably as a reaction to the constant {perceived} pig-headedness of critics or the failure of people to just do what now seems so sensible to them {think reformed smokers}. I’ll have to buy Cary and Mike pints to make up for that jibe now, and promise them it was not aimed at them personally…

Sadly, the reality is that, in my experience, Agile does not work 90% of the time it is tried. So does that mean Agile is actually rubbish? Or at least not fit for purpose, because many companies are not able to use it? Companies exist to achieve something, and their IT systems are part of achieving that something. If Agile cannot aid an IT department then Agile is the wrong way for that department and company.

*sigh* I’ve gone on and on about this and still not got to my own main point, which is this.

- Can we identify reasons for Agile and XP failing?
- Having identified the reasons, can we fix them in simple ways?
- Can we create some simple guidelines as to when a project should be more Agile and when it should be more up-front design?

I’m going to say right here at the start that I like much of what is in Agile; for many, many years I’ve used aspects of Rapid Application Development {which Agile seems to have borrowed from extensively} to great success. However, after my post last week on database design, many of the comments were quite negative about Agile – and I had not even mentioned it in my post!

To nail my flag to the post though, I have not yet seen an Agile-managed project that gave me confidence Agile itself was really helping to produce a better product, or a product more quickly, and most certainly not a final system that was going to be easy to maintain. Bring up the topic of Agile with other experienced IT people and I would estimate 90% of the feedback is negative.

That last point about ongoing maintenance of the system is the killer one for me. On the last few projects I have been on where the culture was Agile-fixated I just constantly had this little voice in my head going:

“How is anyone going to know why you did that in six months? You’ve just bolted that onto the side of the design like a kludge and it really is a kludge. When you just said in the standup meeting that we will address that issue ‘later’, is that the same “later” that accounts for the other half-dozen issues that seem to have been forgotten?”.

From what I can determine after the fact, that voice turns out to be reason screaming out against insanity. A major reason Agile fails is that it is implemented in a way that has no consideration for post-implementation.

Agile, as it is often implemented, is all about a headlong rush to get the job done super-quick: ignore all distractions, work harder, be completely focused and be smarter. Those who impose Agile really do seem to believe that, by being Agile, their staff will magically come up with more innovative solutions and will adapt to any change in requirements simply because they work under an agile methodology. Go Agile, increase their IQ by 10 points and their work capacity by 25%. Well, it doesn’t work like that. Some people can indeed think on their feet and pull solutions out of thin air, but they can do that irrespective of the methodology. As for the people who are more completer-finishers, who need a while to change direction but boy do they produce good stuff: have you just demoralized and hamstrung them? Agile does not suit the way all people work, and to succeed it needs to take account of the people it does not suit.

The other thing that seems to be a constant theme under Agile is utterly knackered {sorry, UK slang: knackered means tired, worn out and a bit broken} staff. Every scrum is a mad panic to shove it all out of the door, and people stop doing other things to cope, like helping outside the group or keeping an eye on that dodgy process they just adopted because it needed doing. Agile fails when it is used to beat up the team. I also believe Agile fails when those ‘distractions’ are ignored by everyone and work that does not fall neatly into a scrum is simply not done.

I suppose it does not help that my role has usually been more Production Support than development, and Agile is incompatible with production support. Take the idea of the scrum, where you have x days to analyse, plan, design, unit test and integrate the 6 things you will do in this round. On average I only spend 50% of my time dealing with urgent production issues, so I get allocated several tasks. Guess what: if I end up spending 75% of my time that week on urgent production issues, and urgent production issues have to take priority, I can screw up the scrum all on my own. No, I can’t pass my tasks on to others in the team as (a) they are all fully assigned to their own tasks and (b) handing over a task takes extra time. Agile fails when it is used for the wrong teams and types of work.

I’ve come to the conclusion that on most projects Agile has some beneficial impact in getting tasks done, as it forces people to justify what they have done each and every day, encourages communication and gives the developers a reason to ignore anything else that could be distracting them as it is not in the scrum. Probably any methodology would help with all of that.

My final issue with Agile is the idiot fanatics. At one customer site I spent a while at, they had an Agile Coach come around to help the team become more agile. I thought this was a little odd, as this team was actually doing a reasonable job with Agile: they had increased productivity and had managed to avoid the worst of the potential negative impacts. This man came along and patronisingly told us we were doing OK, but it was hard for us to raise our game like this; we just needed help to see the core values of Agile and, once we did, once we really believed in it, productivity would go up 500% {that is a direct quote, he actually said “productivity will go up by 500%”}. He was sparkly-eyed and animated and full of the granite confidence of the seriously self-deluded. I think he managed to put back the benefits of Agile by 50%, such was the level of “inspiration” he gave us. Agile fails when it is implemented like a religion. It’s just a methodology, guys.

I find it all quite depressing as I strongly suspect that, if you had a good team in a positive environment, doing a focused job, Agile could reap great rewards. I’m assured by some of my friends that this is the case. {update – it took my good friend Mike less than an hour to chime in with a comment. I think I hit a nerve}.

A while ago whilst working on one project, a colleague came back to his desk next to mine and exclaimed “I hate working with that team! – they are so bad that it makes everyone who works with them look incompetent!”

Now there is often an argument to be made that working with people who are not good at their job can be great for you, as you always look good in comparison {it’s like the old adage about hanging around with someone less attractive than you – but I’ve never found anyone I can do that with…}. It is to an extent true of course and, though it can seem a negative attitude, it is also an opportunity to teach these people and help them improve, so potentially everyone is a winner. I actually enjoy working with people who are clueless, so long as they will accept the clues. You leave them in a better state than when you joined them.

However, my friend was in the situation where the team he was dealing with was so lacking in the skills required that if you provided them with code that worked as specified, which passed back the values stated in the correct format derived from the database with the right logic… their application code would still fall over with exceptions – because it was written to a very, very “strict” interpretation of the spec.

In one example, the specification for a module included a “screen shot” showing 3 detail items being displayed for the parent object. So the application team had written code to accept only up to 3 detail items. Any more and it would crash. Not error, crash. The other part of the application, which the same people in the application team had also written, would let you create as many detail items for the parent as you liked. The data model stated there could be many more than 3 detail items. I suppose you could argue that the specification for the module failed to state “allow more than three items” – but there was a gap in the screen to allow more data, there was the data model and there was the wider concept of the application. In a second example, the same PL/SQL package was used to populate a screen in several modes. Depending on the mode, certain fields were populated or not. The application however would fail if the variables for these unused fields were null. Or it would fail if they were populated. The decision for each one depended on the day that bit of the module had been written, it would seem. *sigh*

The situation was made worse by the team manager being a skilled political animal, whose first reaction was always to try to shift any blame to any and all other teams. In the above examples he tried to immediately lay the blame with my colleague and then with the specification, but my colleague had managed to interpret the spec fine (he did the outrageous thing of asking questions when he was not sure, or of checking the data model). Further, this manager did not seem to like his people asking us questions, as he felt it would make it look like they did not know what they were doing. Oddly enough, they did NOT know what they were doing. Anyway, as a consequence of the manager’s hostile attitude, the opportunity to actually teach the poor staff was strictly limited.

That was really the root of the problem, the manager. It was not the fault of the team members that they could not do the job – they had not had proper training, were unpracticed with the skills, siloed into their team, not encouraged to think beyond the single task in front of them and there was no one available to show them any better. The issue was that they were being made to do work they were not able to do. The problem, to my mind, was with the manager and with the culture of that part of the organisation that did not deal with that manager. He obviously did not believe that rule one of a good manager is to look after the best interests of your team. It was to protect his own backside.

But the bottom line was that this team was so bad that anything they were involved in was a disaster, and no one wants to be part of a disaster. If you worked with them, you were part of the disaster. So we took the pragmatic approach: when they had the spec wrong, if we could alter our code to cope, we altered our code. And documented that. It gave us a lot of work and we ended up having a lot of “bugs” allocated to our team, but it got the app out almost on time. Ongoing maintenance could be a bit of an issue, but we did what we could on our side to spell out the oddities.

I still know my friend from above and he still can’t talk about it in the pub without getting really quite agitated :-)

If you go into a book shop there will probably be a section on business and, if there is, there will almost certainly be a load of books on how to be a manager. Shelves and shelves of them. There is also a large and vibrant market in selling courses on management and aspects of management. I’ve been on a couple of such courses and, if you can manage to be open minded whilst keeping a cynical edge, I think they can be useful.

However, I think most of them are missing the key points and that, if you can but hold on to the following extensive list of guiding principles, you will be a good IT manager. Maybe even an excellent one :-):

1. Your top priority, at all times, is to see to the best interests of your people.

2. Whatever you develop, be it code, databases, networks or a team of support staff: User Acceptance is paramount.

3. You must find ways to deal with other teams and your own management hierarchy in such a way as to be allowed to do (1) and (2).

That’s it.

OK, if pushed, I’d add a fourth: Never Lie. Maybe that’s just personal though; it’s because I don’t have the memory, audacity or swiftness of mind to pull it off. By not lying I don’t have to reconstruct what I said to whom, and why.

I’m sure people could cite some other hard rules like “you must be within budget” or “you need to get buy-in to your vision” but I don’t agree. Budgets can be negotiated and the difference between those deemed visionaries and those deemed fantasists seems to be to me down to success and luck. Luck is luck and for success I refer you to points 1 through 5.

OK, maybe a final rule is:

Never ask for or aim for something that is not realistic.

So, I am now able to develop my team and my application and not expect to be able to spend half the company profit on the fastest box out there, as it is not realistic.

There are a shed load of other things that I think are important to helping you be a good manager, you know, techniques and methods for improving things, but nothing else that is key.

And it’s such a simple, small list even I can aim for it.

The shame of it is that I don’t think it’s enough to be developed into a book or a course so I can’t sell the idea. That and I’ve gone and given it away in this blog. Also, though I feel I can give points 1,2 and 5 a good shot, point 3 is way beyond me…possibly because of point 5… So I am not a great manager.

I’m going to hide behind this stout wall now, with my hard hat on, and wait to be told how naive I am…

We all know that working in a team is more efficient than working on your own (and I did say a week or two back how I was enjoying the rare privilege of working in a team of performance guys). Many of us also know about team dynamics and creating a balanced team of ideas people, completer-finishers, implementers, strategists and so forth. Those of us who have been exposed to training courses or books on team management know all these good things about teams and how we are supposed to get the most out of them.

How many of us, though, have been introduced to the work of the French agronomist Max Ringelmann and the aspect of teams named after him, the Ringelmann Effect? In summary, the Ringelmann Effect proposes that people in teams try less hard than they do when working alone, especially if they think no one is watching them.

Back at the start of the 20th century Ringelmann tested out his ideas using a tug-of-war experiment. He would get people to pull on a rope as hard as they could and record their efforts using a strain gauge. Then he would get them to pull on the rope as part of a team, from 2 to 8 people. As soon as people were part of a team, they pulled less hard. With two people in the team, each pulled 93% as hard as on their own, with three people this dropped down to 85% and with 4 it was just 77%. By the time there were 8 people in the team, effort was down to 50%.

This idea of shirking work more and more as the team increases in size became established in modern psychology and was given Mr Ringelmann’s name. Psychologists explain that when someone is part of a group effort, the outcome is not solely down to the individual and, as such, is not totally in their control. This acts as a demotivating factor and the person tries that little bit less hard. The larger the team, the greater the demotivation and the more significant the drop in effort. Ringelmann found effort was down to 50% in a team of only 8, so how bad must the impact be in a really large team? I think most of us have at least witnessed, and quite possibly been in, the position of feeling like just a cog in a massive corporate team machine. Thoroughly demotivating (though, of course, we all still tried as hard as we could, didn’t we?).
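As a quick illustration of those figures, here is the effective pulling power they imply. The per-person fractions are the percentages quoted above; the code is just arithmetic on them, not a model of the effect:

```python
# Per-person effort as a fraction of solo effort, by team size,
# using the Ringelmann figures quoted above.
effort = {1: 1.00, 2: 0.93, 3: 0.85, 4: 0.77, 8: 0.50}

for size, fraction in sorted(effort.items()):
    effective = size * fraction  # how many "solo people" the team pulls like
    print(f"team of {size}: {fraction:.0%} effort each, "
          f"pulls like {effective:.2f} people")
```

The striking implication is the last line: a team of eight pulls with the force of only four individuals working alone.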

The effect is also known under the far more entertaining title of Social Loafing.

Monsieur Ringelmann himself was kinder at the time and pointed out that these chaps pulling on the rope could well have been suffering from a lack of synergy: they had not been trained to pull together as a team, so a failure to synchronise their efforts could account for the drop.

However, in the 1970s Alan Ingham at the University of Washington revisited Ringelmann’s work, and he was far sneakier. Sorry, he was a more rigorous scientist. He used stooges in his team of rope-pullers, blindfolded the participants, and put the one person pulling for real at the front of the rope, so that he could record the effort of that individual. Ingham found that there was indeed a drop in efficiency due to the team not pulling as one, but sadly this was not the main factor. The drop in effort was mostly down to the perceived size of the rest of the team. The bottom line was that people try less hard when they are part of a team, and the larger the team, the greater the drop in effort.

So what can we do to counter the effect? A few things suggest themselves:

- Don’t give the same job to several people and then let them know they all have the same job.
- Ask people how they are getting on and give them mini-goals along the way.
- Actually reward them for success. Like saying “thank you” and NOT giving them yet another boring, hard job to do just because they did the last one so well.

I think it is also a good argument for keeping teams small {I personally think 5 or 6 people is ideal} and splitting up large projects such that a single team can cope with each part. Then give tasks to individuals or pairs of people.

If you like this sort of thing you might want to check out one of my first blog posts (though it is more an angry rant than a true discussion of the topic), which was on the Dunning-Kruger effect, where some people are unaware of their own limitations. I did not know it was called the Dunning-Kruger effect until others told me, which only goes to show that maybe I am not aware of my own limits… Read the comments or click through to the links from there to get a better description of some people’s inability to gauge their own inabilities.

I had a few comments when I posted on solid state memory last week and I also had a couple of interesting email discussions with people.

I seriously failed to make much of one of the key advantages of solid-state storage over disk storage, which is the far greater number of input/output operations per second (IOPS) it can support; this was picked up by Neil Chandler. Like many people, I have had discussions with the storage guys about why I think the storage is terribly slow while they think it is fast. They look at the total throughput from the storage to the server and tell me it is fine. It is not great, they say, but it is {let’s say for this example} passing 440MB a second over to the server. That is respectable and I should stop complaining.

The problem is, they are just looking at throughput, which seems to be the main metric they are concerned about after acreage. This is probably not really their fault; it is the way the vendors approach things too. However, my database is concerned with creating, fetching and altering records, and it does that as input/output operations. Let us say a disc can manage 80 IOPS (which allows an average of 12.5ms to both seek to the record and read the data; even many modern 7,200rpm discs struggle to average less than 12ms seek time). We have 130 discs in this example storage array and there is no overhead from any sort of RAID or any bottleneck in passing the data back to the server. {This is of course utterly unbelievable, but being a little harsh in not granting the discs an 8ms seek time covers the RAID/HBA/network cost}. Each disc is a “small” one of 500GB; they bought cheap discs to give us as many MB/£ as they could {10,000 and 15,000rpm discs will manage 120 and 160 IOPS respectively, but cost more per MB}.

Four sessions on my theoretical database are doing full table scans at 1MB of data per IO {Oracle’s usual maximum on 10.2}, each session receiving 100MB of data a second, so 400MB in total. Just 5 discs {5 * 80 IOPS * 1MB} could supply that level of IO. It is a perfect database world and there are no blocks already in the cache for these scans to interrupt the multi-block reads.

However, my system is primarily an OLTP system and the other IO is records being read via index lookups and single block reads or writes.

Each IO reads the minimum unit for the database, which is a block; in this example a block is 4k. Oracle can’t read part of a block.

Thus the 40MB of other data being transferred from (or to) the storage is made up of single-block reads of 4k: 10,000 of them a second. I will need 10,000/80 discs to support that level of IO. That is 125 discs, running flat out.

So I am using all of my 130 discs, and 96% of them are serving 40MB of requests while 4% are serving 400MB of requests. As you can see, as an OLTP database I do not care about acreage or throughput; I want IOPS. I need all those spindles to give me the IOPS I need.
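The disc arithmetic can be sketched in a few lines. These are the example's assumed figures (80 IOPS per cheap disc, and decimal megabytes so that 40MB of 4k reads is a round 10,000 IOs), not vendor data:

```python
# Illustrative figures from the worked example, not vendor specs.
DISC_IOPS = 80               # a cheap 7,200rpm disc, ~12.5ms per IO

# Four full table scans, each pulling 100MB/s in 1MB multi-block reads.
scan_iops = 4 * 100 // 1     # 400 IOs a second
discs_for_scans = scan_iops / DISC_IOPS       # 5 discs

# 40MB/s of OLTP traffic in 4k single-block reads
# (treating 1MB as 1,000k to keep the round numbers above).
oltp_iops = 40 * 1000 // 4   # 10,000 IOs a second
discs_for_oltp = oltp_iops / DISC_IOPS        # 125 discs, flat out

print(discs_for_scans, discs_for_oltp)
```

So 4% of the array moves 400MB a second and the other 96% grinds away at 40MB a second, which is the whole point: the OLTP workload is IOPS-bound, not throughput-bound.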

What does the 40MB of requests actually equate to? Let us say our indexes are small and efficient, with a height of 3 (a b-level of 2): a root node, one level of branch nodes and then the leaf nodes. To get a row you need to read the root node, a branch node, a leaf node and then the table block. 4 IOs. So those 10,000 IOPS allow us to read or write 10,000/4 = 2,500 records a second.
You can read 2,500 records a second.

Sounds a lot? Well, let us say you are pulling up customer records onto a screen and the main page pulls data from 3 main tables (customer, address, account_summary) and translates 6 fields via lookups. I’ll be kind and say the lookups are tiny and Oracle reads each with a single IO. That is 9 record-reads for the customer screen, so if our 40MB of OLTP IO was all for looking up customers, you could show just under 280 customer screens a second, across all users of your database. If you want to pull up the first screen of the orders summary, each screen record is derived from 2 underlying main tables and again half a dozen lookups, but now with 10 records per summary page: that is 80 record-reads for the page. Looking at a customer and their order summary together, you are down to under thirty a second across your whole organisation, doing nothing else.
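Here is that record and screen arithmetic in code. The index height, the one-IO-per-lookup simplification and the screen shapes are the assumptions stated above, nothing more:

```python
# 40MB/s of 4k single-block reads gives this IO budget.
OLTP_IOPS = 10_000
# Index height 3 (root + branch + leaf) plus the table block itself.
IOS_PER_RECORD = 4

records_per_sec = OLTP_IOPS // IOS_PER_RECORD    # 2,500 records a second

# Customer screen: 3 main tables plus 6 lookup translations,
# each generously costed at one record-read.
CUSTOMER_SCREEN = 3 + 6                          # 9 record-reads
# Order summary: 10 rows, each from 2 tables plus 6 lookups.
ORDER_SUMMARY = 10 * (2 + 6)                     # 80 record-reads

customers = records_per_sec / CUSTOMER_SCREEN
both_screens = records_per_sec / (CUSTOMER_SCREEN + ORDER_SUMMARY)

print(records_per_sec, round(customers), round(both_screens))
```

The design point this makes concrete is that every lookup translation on a busy screen eats directly into a surprisingly small physical IO budget.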

You get the idea. 2,500 record-reads a second is tiny. Especially as those 130 500GB discs give you 65TB of space to host your database on. Yes, it is potentially a big database.

The only way any of this works is the buffer cache. If you have a very healthy buffer cache hit ratio of 99%, then your 2,500 records a second of physical IO against the storage sub-system is actually supporting 250,000 logical-plus-physical reads a second. {And in reality many sites cache at the application layer too}.
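The buffer cache arithmetic is worth spelling out: with a hit ratio h, only the fraction (1 − h) of logical reads ever reaches the discs, so the logical rate is the physical rate scaled up by 1/(1 − h):

```python
physical_records_per_sec = 2_500   # what the 130 discs can manage
hit_ratio = 0.99                   # a healthy buffer cache

# Only the cache misses reach the discs, so the logical read rate
# is the physical rate divided by the miss rate.
logical_records_per_sec = physical_records_per_sec / (1 - hit_ratio)

print(f"{logical_records_per_sec:,.0f}")
```

It also shows how brittle the situation is: drop the hit ratio from 99% to 98% and the discs have to do twice the physical work for the same logical load.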

Using Solid State Storage would potentially give you a huge boost in performance for your OLTP system, even if the new technology was used to simply replicate disk storage.

I think you can tell that storage vendors are very aware of this issue, as seek time and IOPS are not metrics that tend to jump out of the literature for disk storage. In fact, often they are not mentioned at all. I have just been looking at some modern sales literature and white papers on storage from a couple of vendors and they do not even mention IOPS, although they happily quote acreage and maximum transfer rates. That is, until you get to the information on solid state discs. Now, because the vendor has something good to say about the situation, the IOPS figures suddenly appear; one HP white paper I looked at quotes them prominently for its SSDs.

More and more these days, as a DBA you do not need or want to state your storage requirements in terms of acreage or maximum throughput; you will get those for free, so long as you state your IOPS requirement. Just say “I need 5,000 IOPS” and let the storage expert find the cheapest, smallest discs they can to provide it. You will have TBs of space.

With solid-state storage you would not need to over-specify storage acreage to get the IOPS, and this is why I said last week that solid state storage does not need to match the capacity of current discs in order to take over. We would be back to the old situation where you buy so many cheap, small units to get the volume that the IOPS are almost an accidental by-product. With 1GB discs you were always getting a bulk-buy discount :-)

I said that SSD would boost performance even if you used the technology to replicate the current disk storage. By this I mean that you get a chunk of solid-state disk with a SATA or SAS interface in a 3.5 inch format block and plug it in where a physical disk was plugged in, still sending chunks of 4k or 8k over the network to the block buffer cache. But does Oracle want to stick with the current block paradigm for requesting information and holding data in the block buffer cache? After all, why pass over and hold in memory a block of data when all the user wanted was a specific record? It might be better to hold specific records. I suspect that Oracle will stick with the block-based structure for a while yet as it is so established and key to the kernel, but I would not be at all surprised if something is being developed with Exadata in mind where data sets/records are buffered, and this could be used for data coming from solid state memory: a second cache which, when using Exadata or solid-state memory, holds single records. {I might come back to this in a later blog, this one is already getting bloated}.

This leads on to the physical side of solid-state discs. They currently conform to the 3.5” or 2.5” hard disc form factor but there is no need for them to do so. One friend commented that, with USB memory sticks, you could stick a female port on the back of a memory stick, add a joint, and just daisy-chain the USB sticks into each other, as a long snake. And then decorate your desk with them. Your storage could be looped around the ceiling as bunting. Being serious, though, with solid state storage you could have racks or rows of chips anywhere in the server box. In something like a laptop the storage could be an array 2mm high across the bottom of the chassis. For the server room you could have a 1U “server” and inside it a forest of chips mounted vertically, like row after row of teeth, with a simple fan at the front and back to cool the teeth (if needed at all). And, as I said last time, with solid state being so much smaller and no need to keep to the old hard disk format, you could squeeze a hell of a lot of storage into a standard server box.

If you pulled the storage locally into your server, you would be back to the world of localised storage, but then LANs and WANs are so much faster now that if you had 10TB of storage local to your server, you could probably share it with other machines in the network relatively easily and yet have it available to the local server with as many and as fat a set of internal interfaces as you could get your provider to manage.

I’m going to, at long last, wrap up this current instalment of my thoughts with a business one. I am convinced that solid-state storage is soon going to be so far superior a proposition to traditional discs that demand will explode. And so it won’t get cheaper. I’m wondering if manufacturers will hit a point where they can sell as much as they can easily make, and so hold the price high. After all, what was the argument for compact discs costing twice as much as old cassette tapes, even when they had been available for 5 years? What you can get away with charging.