August 2011

16 August 2011

In 1999, during the dot com boom, if you could even describe something plausible you could get people to invest in it – both their money and their minds. I was no different.

At one point, I worked for a company that was creating an in-vehicle computing platform. The concept was that your PDA (at the time, a Palm Pilot) could plug into a cradle in your car and, as you drove, magic would happen. You could get stock market reports (very important then), maps, weather reports, and your e-mail.

They recruited me away from a job I could have had for life by telling me wonderful things about my new professional life in this bold new company. The CEO had grown and sold two other start-ups, and this was going to be his third home run. The CTO was a whiz with a golden touch.

I was being hired to be the liaison between this company and local or state governments – basically to build the first central repository of real-time traffic data in the US. There was tremendous upside and absolutely no risk. What could possibly go wrong?

Weeks went by at my new company. The pace was frenetic. Everyone talked about how amazing the product was. Daily system architecture meetings were filled with manic white-board scribbling. I was furiously working away on my part. We had drawings, schematics, mock-ups.

Today, the magic they were trying to launch has been fairly well implemented. The goal was to say things to your car like, “Show me the best way to get home and avoid traffic” or “Get me to the nearest coffee shop, but it needs wifi.” Today, 12 years later, you can give some simple voice commands. In 2000, this tech simply did not exist – but no one told the workers at my startup that.

We had planned to launch our wonder-device at the end of the year. It was nearing Thanksgiving. They kept promising to show the device to me and others in the company. Excitement was growing. We were all ready to put these things in our cars.

Finally, we had a demonstration of our device. I walked up to it and commanded, “Show me traffic for Seattle.” Nothing happened. Someone said, “It only works for his voice,” and pointed to the man beside me.

That man then said, “Up … up … select … up … right … select” … “SELECT!” … “SELECT!!” When it didn’t select, he pushed the reset button. “It’s not fully trained,” he said.

The system could barely understand simple commands to navigate menus in a quiet room, let alone complex voice commands in a noisy car. Rather than parsing complex sentences, it had to be trained to recognize a handful of common words. I had been lied to.

The shock was too great. I completely lost my composure. “THAT’S IT?!” There was no way this system could ever be ready to do anything at all. It was utter garbage.

No one else on my team could understand why I was upset. They looked at the barely functional prototype and insisted it would be market-ready in a little over a month.

When I started working for the company, there were about 60 of us. At the height of my time there, we had over 100 employees. When I was laid off, we had slightly more than 25. As people left, they were sad that they would not be able to see the awesome launch at the end of the year.

I was there 5 weeks.

The good professionals at this company were victims of several cognitive biases. Let’s examine a few.

The Bandwagon Effect is basically groupthink. People in a group who see others believing something tend to give their compatriots the benefit of the doubt and, soon enough, begin believing it themselves. In this case, people, myself included, assumed that the management of the company was truthful. On that, everyone built up their expectations of what was possible – even though most of them must have known on some level that the technology of the day would not support what was being promised.

What follows groupthink is the institutionalization of the group’s thought. The Availability Cascade is a self-reinforcing process in which a group’s collective belief (in this case, a car you could talk to) gains credibility as people repeat the claim. Everyone at the company was caught neck-deep in the availability cascade.

And oddly enough, simple old Wishful Thinking plays a role here. The people at my company had been steeped in years of dot com fairy tales. We all would have jumped on a bandwagon and promoted a cascade that promised us each millions.

When we are planning, we are creating expectations. This is normal, expected, and even positive. But as we move forward in projects, we as actors really want them to succeed. The opportunities for the Bandwagon Effect, Availability Cascade, and Wishful Thinking to sway our decision making are many and can be insidious.

15 August 2011

In Freakonomics, Steven Levitt tells the story of a day care center in Israel that was tired of parents arriving late to pick up their kids. So it turned to a market solution: it charged a penalty fee if you were late to pick up your kids.

But rather than falling, the number of late parents actually shot up. Why? Because before, there was a social contract that said, “If you are late to pick up your children, you are inconveniencing others, and you are bad.” With a fee, there is instead a market contract that says, “It costs you $5 an hour to park your kids here.”

Parents who arrived late were no longer bad; they were just taking advantage of a great new service from the day care. That was not the day care’s intent – it wanted fewer kids there after 5 pm. Now it had more. Why?

This is the Framing Effect in action. The Framing Effect says that the wording or context in which options are presented directly shapes (or frames) our choices. In this case, leaving your kids at day care late moved from a social frame to a transactional, economic frame. Parents now judged based on a cost-benefit analysis (is it worth $5 to run this errand and know my kids are safe?). The fine was seen as a fee.

When we make decisions, the Framing Effect is good in some ways (it can provide context and background) but bad in others (today’s context may not be tomorrow’s). Decisions made today are therefore at risk of being overly framed by current contexts or biases, without looking far enough into the future or back into the past.

Frames themselves are the objects that make up our world view. They can be experiential (social, personal, economic, political, etc.) and they can be emotional (based on past experiences, joys, traumas, hopes, fears, etc.). If you have been attacked by a large, mean dog, your frame may from that point on include a fear of large dogs – even friendly ones.

With this deeper understanding, we can see that frames can be imposed by other people, but also by our own experiences. We use frames to filter our environment, to quickly interpret context, and to formulate appropriate responses.

When a team in an office setting makes decisions, framing comes in from many angles. First, the information the team receives may have been presented in a framed way. Consider the difference between these two sentences:

Work bogs down at Charlie’s desk.

as opposed to

Work bogs down when it reaches the QA stage.

Charlie is the QA guy. On the surface, both of these sentences say the same thing – work is bogging down when it reaches QA / Charlie. But when we read the first sentence, we are reading the problem as Charlie. When we read the second sentence, the problem is workflow at the QA stage which involves Charlie, but may not necessarily be due to Charlie.

The frame (Charlie vs. QA; personal vs. functional) has a direct impact on how people initially process the problem and begin to devise solutions.

10 August 2011

My colleague and I were four days into a field exercise for an environmental impact statement, or EIS. An EIS is a document every public project must complete to show its short- and long-term impacts if completed. They tend to be large and expensive. This particular project was very large, and the impacts were many.

Given that the project was underway and had a definite deadline, no one was willing to slow it down or change its scope. This particular task was supposed to take the two of us about a week to complete.

Four days into this particular task, we found that the data we had been given was poor, that the people to contact for better data were unavailable, and that massaging the new data – which we couldn’t even get – would take weeks.

When we presented the managers of the project with this information, we were told, “Don’t slow down. Just get it done.”

“Get it done with what?”

“The data you already have.”

We were then expressly told not to contact anyone else or to extend this particular task.

Back then, we were completely confused as to why anyone would be so totally stupid. How could these managers knowingly ignore that we were using faulty information that would result in a useless report? These weren’t Dilbert characters; they were real, live human beings.

Over the years I would watch other seemingly smart people drive their high-tech startups into the ground by insisting that an initial plan or idea go forward despite ample evidence that all that lay ahead was ruin.

What cognitive biases kicked in to destroy these endeavors?

Irrational Escalation – This is in-for-a-penny-in-for-a-pound-ism. Once people start on a project, they find it very hard to stop – even if the project is obviously doomed or the return will not cover future costs. The managers felt that the project was already underway and even using bad data was better than delaying the project and asking for new data. The bad data (not surprisingly) later resulted in obviously flawed analysis – causing even more expensive and stressful work later in the project.

Expectation Bias – This is a world-view bias. If we believe something, we pick data points that support that expectation. Further, we expect that most data will support our expectations. We had an opportunity to create a research regimen that would yield scientific results. The managers assumed this was all a formality and that any data (good or bad) would support the original plan. This was likely underscored by “experience” that showed that most data did indeed support the original plans (which were likely suffering from expectation bias).

Planning Fallacy – A central issue here is simply that we didn’t have enough time to finish our task. The expectation bias of the people who wrote the initial scope gave us only a week. This led them, and us, to fall prey to the Planning Fallacy: human beings have a strong tendency to underestimate task completion times. On the ground, we saw this playing out. From the managers’ desks, the plan was correct; we were at fault.

Status Quo Bias – This bias says that once we start something, we tend to want it to continue. This is why in Collaborative Management we seek to build an explicit Culture of Continuous Improvement for teams and companies. Making change and improvement the center of the culture makes change the status quo. Needless to say, this wasn’t the case with my project back then.

The upshot here is that biases can gang up on us. Our enthusiasm, belief in ourselves, and assumption that things stay pretty much the same can combine to create seemingly logical decisions that turn deadly. My managers no doubt reasoned that the project plan had been written over more than a year by well-informed professionals; there was very little cause for concern that one small part of the project might derail the whole thing.

In the end, our re-work was hardly noticed because other parts of the project became late, with much higher costs. One can only guess that the same forces were at work.

01 August 2011

Because things are the way they are, things will not stay the way they are. ~ Bertolt Brecht

Billie runs a 24-person shop that holds a maintenance contract with a local school system. Over the summer, the shop has been busy but not overwhelmed, making routine repairs at schools around town. The crew is happy.

Billie knows that in one month school will start. The maintenance needs that went unnoticed over the summer and the impending assault by the returning students are going to drive demand through the roof. This change will result not only in the obvious – a very busy time for the staff – but also in the unexpected. Staff stress will go up, job satisfaction will go down, tempers will flare causing unexpected interpersonal issues, some number of people will be injured, and two or three “really weird” things will happen.

Billie and his team have to be prepared for added work load – which standard work processes can handle. But they also have to be prepared for the human emotional toll and the rise of things that will only present themselves when they do. They don’t know what those things will be – they just have to be ready for something.

We have two choices with change: we can ignore it or we can embrace it. We cannot eradicate it. We cannot tame it. Change is both external and internal. We change, markets change, the weather changes, politics change. Everything migrates, shifts, and evolves.

While there is much in these changes we cannot foresee, there is much we can accurately anticipate. Billie knows that tempers will flare and that there will be interpersonal blow-ups. They will always be petty. They will always make him want to shout, “Seriously? This is what you’re wasting my time on?” But Billie can see that it’s because of stress and change. When these flare-ups will happen is a mystery. That they will happen is a certainty.

The two or three really weird things can range from the minor-but-time-consuming (there’s a family of angry cats stuck in the HVAC system at Bernie Madoff Elementary School) to the horrifying (which we really don’t need to envision here). These may or may not happen.

Having no tolerance for change as a manager means that every event not covered by process becomes a traumatic event for all involved. Process cannot deal with daily operational change, so each change becomes a crisis. That trauma makes people less able to respond effectively to change when it occurs.

Change before you have to. ~ Jack Welch

Managers need to respond to these types of operational change daily. I’ve never met anyone who disagrees with this. But what I often find is managers wanting to respond to these events with more process. There is no reason, after cats appear in an HVAC system, to create a “Cats in HVAC” rule set. The event itself is unimportant.

What was important was:

How quickly was the issue responded to?

How quickly did the worker or team devise a fix?

Did the worker or team ask for help when they needed it?

Were the appropriate people informed of the situation?

How was other scheduled work impacted by this unforeseen event?

In essence, what impacts did this change from the status quo have on the crew, the schedule, and the customer? Did Billie and his team have a system in place that could elegantly respond to change? Were there existing processes or procedures that slowed the response or made it less effective? Was the response to change to freak out and lose control of the schedule, communication, and the crisis itself? Or was it to escalate the new event, inform those downstream of the delay, and move forward?

Note: This post is about operational change … there will be others about organizational, market or other types of change.