The Art of Management in Outsourcing

The first time Sourcingmag.com talked with CEO Binod Taterway last year, his company had just announced 4pi, a set of solutions for helping companies manage the whole outsourcing lifecycle. (It stands for "fourth-party integrator.") The goal of the approach is to provide clarity in the outsourcing engagement and, as a result, to improve service performance. Every consulting firm claims that, right? Well, what Mr. Taterway has is the backing of Gartner Consulting, which formed an alliance with Blue Canopy to provide its service offering, particularly in outsourcing jobs where multiple service providers are involved. We spoke with him about some of the realities of outsourcing management. Along the way, he explained how to gain visibility into the performance of your project, what form a performance dashboard should take and why SLAs can deceive you into complacency.

Your company talks about the concept of the last mile. I'm familiar with that term on the telecom side, but I haven't heard it applied to sourcing work. Can you explain?

Binod Taterway:

A lot of organizations have challenges with that last mile, because the last 10% requires as much effort as the first 90%. This concept is not dissimilar in manufacturing. Manufacturing talks a lot about supply chain visibility. I come from a retail manufacturing background. To be a Wal-Mart supplier, you have to do ASN, "advanced shipping notice." You have to do x-y-z. And if you don't do that, then you are not a Wal-Mart supplier.

It is the same with Blue Canopy saying, if you want to become a [service] supplier, you really need to be able to do x-y-z. You need to be able to provide visibility, because visibility essentially implies productivity. We tell our vendors that you don't have to make your enterprise visible, so long as you meet your SLAs 100% of the time, every time. But the reality of the situation is different. If there is an indication that certain deliverables and certain SLAs are going to be missed, I [as client] want to know ahead of time, because I might have a mitigation strategy.

So, I think from that perspective, we talk about being a last mile partner. We want to take that last 10% that gives the clients — the CIOs — 90% of the grief. I had a client in Florida who said, "I am doing an Oracle 11i implementation, and I have 50 people working on it. I have no visibility, come October 31, whether I am going to be done or not. If I am done, I will be a star. If I fail, then I am going to be fired." That black box visibility is impossible to deal with in today's environment.

How come that client didn't have that visibility? What was she lacking?

It was an outsourced project. She inherited that. She did not have as much latitude as we guide our clients to implement — 3% to 12% of overhead in terms of management overhead. So she was constrained. To her, it's like Russian roulette. Hit or miss.

I think many organizations are realizing that it is not acceptable. We need some predictability, and that varies. In the case of Wal-Mart, they want to see the entire supply base all the way down to the guy who is manufacturing something.

In the service supply chain, I think that we will eventually get to that. But we still have a long way to [go in] thinking about service as a supply chain. Service involves multiple participants working collaboratively to deliver a service. So, that is why we are called the last mile partner. We are not here to manage vendors invasively. One of the key issues is that you want to utilize the process. You want to utilize their world-class capabilities. You want to have the visibility, so that you can manage, you can predict, and you can make decisions.

Another concept you use is the "glass pipeline."

Many people have black box visibility. I was telling you about this CIO in Florida. They have no visibility, and it is not streamlined. Streamlined implies that it is repeatable, something that you can do with multiple vendors. However, full visibility is usually defined from the client's perspective. The client may have visibility from vendor one to the client, vendor two to the client, and vendor three to the client, but there are increasing dependencies among vendors one, two and three. Are they visible to each other? Does vendor two know that my upstream partner is vendor one, and my downstream partner is vendor three? If certain things happen at vendor one, maybe the client knows, but vendors two and three don't know directly.

So we talk about the glass pipeline among vendors, where everyone has visibility. It makes no distinction between organizational boundaries… When we work together, we are essentially creating an enterprise where we have everything so we can adapt. The glass pipeline typically means end-to-end visibility among all participants. Only then would you be able to work in lockstep.

Even within an enterprise – forget about outside enterprises — even within an enterprise, the challenging part is that organizations are people. Service is a people business, and they don't create visibility around themselves. People are the biggest black box.

The sourcing business is to some extent a very people-intensive business. People overlook HR and look at it as a downstream process: "We are going to define it and you are going to execute it." To some extent, dislocation of work requires a lot of HR involvement. People don't look at HR, just as we don't worry about electricity. I think to some extent that when people think about sourcing, they don't think about people. They think about SLAs, they think about contracts, they think about a robotic arrangement between the service provider and the service recipient. If you treat it that way, there is a lot of room for confusion, which then requires you to put [in] an incredible number of people to manage it.

The other aspect that we haven't talked about is the data. Service providers always claim that they are meeting their SLAs. Service recipients say, "No, you are not meeting your SLAs. Here are my reports." "Well, here are my reports." How do you resolve that? You are not going to be able to manage people by arm's length robotic movements. You have to include that side. That is one thing that we do.

Many outsourcing contracts are defined at a very high level: the CIO level, VP level, sometimes the CEO level. Then you have people in the trenches who have to make it work. So unless there is somebody who focuses on the people aspect of it, the individual collaboration and cooperation, why you should work together, it doesn't get done. It is doomed to failure.

I speak from experience working at a very large organization like General Motors. I spent about three and a half years there, where deals were made with the CIO, at very high levels, but in the end, it really comes down to the guy who answers when I pick up the phone. I have the trust relationship with him.

The control portion of it comes in when there is a variance from that behavior. First, you have to be able to measure it, and then apply those controls. It is both the carrot and the stick. Many outsourcers just have the stick; the clients just use the stick. After a while it becomes like my two boys. You can arbitrate a little bit, but they go back and do the same thing when you are not watching. So what is the incentive to cooperate and collaborate? You have to worry about it, and that is what we do.

How do you do it?

We call it a roach problem. How do you solve a roach problem? This is one of my clients' statements, not mine. He said, you shed light on it. If you shed light on a roach, it goes away. So what you do is understand where the inefficiencies and friction points are, and you shed light on them. You do that by collecting objective data, not subjective data, that everybody agrees on; that is what we call operational data. After you have that, you correlate it, and nobody can dispute it. You put things in a spotlight, and obviously you use that with trust and control to be able to say, "Look. I trust you. Go help yourself." Or, "You are tied to the SLAs. You'd better do it or else I am going to fire you." But you have to have the right data.

One of our clients has nothing but a wonderful dashboard. They do report after report after report. They don't have reliable data. So here is the situation. They get a report that here is a vendor and he is not complying with the SLAs. They go sit down with the vendor and realize that their data is wrong. So they can't use the stick, either, because they don't have reliable data.

I have seen many vendors who have the best intentions to be very successful, but the clients don't create that involvement. They change the requirements all the time. They keep saying you have to work until 12 o'clock. Everybody can pull through for a certain time, but after a while, it becomes a drain. We focus on all aspects of that. Become the expeditor. Utilize very experienced people, people with strong people skills. Then utilize the tools for creating dashboards and metrics. If the client is at fault, we make it known. If the vendor is at fault, we also make it known.

I think the last thing is to involve vendors in your sourcing management office. It is really important to not have a sourcing management office that just represents you. You have to include vendors, the senior leadership, the [project management office], so that you have an integrated view of all that is at hand.

What should the performance dashboard look like?

You think about it logically. You think about it as a tool to aid management decisions. It does not have to be electronic. One of our clients, AOL, does it on a weekly basis. They generate reports, and a weekly cycle is good enough for them. They print their dashboard and they have these indicators. They also know that if they need real-time data, they can just go get it. It does not have to be a blinking dashboard like your car's dashboard. People think that everything has to be like that.

Maybe it is like that for network monitoring, for hosting, for whether the servers are up or down. It is certainly not needed, in my opinion, for call centers or the help desk. If you have the ability to get daily data, if you have the ability to generate reports on demand, then that is your dashboard, because those things will aid your decision making.

The client I talked about that didn't have the right data, they didn't have a dashboard in a Web-based sense. But they have an incredible enterprise data store. They have operational data. They can generate reports out of it at any second, in any instance. So, when they are about to call a vendor, they run a report and get all the information they need. That is the dashboard. Your decisions are based on it. Obviously it is cooler and sexier to have a Web-based red, yellow or green indicator, which is a good ambition to have. I think it is a useful thing, but I don't think you should think about dashboards from that perspective. It really depends upon the timeframe, the decision cycle of the client. In a more mature relationship, you don't even need it on a daily basis. But it shouldn't be on a monthly basis, and it shouldn't be after the fact. If your cycle time is much shorter, it shouldn't come years later.

You suggest monitoring three items. Why limit it to three categories?

Well, the three categories are very broad. I think everybody knows black box metrics. If you are a claims processing operation, like insurance: How many claims do you process? Maybe your SLA says that you have to process 200 claims. You have a metric that shows on a daily basis how many claims you process. So, that is your SLA. We call it your overall responsibility. Now, take a deep dive. You processed 225 claims. That looks pretty good on paper, because you were only required to do 200. How did you do on those 225? Out of 225, you managed 90 percent of them within a plus or minus 10% timeframe. Let's suppose every claim is supposed to take two minutes. You did pretty well. But maybe the remaining 10% took three or four times as long. That gives you a breakdown. It is important to know that. If I scale tomorrow to 2,000 claims per day, I need to know what my infrastructure, what my providers can do. Are they using people to do that [vs. processes or technology]? Then I probably won't be able to scale. That is key to decision making.
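The claims example above can be made concrete with a short sketch. This is purely illustrative, with made-up numbers and a hypothetical function name (`sla_breakdown`), not anything from Blue Canopy's tooling: it separates the black box metric (volume against the SLA target) from the white box timing metric (the share of claims landing within plus or minus 10% of the expected two minutes).

```python
# Hypothetical illustration of black box vs. white box SLA metrics.
# All numbers and names are invented for the example.

def sla_breakdown(times_min, sla_count=200, target_min=2.0, tolerance=0.10):
    """times_min: per-claim handling times in minutes."""
    processed = len(times_min)
    # Black box metric: did we hit the volume the SLA requires?
    met_volume = processed >= sla_count
    # White box metric: how many claims fell within +/-10% of the target time?
    lo, hi = target_min * (1 - tolerance), target_min * (1 + tolerance)
    on_time = sum(1 for t in times_min if lo <= t <= hi)
    return {
        "processed": processed,
        "met_volume_sla": met_volume,
        "on_time_share": on_time / processed,
        # Claims that took far longer than expected; these are what
        # the deep dive surfaces even when the volume SLA is met.
        "outliers": [t for t in times_min if t > hi],
    }

# 225 claims: 90% near the two-minute target, the rest taking 3-4x as long.
times = [2.0] * 203 + [7.0] * 22
report = sla_breakdown(times)
```

On paper the vendor beat the SLA (225 claims against a target of 200), but the white box view shows a tail of slow claims that would matter at 2,000 claims per day.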

There's also the content aspect of it. Just today, I called a bank to pay my home equity loan. I know that if I don't pay today, I will be late. I called this guy and he answered the call. I said certain things and he transferred the call to someone else. His SLA probably looks pretty efficient, because he took the call and finished it in 30 seconds, and he probably marked the call as complete. It wasn't caught by a quality metric, because he had technically completed the call, and it wasn't caught by a process metric, because the call took no more than 30 seconds. But I had to call back, because they had transferred me to the wrong department, and that call took much longer, because I had issues with other things.

So how do you measure that in that particular case? You have to be able to measure what we call the content metric. What was the quality of service? To some extent, the quality of service could be: out of the 90% of the claims you process, how many come back, and how many do you have to spend more time on? The first is the black box metric; the last two are what we call white box metrics. You have to be a little bit more invasive, and understand things from the service provider's perspective. Are they following the right process? Are they giving the right deliverables? You might be measuring one thing, and since people's behavior is driven by those SLAs, they might be short-circuiting it. Complete the call. Complete the call. And the calls increase. Maybe you are doing great in [terms of the] SLA, but your clients are suffering.

How difficult is it for companies to derive what their metrics should be?

It is very hard. That is why we don't suggest that you do it just once. Our process is: you start with an SLA and define a few metrics. How would you want to manage and measure it? Then you hone in on that. You have to focus on where the problems are. Then apply the right measures to make the right part of the enterprise visible, so that you can come back and solve it. That is the hardest part. There is the question of what you are measuring, and a lot of negotiation with the vendors: "Well, I don't have the data for it." "Why don't you have the data for it?" "Because you are aggregating it." We spend a lot of time with vendors, focusing on those areas. In some cases, if things are not automated, we [use] people to get the data.

Isn't the quality of the project specific to the client's situation as well?

For network infrastructure, hosting, data centers, the basics, you should have a standardized way of managing them. You also have tools, like HP OpenView: whether the CPU is up or down, whether the network is up or down.

It becomes more difficult as you become more people-centered. Move up to application development, and it becomes a little more complex, because the interpretation of requirements differs. How do you manage the content metric? It can be standardized, but it is a continuum. You want to apply standardized methodology to an extent, but there is no way to take people out of managing people.

I think that is where it comes down to. You want to arm yourself with a tool, and as the client, you want to be the integrator of the integrator. You want to be the one to have the most knowledge about what is going on, so you can effect change. So, it is more about information to effect change and to effect quality than information to not pay or fire a vendor. That is a difficult thing.

I associate that with the problem we have in a sales force. Salespeople don't like to give metrics. Try to measure how many calls they make, and they will revolt. Vendors have similar challenges. They really need to feel warm and comfortable, and [understand] how it is going to benefit them too. The glass pipeline visibility is for them too. So it's about trying to move things along: the last 10 percent that doesn't get done.

Of the 3% to 12% of fees applied to management of outsourcing engagements, what percent would it take to hire Blue Canopy?

I think if it were build and operate, not transfer, I would stipulate 15 to 20 percent. I think it is higher on the implementation side. If they don't invest in us, they need to invest in building the capabilities so that they can continuously work on that.

What we are providing through a multi-client study, essentially, is: how does your performance compare to your competitors' on an aggregated basis? Say you have moved from point A to point B in 18 months, and you may be very happy with that. But maybe your competitors have moved from point A to point C much faster and at a higher level. What are the key root causes of that?

So we baseline operational parameters against organizational practices.

Any last thoughts?

First, outsourcing is not a new concept. Internal organizations are, in my opinion, the table stakes. Internal organizations still struggle to survive. An organization can derail a collaborative effort. You have your problems with this person and you don't share the data, and these things get compounded in large outsourcing [projects]. When you go to India, those guys know that they can say crap about anything, and you are not going to fly over there and wring their neck. They know that. They have two responses: "Yes, I'll get it done." "Yes, get out of my sight." You need to understand those dynamics. An upfront investment in people, an alignment of goals, all those things become very important.

Second, we strive to tell people not to wear any vendor badges. That is what we did at General Motors. We said to IBM and EDS, "Your success depends on us and your failure depends on all of us. What do you want to be part of? So, let's work together." High-level executives did a deal; they made a sandbox for us. So I think we talk about a lot of that to sort of get that relationship off the ground.

Lastly, get objective data. It is the biggest tool a client can have. Focus on it. Spend time on it. A performance dashboard is nothing but an interpretation of that data, whether in Excel, Web-based or on a PDA. If you have the right decision-making data, then you can effect change.