The real cost of lying about IT infrastructure costs

Are you hiding the truth from management to justify the money you need to operate your infrastructure? That's a dangerous game

InfoWorld | Apr 19, 2010

It's hard enough to build a solid primary storage strategy for a complex server environment, but that's nothing compared to the effort required to get your proposal through your organization's budgeting process. A refreshed storage architecture or a new backup infrastructure requires loads of capital, which generally means you have to get creative when you articulate requirements to management.

Yes, some sort of song and dance is almost always required. On the other hand, part of the problem is that we seldom do a good job educating management about how our systems live and breathe and what they actually cost to run. Obfuscation isn't in anyone's interest in the long run -- especially as the cloud matures and becomes a viable alternative to some segments of in-house infrastructure.

One reason this disconnect occurs is that management fails to grasp the relationship between the cost and criticality of a given application and its actual technical requirements -- storage or otherwise. Time and time again I have seen huge, massively expensive, mission-critical applications sit on equally huge and massively expensive storage infrastructures that are essentially idling under low load. Meanwhile, elsewhere in the data center, a collection of cheap tertiary applications are burning their storage resources to a crisp with loads many times that of the critical systems.

The reason this happens can usually be traced back to the budget process. When management is told that a mission-critical application will cost millions of dollars to acquire and implement, there's rarely much in the way of pushback over a beefy infrastructure to match. In fact, it's expected.

Conversely, companies often purchase $5,000 or $10,000 applications without giving much thought to the effect they'll have on infrastructure. Yet if you do your homework and thoroughly test these applications up front, you'll often find that these "little" applications use far more resources than the "big" ones do. While management may not blink at a request for $100,000 in storage capital for a $750,000 software project, you can expect a ton of resistance if you're looking for $50,000 of storage capital for a $10,000 software project.

That's how the budgetary two-step was invented. IT departments everywhere use big-ticket projects to justify the purchase of infrastructure those projects don't need -- in order to feed the needs of little projects that nobody would spend enough capital on otherwise.

Is that really such a bad thing, you ask? Management feels it has spent its money wisely, while IT gets the resources it needs to deliver solid performance and reliability back to the business. It's almost as if IT is doing management a favor by quietly doing the right thing -- seems like a win-win.

Unfortunately, this kind of budgetary shell game has serious drawbacks. By obscuring the "true" cost of running your infrastructure, you perpetuate a decision-making process that consistently yields poor, uninformed choices. It becomes all but impossible to adopt a chargeback system that reflects the actual cost of what IT does. And it feeds into the constant struggle between IT and management over what's truly worth spending money on -- especially when it comes to allocating manpower to manage infrastructure and applications. When you get right down to it, if you constantly feel your department is overworked and is never given enough time to finish anything, look in the mirror. Being less than brutally honest about what things actually cost probably caused that situation.
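To make the chargeback idea concrete, here's a minimal sketch of how shared storage cost might be split across applications by measured usage instead of by which project's budget happened to pay for the array. All application names, figures, and the capacity/IOPS weighting are hypothetical illustrations, not anything from the column:

```python
# Hypothetical chargeback sketch: split a shared storage bill across
# applications by a blend of capacity consumed and I/O load generated.
# Names, numbers, and the 50/50 weighting are illustrative assumptions.

def chargeback(apps, total_annual_cost, capacity_weight=0.5, iops_weight=0.5):
    """Return each app's share of the annual storage cost, weighted by
    its fraction of total capacity (TB) and total average IOPS."""
    total_tb = sum(a["tb"] for a in apps.values())
    total_iops = sum(a["iops"] for a in apps.values())
    charges = {}
    for name, a in apps.items():
        share = (capacity_weight * a["tb"] / total_tb
                 + iops_weight * a["iops"] / total_iops)
        charges[name] = round(total_annual_cost * share, 2)
    return charges

apps = {
    # "Big" mission-critical app: lots of capacity, mostly idling
    "erp": {"tb": 40, "iops": 800},
    # "Little" departmental app: tiny footprint, hammering the array
    "dept_tool": {"tb": 2, "iops": 6000},
}
print(chargeback(apps, total_annual_cost=100_000))
```

Even this crude model surfaces the pattern described above: the small application ends up owing nearly half the storage bill, which is exactly the kind of number the budgetary shell game hides from management.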

Of course, the alternative to obfuscation is not particularly pleasant, either. It takes time, effort, and a thick skin to disclose the real cost of running an IT infrastructure as it relates to applications and services. But if you do it right and management is receptive, far fewer lamentable purchasing decisions will be made.

Moreover, as the pay-as-you-go cloud matures, the budgetary shell game will come to a halt. If management doesn't know what things actually cost to run, it may decide to outsource big, mission-critical applications, leaving you with peanuts to run the rest of your infrastructure -- and a whole lot of explaining to do.