In this post we’ll discuss the term “availability” as it applies to the data center world, and give some perspective on the notion of the number of “nines.”

Availability versus Reliability

Let’s first talk about the term “availability” and how it differs from the better-understood term “reliability.” Availability is a probability (though a…
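Since availability is a probability, the familiar "nines" translate directly into expected downtime per year. As a rough illustration (my own sketch, not from the post — it assumes the common convention that N nines means an availability of 1 − 10⁻ᴺ):

```python
# Sketch: converting "nines" of availability into expected annual downtime.
# Assumes N nines = availability of 1 - 10**-N (the usual convention).

MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600 minutes in a non-leap year

def downtime_minutes_per_year(nines: int) -> float:
    """Expected unavailable minutes per year for a given number of nines."""
    availability = 1 - 10 ** -nines
    return (1 - availability) * MINUTES_PER_YEAR

for n in range(2, 6):
    print(f"{n} nines: {downtime_minutes_per_year(n):,.2f} min/yr")
# e.g. three nines (99.9%) allows about 525.6 minutes (~8.8 hours) of
# downtime per year; five nines allows only about 5.3 minutes.
```

The steep drop between each level is why each additional nine tends to cost far more than the last.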

Remember bell-bottom jeans, platform shoes, or puka shell necklaces? I mean, of course, the first time we had bell-bottom jeans – not that silly three-month period in the ’90s. I had some of those back then. They were “groovy” and I was really “digging it.” I don’t dress that way anymore (even though I…

Those following this blog will know that we often discuss data center criticality designations, particularly the four-tier classification system of the Uptime Institute. One of the issues on which we counsel our clients is the distinction between the Uptime Institute “Tier III” and the TIA-942 “tier 3” designations (and likewise for the other tier levels).

While in Frankfurt this week, I attended the Data Center Dynamics Converged event, a day-long conference with two halls covering data center topics across the stack, from basic infrastructure and energy-efficient design to business-process methods for data center management.

For some time now, we’ve been writing about how traditional methods of identifying stranded IT assets fall far short, because utilization-based metrics do not accurately reflect the value an IT asset returns to the business. Enterprises are living with a substantial drag on their IT operations budgets because of unused or underused servers and server software.

The program at Data Center Dynamics Converged in Dallas, TX this week was full of topics related to the problem of diminishing data center capacity, and the difficulty data center operators face in managing it.

Operating the data center consumes the bulk of most enterprise IT budgets. Most firms will attest that roughly 80% of the annual IT budget goes to “keeping the lights on,” and only 20% to “projects” or “innovation.”

The initial Unit Under Test (UUT) for the Wild Server Project has successfully emerged from its first day in the wild. The HP DL380 G4, named Ashley, is the inaugural test subject for the project, which aims to demonstrate how data center volume servers fare under completely uncontrolled environmental conditions.

The architecture of a building can impact the availability of a data center, regardless of infrastructure topology investments. Building architecture imposes constraints on the design of a data center. Seldom do we have a nice rectangular box with easement in all six adjacent directions. For urban data centers, this issue is especially apparent. It would…