How to Kill a Thousand Data Centers

The U.S. federal government is shutting down more than 1,000 data centers, and enterprise managers can learn valuable lessons from what’s gone right and what’s gone wrong.

The goal of the Federal Data Center Consolidation Initiative (FDCCI) is to close 1,196 of the government’s 2,900 data centers by 2015, with the hope of realizing $2.4 billion in cost savings, according to a July 2012 report issued by the U.S. Government Accountability Office (GAO). In addition to cutting costs, the program aims to promote the use of green IT by reducing energy consumption and the government’s data center footprint.

All 24 federal government agencies that are members of the Federal CIO Council are participating in the program, with mixed success. The July 2012 GAO report concluded that while much progress has been made, work remains on asset inventories and consolidation planning.

Mixed success

Having worked on the original FDCCI guidelines as a contractor, Nikolay Bakaltchev, now a principal in CSC’s federal consulting practice, has insights into the program’s inner workings. He agrees with the GAO’s assessment that the project is moving in the right direction, and adds, “I have no doubt it will reach its goals, but the question is more on the schedule and the pace of it. Still, I think the direction is clear, and eventually FDCCI will meet its objectives.”

“By shutting down and consolidating underperforming data centers and optimizing the data centers in our federal inventory, we stand to save taxpayers billions of dollars and curb spending on underutilized infrastructure,” says Steven VanRoekel, the U.S. federal CIO. “This means a shift from a model that risks procuring duplicative and wasteful infrastructure that utilizes only a fraction of the computing power purchased to a newer model, where that risk is reduced as the government purchases IT infrastructure as a service, deployed in a scalable and rapid fashion.”

The government intends to close 525 data centers by the end of 2012, according to the GAO report. Isolated success stories have been reported at sub-agencies such as the Bureau of Indian Affairs, which used virtualization technology to close 11 data centers within one fiscal year. Still, there are reports that some agencies are struggling to compile accurate asset inventories.

Bakaltchev says enterprises undergoing a data center consolidation, or any other substantial IT project that involves facilities and infrastructure, need to get back to the basics before doing anything else. This starts with using automation to nail down exactly what you have in your inventories. “Unfortunately, it’s not easy to track equipment and applications, with all the ongoing changes to servers, storage and networks,” Bakaltchev says.

“Still, you need to use automation to discover your inventory, identify what you have, determine what applications are running on it and their dependencies, and clarify what your strategy is for moving those applications.”
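The discovery step Bakaltchev describes can be sketched in a few lines. This is a minimal, hypothetical illustration, not any agency's actual tooling: it assumes automated scanners emit simple per-host records, merges them into one inventory, and flags assets that only a single scanner has seen, since those are the records most likely to be stale or wrong. All hostnames, scanner names, and applications here are invented for the example.

```python
# Hypothetical sketch: merge automated discovery output into one
# inventory and flag weakly verified assets. All names are illustrative.
from collections import defaultdict

def build_inventory(scan_results):
    """Merge per-scan records into an inventory keyed by hostname,
    deduplicating the applications seen on each host."""
    inventory = defaultdict(lambda: {"apps": set(), "sources": set()})
    for record in scan_results:
        host = record["hostname"].lower()  # normalize case across scanners
        inventory[host]["apps"].update(record.get("apps", []))
        inventory[host]["sources"].add(record["scanner"])
    return dict(inventory)

def flag_unverified(inventory, min_sources=2):
    """Assets seen by fewer than `min_sources` independent scanners
    warrant a manual check before any consolidation decision."""
    return sorted(host for host, rec in inventory.items()
                  if len(rec["sources"]) < min_sources)

# Example: two scanners report overlapping views of the estate.
scans = [
    {"hostname": "app-01", "scanner": "net-scan", "apps": ["payroll"]},
    {"hostname": "APP-01", "scanner": "agent",    "apps": ["payroll", "ldap"]},
    {"hostname": "db-07",  "scanner": "net-scan", "apps": ["oracle"]},
]
inv = build_inventory(scans)
print(flag_unverified(inv))  # db-07 was seen by only one scanner
```

The point of the sketch is the cross-checking: any single discovery tool misses things, so reconciling multiple sources, and flagging disagreements for human review, is what turns raw scans into an inventory you can plan a migration against.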

Improved efficiency

Facilities management is also a key issue that needs to be addressed. Relatively young technology companies such as Google and Facebook host massive data centers that are energy efficient because they were designed and built fairly recently.

Conversely, many of the federal government data centers were built decades ago or were even converted from legacy office buildings, so they are quite inefficient in terms of energy use.

Typically, federal agencies are responsible for complex mission applications that have multiple sources feeding information into the data center and multiple client organizations consuming that information. This makes the dependencies of some applications on specific server, storage and network configurations quite complex. So, Bakaltchev says, a major challenge for managers is to map these application dependencies accurately and plan any consolidation accordingly, while ensuring no disruption in the quality of mission services their agency provides.
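The dependency-mapping problem Bakaltchev describes is, at its core, a graph problem: model each application and the systems it relies on as nodes and edges, and then any migration wave must include everything an application transitively depends on. The sketch below is an illustrative toy, with invented application names, not a depiction of any agency's environment.

```python
# Hypothetical sketch: compute the transitive closure of an app's
# dependencies so a migration wave never strands something it needs.
# Application and system names are invented for illustration.

def dependency_closure(app, deps):
    """Return the app plus everything it transitively depends on,
    via an iterative depth-first walk of the dependency graph."""
    seen, stack = set(), [app]
    while stack:
        node = stack.pop()
        if node not in seen:
            seen.add(node)
            stack.extend(deps.get(node, []))
    return seen

# Example: a reporting app that reads a mission app, which in turn
# relies on a shared database and a directory service.
deps = {
    "reporting":   ["mission-app"],
    "mission-app": ["shared-db", "auth-ldap"],
    "shared-db":   [],
    "auth-ldap":   [],
}
print(sorted(dependency_closure("reporting", deps)))
# everything 'reporting' needs must move with it, or before it
```

In practice the graph comes from the discovery data itself (observed network connections, configuration files), and the closure tells planners which systems must be migrated together or sequenced so that no mission service loses a dependency mid-move.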

The GAO report found that dealing with cultural change is also posing a problem at some agencies. There are employees who are simply resistant to change. Bakaltchev says the No. 1 challenge from a business standpoint is for the organization to be the driver of the consolidation, and typically that would be led by the agency’s CIO.

“The key factor is to define the processes in a win-win arrangement, meaning that it wouldn’t work well as a top-down command structure, but it might work much better by using data center consolidation to design and deliver new services to the CIO’s constituents in a shared services offering,” Bakaltchev says. “It’s like the CIO is offering new high-quality shared services to the organization and providing cost incentives for its components to move to centralized, new data centers.”
