Virtualization is on the rise as organizations combat data center sprawl

Few data centers today have not implemented at least some level of virtualization. Organizations are turning to it to reduce costs, free up floor space, and save on power and cooling.

For most organizations, the first parts of the data center to be virtualized are servers and storage. According to Tier1 Research, about 70 percent of organizations will have virtualized servers in their data centers by 2013, thanks to technology maturation and market acceptance. Storage virtualization is also growing rapidly. Virtualizing storage pools all drives into one centrally managed resource and allows for more consistent management.

But that’s only the tip of the iceberg. Now that server and storage virtualization are more or less commonplace in the data center, organizations hope to gain similar benefits by virtualizing other parts of the environment, such as the network and other data center components.

“Virtualization is the catalyst that will allow the data center to be seen as a system instead of individual parts,” said Dave Ryan, chief technology officer and vice president of General Dynamics Information Technology's Navy and Air Force division. “In five years, you won’t look at a server and wonder how much capacity it has or wonder how much storage is in a single storage-area network. Instead, you’ll look at the entire data center as a comprehensive system. It’s a technical bridge we’re still crossing, but virtualization is what will enable it.”

The idea of the data center as a system is one that has many vendors excited. It’s not easy because it requires not only significant virtualization but also coordination of everything in the data center — applications, servers, networks and storage — in a fully automated way.

Jason Schafer, a senior analyst at Tier1 Research, said the industry is getting there.

“Within three to six years, the virtualized/hypervisor layer of the data center will be integrated with the physical/power and cooling layers,” he said. “When you bring them together, you get a level of intelligence and functionality that you wouldn’t otherwise have.”

Part of what will enable this coordination is something called data center infrastructure management (DCIM) — a system of software, hardware and sensors that enables real-time monitoring and management of the data center environment through aggregated metrics, analytics, controls and automation.

Take the simple example of a belt failing in an air conditioner. The air conditioner senses the failure and communicates the loss of cooling in that sector of the data center. The system could then migrate the load of the virtual machines in that sector to another sector until the issue is resolved.
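As an illustration only, the failover behavior described above could be sketched in a few lines of Python. The `Sector` class, the `rebalance` function and the least-loaded placement rule are all hypothetical stand-ins, not part of any real DCIM product:

```python
# Hypothetical sketch of DCIM-style automation: a cooling failure in one
# sector triggers migration of its virtual machines to a healthy sector.
from dataclasses import dataclass, field

@dataclass
class Sector:
    name: str
    cooling_ok: bool = True
    vms: list = field(default_factory=list)

def rebalance(sectors):
    """Move VMs out of any sector that has lost cooling."""
    healthy = [s for s in sectors if s.cooling_ok]
    for s in sectors:
        if not s.cooling_ok and healthy:
            # Pick the least-loaded healthy sector (illustrative policy).
            target = min(healthy, key=lambda h: len(h.vms))
            target.vms.extend(s.vms)   # stand-in for live migration
            s.vms.clear()

a = Sector("A", vms=["vm1", "vm2"])
b = Sector("B", vms=["vm3"])
a.cooling_ok = False   # the air conditioner reports the belt failure
rebalance([a, b])
print(b.vms)  # sector B now carries all three workloads
```

In a real deployment, the sensor event would arrive over a monitoring protocol and the migration would be handled by the hypervisor layer; the sketch only shows the decision logic.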

“It will make the data center as a whole a machine, rather than a bunch of individual parts,” Schafer said. “And you can’t have that level of sophistication without virtualization.”

Beyond the work that remains at the technology level, there is one big problem to solve: the physical data center itself. Traditionally, the answer to rising demand for resources was simply to expand data center capacity. However, such a facility can take two years or more to build, and time is money.

One way to reduce time and cost is to use prefabricated modular components, such as the power and cooling infrastructure. Schafer said modularization at this level can cut data center build time from two years to four to six months.

The next step is installing the analytics needed to give data center managers automated, intelligent operations and controls. Along with that comes trend analysis, which lets managers see peaks and growth rates over time and, in turn, do a better job of planning for future capacity.

“You can easily see, for example, that over the past three months you have increased capacity by X amount. If you maintain that same growth rate, you can easily determine that you will be over capacity in nine months,” Schafer said. “They can use that intelligence to help stay ahead of the game.”
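Schafer’s back-of-the-envelope projection is simple arithmetic, and could be sketched as follows. The function name and the linear-growth assumption are illustrative, not drawn from any real capacity-planning tool:

```python
import math

def months_until_full(used_now, used_3_months_ago, total_capacity):
    """Project when steady linear growth exhausts capacity.

    Illustrative only: assumes the growth rate observed over the past
    three months continues unchanged.
    """
    monthly_growth = (used_now - used_3_months_ago) / 3
    if monthly_growth <= 0:
        return math.inf  # no growth: capacity is never exhausted
    return (total_capacity - used_now) / monthly_growth

# E.g., usage grew from 55 to 64 units over three months, capacity is 91:
print(months_until_full(64, 55, 91))  # 9.0 months of headroom left
```

A real DCIM trending module would fit the growth curve from continuous telemetry rather than two samples, but the underlying estimate is the same.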

In the end, the goal is to create more agile, flexible and scalable data centers that function as on-demand environments. By using virtualization and next-generation data center automation, along with eliminating bottlenecks such as the physical data center and lack of analytics, that goal might only be a few years away.

About this Report

This special report was commissioned by the Content Solutions unit, an independent editorial arm of 1105 Government Information Group. Specific topics are chosen in response to interest from the vendor community; however, sponsors are not guaranteed content contribution or review of content before publication. For more information about 1105 Government Information Group Content Solutions, please email us at GIGCustomMedia@1105govinfo.com.