
Network-attached storage devices are a common tool for saving files throughout the government. The objective of NAS is to streamline storage: instead of adding storage to every server, NAS lets multiple servers share storage provided by NAS appliances connected to the network. NAS devices generally support several file-sharing protocols, so people can more easily access their files.

But NAS can create administrative problems. In time, organizations may accumulate multiple NAS units from different manufacturers, or they may own so many systems from one vendor that they must establish several separate file systems. To address those management challenges, vendors have created NAS virtualization products.

The products give users a unified and stable view of where the systems store files, even though the files may be physically located on different vendors’ storage systems and transferred as needed.

Specialized companies such as Acopia Networks, BlueArc and NeoPath Networks offer NAS virtualization. Joining them are NAS device manufacturers. For example, EMC offers its Rainfinity storage virtualization product, which it acquired in 2005. Network Appliance (NetApp) offers its V-Series virtualization platform and Virtual File Manager products. The typical product offering is an appliance that attaches to the network between the users and the NAS devices.

Those vendors seek to do more than tame mixed NAS environments with their virtualization wares. Industry executives say their products can also ease data migration and provide a steppingstone to tiered storage, which involves placing data on the most cost-effective layer of storage.

But as promising as the technology appears, anecdotal evidence suggests that agencies are not flocking to NAS virtualization; some vendors report only a handful of government sales.

Douglas Hughes, a service engineer who helps run a storage service at NASA’s Jet Propulsion Laboratory, said he hasn’t looked at storage virtualization recently. JPL Information Services’ storage utility uses a number of NetApp NAS devices.

The situation is similar at the San Diego Supercomputer Center, where a spokeswoman said virtualization technology is interesting, but the center is not using it.

Jeff White, a technical specialist at CDW Government, agreed that customers aren’t yet clamoring for NAS virtualization. “We haven’t seen a whole lot of demand for it,” he said. “At the end of the day, it’s almost kind of a luxury product.”

In an era of budget cuts, storage spending focuses instead on items such as primary storage, disaster recovery and security, White said.

Nevertheless, TheInfoPro reports that interest in NAS virtualization among enterprise information technology buyers is growing quickly, and the firm notes that managers are starting to feel pain from trying to keep pace with the growing need for storage.

The case for virtualization
Storage consultants and vendors suggest several reasons for using NAS virtualization products.

The technology works by creating a layer that masks the physical location of data. Client devices and servers are no longer mapped to specific physical storage devices.

Instead, the virtualization appliance maps the physical location of data to a logical address — the one a user employs to access a file. Administrators can create policies to govern the management of data in the virtualized environment.
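The mapping described above can be sketched in a few lines of code. This is a minimal illustration, not any vendor's implementation; all class, path and device names are hypothetical.

```python
# Sketch of the virtual-to-physical mapping a NAS virtualization
# appliance maintains. Clients address files by logical path only;
# the layer resolves each logical path to a real device and location.

class VirtualizationLayer:
    def __init__(self):
        # logical path -> (physical device, physical path)
        self.mapping = {}

    def publish(self, logical_path, device, physical_path):
        """Expose a file under a stable logical address."""
        self.mapping[logical_path] = (device, physical_path)

    def resolve(self, logical_path):
        """Look up where the data actually lives right now."""
        return self.mapping[logical_path]


layer = VirtualizationLayer()
layer.publish("/home/alice/report.doc", "nas-vendor-a", "/vol1/alice/report.doc")
print(layer.resolve("/home/alice/report.doc"))
# -> ('nas-vendor-a', '/vol1/alice/report.doc')
```

Because clients only ever see the logical path, administrators can change the right-hand side of the table without disturbing users.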

“The premise of virtualization is breaking the physical binding between the front end and the back end,” said Brendon Howe, senior director and general manager of NetApp’s V-Series business unit.

With virtualization, users or programs can tap a drive on a network to access files “without knowing physically where the drive resides or how many [file storage devices] it is on,” said Kirby Wadsworth, senior vice president of marketing and business development at Acopia.

Storage administrators also benefit. By creating a single storage pool, virtualization harmonizes heterogeneous NAS environments. White called this management boon the biggest benefit of NAS virtualization.

“Instead of having 40 different file servers…this gives you one management console for all of them,” he said. Virtualization, he added, creates one homogeneous file system out of the multiple file systems found in different NAS boxes.

To accomplish this, some vendors offer global namespace management, a feature that provides a single view of file systems spanning multiple or mixed NAS devices.

Jack Norris, EMC’s vice president of marketing for Rainfinity, called global namespace “one of the key building blocks of file virtualization.” He said a namespace functions in much the same way as the Internet’s Domain Name System, which converts domain names into IP addresses.

Similarly, “the function of a namespace is to provide an abstraction layer so that end users are not tied to a physical address but are accessing a logical name,” Norris said.
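The DNS analogy can be made concrete with a toy global namespace: exports from several NAS devices appear as one logical tree, and a lookup resolves a logical name to a physical address the way DNS resolves a domain name. Device names and paths here are invented for illustration.

```python
# Hypothetical global namespace spanning three vendors' NAS devices.
# Each logical prefix maps to (device, export path) on a real box.

namespace = {
    "/corp/finance": ("nas-emc-01", "/export/fin"),
    "/corp/eng": ("nas-netapp-02", "/vol/eng"),
    "/corp/archive": ("nas-bluearc-03", "/fs/old"),
}

def lookup(logical_path):
    """Resolve a logical path to (device, physical path), DNS-style."""
    for prefix, (device, export) in namespace.items():
        if logical_path.startswith(prefix):
            return device, export + logical_path[len(prefix):]
    raise FileNotFoundError(logical_path)


print(lookup("/corp/eng/src/main.c"))
# -> ('nas-netapp-02', '/vol/eng/src/main.c')
```

End users browse one tree under `/corp`; which vendor's hardware actually serves a given subtree is invisible to them.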

The ability of virtualization to harmonize multivendor NAS settings eases management and opens purchasing options. Norris said virtualization lets customers buy hardware with the best price and performance rather than sticking with the same brand of equipment.

Specific benefits
Howe said many of the issues NAS virtualization seeks to address are identical to those that users experience with storage-area networks, the specialized networks that connect servers to storage devices for the exchange of low-level pieces of data called blocks. SAN virtualization products have been available for a few years and operate on a basis similar to NAS virtualization.

In the case of migration, virtualization lets organizations take files off an old NAS box and move them to a new machine without disrupting users, Wadsworth said. Migration occurs in the background, because users remain attached to the virtual presence of the file during the process, he added.

Acopia’s virtualization technology copies the file contents from the old storage device to the new device. When the copy is complete, the new physical location of the file is updated in the appliance’s virtual-to-physical mapping tables, but the virtual address remains the same, Wadsworth said. If users accessed their files on their G drive, they continue to do so.
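The migration sequence Wadsworth describes — copy the contents in the background, then repoint the mapping while the virtual address stays fixed — can be sketched as follows. This is a simplified illustration under invented names, not Acopia's actual mechanism.

```python
# Sketch of a background migration: users keep reading the file at its
# logical address while the contents are copied; only afterward is the
# virtual-to-physical table flipped to the new device.

def migrate(mapping, logical_path, new_device, new_physical_path, copy_file):
    old_device, old_path = mapping[logical_path]
    # 1. Copy in the background; reads still resolve to the old entry.
    copy_file(old_device, old_path, new_device, new_physical_path)
    # 2. Update the table; the logical address users see never changes.
    mapping[logical_path] = (new_device, new_physical_path)


mapping = {"/G/reports/q1.doc": ("old-nas", "/vol0/q1.doc")}
migrate(mapping, "/G/reports/q1.doc", "new-nas", "/vol7/q1.doc",
        copy_file=lambda *args: None)  # stand-in for the real data copy
print(mapping["/G/reports/q1.doc"])
# -> ('new-nas', '/vol7/q1.doc'); the G-drive path itself is untouched
```

The point of the sketch is the ordering: the mapping flips only after the copy completes, so at no moment does the logical path point at missing data.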

“The administrator is free to relocate data at any time without having to worry about the impact on end users,” Norris said. He said some virtualization customers have reported performing migrations in one-tenth the time they would normally take.

“The adoption curve is really starting to pick up,” Norris said. He cited an example of a government agency using Rainfinity to migrate more than 33,800 file systems.

“But it’s not at the point where everyone has completed a file virtualization deployment,” Norris added.

Wadsworth agreed that NAS virtualization deals are happening. But as for the scale of adoption, “I wouldn’t say it was widespread,” he said.

Norris said he expects the market to grow rapidly in the next year.

White said he thinks the demand for NAS virtualization will eventually materialize.

“I think it’s a great idea, but demand isn’t there for it yet,” he said.

Virtual convergence?

The worlds of network-attached storage virtualization and storage-area network virtualization will converge in the next few years, some industry executives say.

The integration of NAS and SAN gear is already under way. Single gateways that allow users and applications to access file-level NAS and block-level SANs have been available from several vendors for a few years.

Brendon Howe, senior director and general manager of Network Appliance’s V-Series business unit, said he believes a similar merger of separate NAS and SAN virtualization products will also occur.

“NAS virtualization, when considered by itself, is a technology…that doesn’t have a unique set of customer problems versus SAN virtualization,” Howe said. He said the argument for having distinct, stand-alone NAS and SAN virtualization platforms will diminish in time.

Ashish Nadkarni, principal consultant at storage consultant GlassHouse Technologies, said creating a unified virtualization product is doable. But he added that vendors will have to consider performance issues and whether customers will want to put all their virtualization eggs in one basket.

Market prospects

Network-attached storage virtualization may not be red hot in the government, but storage market watchers suggest the overall market is gaining momentum.

TheInfoPro, a New York-based market research firm, reported that adoption has doubled in recent months. A survey of Fortune 1,000 storage managers conducted in fall 2005 pegged the file virtualization base at 7.5 percent. By spring 2006, 14.2 percent of the storage professionals polled said they had file virtualization in use.

TheInfoPro’s heat index, which tracks spending commitments, ranks file virtualization as sixth out of the 16 technologies the company monitors. File virtualization ranked toward the bottom of earlier assessments.

File virtualization, meanwhile, stands at 16 out of 20 in TheInfoPro’s adoption index. Robert Stevenson, a managing director at TheInfoPro, said a technology that ranks high on the heat index and low on the adoption index “shows a lot of room for growth.”

Stevenson said his firm is seeing considerable interest in file virtualization. “Clearly, people have a lot of data mobility challenges, and file virtualization helps with that,” he said.

Kirby Wadsworth, senior vice president of marketing and business development at Acopia Networks, identified large NAS deployments as a virtualization sweet spot. In organizations with hundreds of users who access a home directory, administrators can change the storage environment without taking all the users off-line.

Wadsworth also cited performance-intensive applications that involve accessing large files, such as satellite image analysis.

Anand Iyengar, founder and chief technology officer of NeoPath Networks, said virtualization is also valuable to organizations that are upgrading to new storage platforms. He said the company’s File Director product provides for a nondisruptive migration path. Iyengar said an application seeking to access files won’t notice that the data has moved from one storage server to another.
