A new generation of flexible storage technologies is helping organizations get a handle on their ever-growing data flows and put that data in the hands of users.

Early in 2012, the company started looking into its options, and quickly set its sights on cloud storage provider Box. By March, it was testing the service with four proof-of-concept projects in which large files were being shared among designers and construction crews.

The tests were so successful that the company signed up for a 50-user business license, which grew to 180 users in three months. That enthusiastic adoption spurred the company to negotiate a 450-user enterprise agreement, which also enables the firm to provide unlimited access to files for external users, such as contractors and client owners.

Initially, the use of Box storage was going to be limited to data related to current projects, but that strategy was scratched in the face of user backlash.

"We were going to save our older projects in our data center and not go through the process of migrating them," says Babcock. "But we had requests from the operational groups saying, 'No, we want that data.'"

As a result, Babcock and her team are steadily migrating about 4 terabytes (TB) of data from that aging document management system. Meanwhile, independent of that migration, her last check of the Box dashboard indicated that 500MB of data is being added each day.

Babcock says it's hard to know what other types of data Weitz might one day opt to store in the cloud, but one thing's certain: "Any system we're looking to replace now," she says, "we're looking into whether it integrates with Box."

When that time comes, using cloud storage will enable Weitz to provision (or de-provision) storage resources in seconds, as business needs change.

Software-Defined Storage

Washington Trust Bank's plans to roll a virtual desktop infrastructure deployment into a migration to Windows 7 had a practical and seemingly sound component: The bank would support the VDI with its existing physical storage.

With 5TB of physical storage allocated to nearly 450 knowledge workers targeted for virtual desktops, the Spokane, Wash.-based regional bank had plenty of room to accommodate data. But what Washington Trust quickly learned was that its storage setup lacked the required input/output capabilities—a shortcoming that would require an additional $100,000 in storage hardware to address, says Chris Green, vice president of IT infrastructure systems. Adding costs of more than $200 per user to a project with a total budget of $600 to $700 per user wasn't feasible.

"I thought it might kill the project," Green recalls.

Online research of the issue led Green to Atlantis Computing software, which optimizes storage hardware to support virtual environments. When tests of the company's software showed huge reductions in the volume of processing loads running through the bank's new virtual environment, Washington Trust negotiated a 450-user license and moved into production.

The bank immediately registered an 80 percent reduction in input/output operations per second (IOPS). Subsequently, there was a 50 percent reduction in the drain on the bank's storage processors.

As a result, Washington Trust now has a VDI that's running faster and more efficiently, and it also has been able to free up storage resources for other systems. All of this cost the bank just $75 per user, or about one-third of what it might have spent on additional storage hardware.
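The savings can be sanity-checked from the figures quoted above. A minimal sketch, using only the dollar amounts in the article (the per-user hardware figure is derived from the $100,000 estimate):

```python
# Back-of-the-envelope check of Washington Trust's VDI storage costs,
# using only the figures quoted in the article.
USERS = 450
EXTRA_HARDWARE = 100_000   # quoted cost of the additional storage hardware
PER_USER_ACTUAL = 75       # quoted per-user cost of the software approach

hardware_per_user = EXTRA_HARDWARE / USERS   # "more than $200 per user"
software_total = PER_USER_ACTUAL * USERS     # total cost of the 450-user license

print(round(hardware_per_user))              # -> 222
print(software_total)                        # -> 33750
print(software_total / EXTRA_HARDWARE)       # -> 0.3375, roughly one-third
```

The numbers line up: roughly $222 per user for hardware versus $75 per user for software, which is where the "about one-third" comparison comes from.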

"Not spending that money either increases your profitability or allows you to take advantage of other opportunities," Green points out. "Our engineers are freed up to work with our business units to move the needle, rather than having to manage all those desktops."

Big Data-Optimized Storage

The onrush of the big data era has forced many data-rich organizations to optimize their storage environments to contend with the volume, velocity and variety of data coming at them. Then there are others, such as the University of North Texas, that unwittingly readied themselves for big data even before the term was coined.

Back in early 2007, the huge Dallas-Fort Worth school moved from Novell GroupWise to a Microsoft architecture running Exchange, and deployed PeopleSoft with new Oracle databases on the back end. These moves more than doubled the school's storage requirements from 14TB to nearly 30TB, and accommodating that increase (which was sizeable at the time) was going to require a huge investment in additional storage hardware, says Monty Slayton, the university's IT manager.

And it wasn't just space the school needed. Its storage systems at the time relied on a single class of high-end drives designed to store frequently accessed data. It had no data progression capabilities that would allow older data to be stored on less expensive hardware.

"There was no thought given to the way data aged," Slayton recalls.

The school decided to invest in a storage platform that could grow modularly to accommodate the anticipated mushrooming of data. It chose to go with arrays from Compellent Technologies (since acquired by Dell), which also offered automated tiering of data.

The decision proved well-timed: the anticipated exponential growth of data arrived as predicted. Steady growth in enrollment pushed the school's student population to nearly 36,000 last fall.

In addition, regulations progressively lengthened the time during which schools are required to retain student and faculty data. Finally, new kinds of unstructured data, such as social media feeds, placed additional demands on storage resources. As if that wasn't enough of a challenge, the school opened a new South Dallas campus, which created additional data storage needs.

But what really tested the school's storage environment was the massive growth in the use of video. Whether it's the fast-growing collection of HD video of athletic practices and games, or the larger role video plays in the school's marketing campaigns, video has steadily pushed each department's storage requirements upward.

"The size of the files and the density of things have increased exponentially," says Slayton. "It's rare that anyone asks for less than 1TB of data."

The growing assortment, and volume, of data has tested the university's Compellent arrays. In the few years since they were deployed, the school has upgraded to faster and more powerful controllers, speedier Fibre Channel storage networks, and a larger number and wider variety of disks.

Today, the school manages nearly 2 petabytes of tiered storage. Some 80 percent of that is third-tier long-term storage, while just 5 percent consists of first-tier high-availability disks.
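Those tier percentages translate into concrete capacities. A minimal sketch based on the figures above; note that the middle tier's share is inferred as the remainder, which is an assumption rather than a quoted figure:

```python
# Rough breakdown of the university's ~2PB of tiered storage, using the
# percentages quoted in the article. The middle tier's share is inferred
# as the remainder (an assumption, not a quoted figure).
TOTAL_TB = 2000                            # ~2 petabytes, expressed in terabytes

tiers = {
    "tier 1 (high-availability)": 0.05,    # quoted: 5 percent
    "tier 3 (long-term)":         0.80,    # quoted: 80 percent
}
tiers["tier 2 (remainder)"] = 1.0 - sum(tiers.values())

for name, share in tiers.items():
    print(f"{name}: {share * TOTAL_TB:.0f} TB")
# tier 1 works out to ~100 TB and tier 3 to ~1,600 TB
```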

While the university's data needs at the moment might not reflect every definition of big data, they're substantial enough to be termed "explosive, expansive data growth" in the school's IT circles. "To me," says Slayton, "they're kind of the same thing."

Tony has been writing about the intersection of technology and business for nearly 20 years and currently freelances from the Albany, Calif., home where he and his wife are raising three boys. A 1988 graduate of the University of Missouri-Columbia School of Journalism and regular contributor to Baseline since 2007, Tony's somewhat infrequent Twitter posts can be found at http://twitter.com/tkontzer.