SAN FRANCISCO — As more content migrates to the cloud and more mobile devices connect to the Internet, data is accumulating far faster than storage capacity is being developed to contain it.

During a panel discussion at IBM’s offices in San Francisco on Wednesday morning, Steve Wojtowecz, vice president of storage software development at IBM, stated that there are going to be over a trillion devices connected to the internet by 2015.

According to a survey of 255 IT professionals, conducted by polling agency Zogby International on behalf of IBM to gauge storage spending priorities and organizational needs, 43 percent of respondents said they are concerned about managing big data.

Here’s the breakdown on how those respondents plan to address their problems:

48 percent plan to increase storage investments in virtualization

26 percent plan to increase investments in the cloud

24 percent plan to increase investments in flash memory and solid state technology

One way IBM plans to meet storage demands is by taking an industry-specific route. So far, IBM has outlined projects focused primarily on healthcare and cosmology; here are summaries of two of them:

A medical archiving solution in partnership with TeraMedica: Powered by IBM systems and storage software, the solution will give patients and caregivers instant access to critical medical data at the point of care. The system can manage up to 10 million medical images while helping health care practitioners provide better care at reduced cost.

A storage system built in collaboration with the Institute for Computational Cosmology at Durham University in the U.K.: Inspired by the technology used in the IBM Watson system, the enhanced storage system will enable up to 50 researchers to work collaboratively and simultaneously. The system will hold, and enable users to manage, up to one petabyte of data across two projects: galaxy formation and the fate of gas outside of galaxies.

Of course, there are financial and sustainability motives as well. For example, at its data centers in Boulder, Colo., IBM has reduced block storage facilities by 50 percent, redirecting that money toward building out the cloud and creating other business opportunities.

At its Rochester, Minn. campus, IBM has installed approximately 240,000 sensors throughout the building on everything from air ducts to toilets. Data is collected from 15 percent of the sensors every 15 minutes, which equates to roughly 20 million data points in a week. Not only does that provide a good deal of information about the facilities that can be used to manage the campus more efficiently, but it also racks up a lot of data that needs to be stored somewhere.
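As a back-of-the-envelope check of those figures (the sensor count, sampling fraction, and polling interval are from the article; the weekly total below is derived, not an official IBM number):

```python
# Rough check of the Rochester sensor arithmetic cited above.
# Inputs are the article's figures; the weekly total is derived.
total_sensors = 240_000
sampled = int(total_sensors * 0.15)   # 15% of sensors polled -> 36,000
samples_per_day = (24 * 60) // 15     # one reading every 15 minutes -> 96
points_per_week = sampled * samples_per_day * 7

print(f"{sampled:,} sensors x {samples_per_day} readings/day x 7 days "
      f"= {points_per_week:,} data points per week")
# Comes to roughly 24 million, in line with the article's
# "roughly 20 million" estimate.
```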

“We need to understand how to use that data in a way that is beneficial to IBM,” Wojtowecz said. “To me this is the start of a smart building.”

Although virtually every industry is going to need to sort out a plan to meet growing storage demands, the entertainment field is one that is particularly eye-catching.

Peter Ward, former senior vice president of information operations and content licensing at Sony Pictures, explained during the panel discussion that shooting movies digitally, and especially in 3D these days, requires storage to be factored into the budget. Many film sets, Ward said, now have data centers in trailers, some of which transfer footage back to Hollywood studios as shooting happens.

To get a better idea of just how much storage an average Hollywood film requires, Ward pointed out that a film, including cutting-room-floor content, can take up a petabyte of space. The average finished project, however, runs about 10 to 20 terabytes.
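Put another way, using Ward's two figures (the ratio below is derived from them, not a number he gave):

```python
# Rough ratio of raw footage to finished film, from the figures Ward cited:
# ~1 PB of raw material versus a 10-20 TB finished project.
raw_tb = 1_000                            # 1 petabyte = 1,000 TB (decimal units)
finished_tb_low, finished_tb_high = 10, 20

ratio_high = raw_tb // finished_tb_low    # 100x if the final cut is 10 TB
ratio_low = raw_tb // finished_tb_high    # 50x if the final cut is 20 TB
print(f"Raw-to-finished storage ratio: roughly {ratio_low}x to {ratio_high}x")
```

In other words, for every terabyte that reaches audiences, somewhere between 50 and 100 terabytes may have been shot and stored along the way.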

Although major studios, and likely many global corporations, have already figured out, or are now figuring out, their storage needs, storage remains a problem for the vast majority of businesses.

“There are lots of tools out there,” said Andrew Reichman, a principal analyst at Forrester Research, but “what’s lacking is where to apply the tools.”

Acknowledging that even many Fortune 500 companies are struggling, Reichman explained that many businesses are sticking with their original approach, which was largely to keep doing the same thing and simply add more storage somewhere. But that doesn’t solve the problem, as data accumulates at exponential rates and older storage systems weren’t built to scale.

There isn’t one right answer, Reichman said, but he added that analytics, and the infrastructure that enables these changes, need to improve faster.
