The Huge Opex Win: Treat Inactive Data Like Inactive Data

For the past ten years, NTP Software has worked closely with NetApp to enhance the control and management of unstructured data across the entire NetApp file data storage environment. We have worked with NetApp storage and data management solutions longer, perhaps, than most people reading this. As part of NetApp’s Preferred Alliance Partner Program, together we have provided comprehensive file data management for the world’s largest organizations, including Fortune 1000s and large government agencies.

All of these organizations are well aware of the challenges posed by ever-increasing amounts of unstructured data. Primary storage data growth extends backup cycles, consumes resources, adds unnecessary management overhead, and can end up costing millions of dollars in extensions and upgrades.

While regulatory and corporate policies certainly require organizations to keep the growing volumes of data their companies produce, the reality is that less than ten percent of this file data is ever accessed by the end user again after it is created. What’s more, in typical environments, the volume of active files – the files people create and use to do their jobs day in and day out – remains more or less fixed. Inactive data, not active data, is responsible for data growth.

Infrequently accessed data generates its own challenges. If you’ve been in business for a few years or more, it’s safe to assume two-thirds of your files are inactive. What’s more, once saved to the network, 80 percent of your files will never even be looked at again. This is a strong incentive to scrap the typical file backup strategy in favor of a tiered approach built on an online archive for inactive files.

Inactive files have different needs than active ones. Inactive files just want to rest. What most people don’t realize is that a file in motion is a dangerous file: it must be tracked, and its changes noted and saved. Active files are being touched by someone, and all of that needs to be recorded: when a file was last changed, who changed it, whether it was copied, whether it was deleted…

Because an inactive file doesn’t have these needs, it carries almost no Opex cost. Compare this to active files, where it’s not unusual for an organization performing a standard weekly full backup, plus nightly incremental backups, to have 15, 20, even 25 copies of file data at any time. One of our customers, a multinational banking and financial services firm, was maintaining 28 copies of each file until we came in to provide relief. All of that is Opex overhead.

The key is controlling and managing how and when files are moved – in other words, determining when a file becomes inactive.
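The policy described above can be expressed very simply in code. The sketch below, written in Python, classifies files as inactive once they have gone unaccessed for a given number of days; the threshold, function name, and 180-day default are illustrative assumptions, not part of any NTP Software or NetApp product, and note that real deployments must account for filesystems mounted with `noatime` or `relatime`, where last-access times are unreliable.

```python
import os
import time

# Illustrative threshold: a file unaccessed for this many days is
# treated as inactive. Real policies would be configurable per tier.
INACTIVE_AFTER_DAYS = 180

def find_inactive_files(root, inactive_after_days=INACTIVE_AFTER_DAYS):
    """Walk a directory tree and return paths whose last-access time
    is older than the threshold, i.e. candidates for the archive tier."""
    cutoff = time.time() - inactive_after_days * 86400
    inactive = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                if os.stat(path).st_atime < cutoff:
                    inactive.append(path)
            except OSError:
                # File may have been moved or deleted mid-scan; skip it.
                continue
    return inactive
```

A scheduled job could feed the resulting list to whatever moves files into the archive tier; the point is simply that "inactive" is a policy decision about file age, not an intrinsic property of the data.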

Online archiving solutions that do this tend to burden networks with repetitive file system scans. The cost, time, and complexity involved in scanning the entire file system and moving data to an archive can be onerous.

We have developed approaches that automatically move inactive files out of the backup cycle and into a secure storage environment. There, they become files at rest: safe, categorized, properly managed, and moved to the right tier. This greatly reduces Opex costs, addresses compliance issues, and meets electronic discovery requirements.

With this method, it’s not unusual to see a two-thirds reduction in the cost of storing and protecting inactive files in the first year alone, and the savings accumulate as the inactive file load grows. One manufacturing customer calculated their first-year savings at 82 percent.

For more information about how NTP Software partners with NetApp to provide comprehensive and seamless file data management, visit http://www.ntpsoftware.com/ to check out blog posts, read whitepapers, and register for a free File Data Assessment.

Additionally, follow NetApp on Twitter, including conversations around the company’s Financial Services Forum, June 25th, 2014, in New York, via #NetAppFinServ.