Case Study: Polysius: Concrete results with NetApp's de-dupe software

NetApp proves its de-dupe credentials with a 47 percent saving in data stored by a customer - and on front-line primary data, at that.

By Chris Mellor, Techworld | Oct 04, 2007


Polysius, a designer and installer of cement plants and equipment, has reduced its overall data by 47 percent since using Network Appliance's data de-duplication facility, A-SIS. That's how much redundancy has been detected and stripped out by A-SIS.

Network Appliance added its advanced single-instance storage (A-SIS) to its storage line-up earlier this year. It looks for duplicate blocks within files and replaces each redundant copy with a pointer to a single stored instance, freeing the space the redundant blocks occupied for fresh data.
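The pointer-replacement mechanism can be sketched roughly as follows. This is a toy illustration only, not NetApp's implementation: A-SIS actually operates on 4KB blocks inside the WAFL file system with its own fingerprinting and verification, and the function names and SHA-256 fingerprint used here are invented for the example.

```python
import hashlib

BLOCK_SIZE = 4096  # assumed block size for this sketch

def deduplicate(data: bytes):
    """Split data into fixed-size blocks, store each unique block once,
    and represent the data as a list of pointers (here, block hashes)."""
    store = {}      # fingerprint -> block contents, stored only once
    pointers = []   # the logical data becomes a sequence of pointers
    for i in range(0, len(data), BLOCK_SIZE):
        block = data[i:i + BLOCK_SIZE]
        digest = hashlib.sha256(block).hexdigest()
        if digest not in store:
            store[digest] = block   # first time we see this block: keep it
        pointers.append(digest)     # duplicates become pointers, not copies
    return store, pointers

def reconstruct(store, pointers):
    """Rebuild the original data by following the pointers."""
    return b"".join(store[p] for p in pointers)

# Duplicate-heavy data: ten identical blocks plus one unique block.
data = (b"A" * BLOCK_SIZE) * 10 + b"B" * BLOCK_SIZE
store, pointers = deduplicate(data)
assert reconstruct(store, pointers) == data
print(len(pointers), "blocks referenced,", len(store), "actually stored")
# prints: 11 blocks referenced, 2 actually stored
```

Eleven logical blocks collapse to two physical ones, which is the effect Polysius saw at scale on its replicated CIFS shares.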

Polysius has been in the USA for 25 years and is an associated company of Polysius AG in Neubeckum, Germany, which has a 140-year history of involvement with cement manufacturing plants.

Polysius's manager of IT, James Krochmal, needed to stretch existing resources - money, management, and time - while simplifying the management of the Polysius storage infrastructure. He was aware that much data was copied several times, saying: "Polysius was duplicating data like crazy as a result of the fact that we consistently replicate all of our CIFS file shares. These countless copies of data files were consuming vast amounts of our storage. NetApp de-duplication technology enabled us to more efficiently store our data and de-duplicate all of our primary user data with absolutely no performance hit."

Earlier this year, in May, NetApp became the first major storage vendor to offer customers the cost benefits of de-duplication across a wide variety of environments, including backup, archival, and primary data sets and applications as diverse as home directories and genomic data.

The technology provides customers with the ability to reduce capital expenditure and management costs by significantly reducing the amount of storage they need to purchase and manage. By eliminating duplicate data and freeing up storage capacity, enterprises can also reduce the need to add more physical storage, lowering the space, cooling, and power needed in their datacentre.

Krochmal confirmed this: "The reduction in the quantity of our physical storage translated into tremendous cost savings in our datacentre. NetApp de-duplication technology freed up budget that was going to be spent on more spindles that will now be spent on new technologies that will move our business - and our customers' businesses - forward."

It has been supposed that de-duplication was only suitable for backup data held in disk-based backup arrays or virtual tape libraries. There were two reasons for this. Firstly, repeated backups generally contain much repeated information; by stripping that out and storing unique data patterns only once, tremendous savings in storage can be made.

Secondly, de-duplication consumes processing power and is generally carried out after a backup to disk has taken place, so as not to lengthen the backup window. This user story indicates that de-duplication can be applied to data on tier-one drive arrays without affecting general system responsiveness.

NetApp was fairly late in bringing de-duplication products to the attention of its customers and hasn't made the seemingly exaggerated claims of de-dupe ratios of 20:1, 30:1, even 50:1 or more that other vendors have promulgated. The idea that a 20TB drive array could effectively hold 600TB of data at a 30:1 de-dupe ratio seems far-fetched.

Yet here Polysius has freed up almost half of its disk storage estate through using de-duplication on its primary storage, placing NetApp at the forefront of companies with proven de-duplication technology.
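It is worth noting how a percentage saving relates to the de-dupe ratios vendors quote: if a fraction s of the data is removed, the equivalent ratio is 1/(1 - s), and conversely a ratio r saves a fraction 1 - 1/r. A quick sketch (the helper names are invented for illustration):

```python
def savings_to_ratio(savings_fraction: float) -> float:
    """A fractional space saving s is equivalent to a de-dupe ratio of 1/(1-s)."""
    return 1.0 / (1.0 - savings_fraction)

def ratio_to_savings(ratio: float) -> float:
    """Inverse: a de-dupe ratio r saves a fraction 1 - 1/r of the space."""
    return 1.0 - 1.0 / ratio

print(round(savings_to_ratio(0.47), 2))   # Polysius's 47% saving ~ 1.89:1
print(round(ratio_to_savings(30.0), 3))   # a claimed 30:1 ratio implies a 96.7% saving
```

Seen this way, Polysius's 47 percent on primary data is a modest-sounding 1.9:1 ratio, which makes it far more believable than the 20:1-and-up figures quoted for highly repetitive backup streams.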