Data Sanity Practices

It is a popular adage that companies storing their customers' data are sitting on gold mines. Business is growing at a fast pace, and with the advent of data-analysis practices and tools, the strategic and reporting work of organizations has changed immensely over the past two decades. Companies across the globe invest heavily in data-preservation software to harness the full potential of their data. However, it is not only the quantity of data that matters; quality plays an equally big role. Based on our experience working with dozens of companies across all major geographies, we present some insights into sane data-storage practices that save a great deal of work at the analysis stage:

MS Excel is the easiest and still the most popular tool for storing data. Being widely understood, and offering easy-to-use features (e.g. pivot tables), it remains one of the best-liked tools in the data industry.

Most standard analysis tools can readily accept CSV files as input.
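As a minimal sketch of why CSV is so friction-free, the snippet below parses a small in-memory sample (the customer data shown is hypothetical) using only Python's standard library:

```python
import csv
import io

# Hypothetical customer data in CSV form; in practice this would be a file.
raw = """customer_id,name,region
C001,Asha,APAC
C002,Ben,EMEA
"""

# DictReader turns each row into a dict keyed by the header columns,
# so the data is immediately usable without any custom parsing code.
rows = list(csv.DictReader(io.StringIO(raw)))
print(rows[0]["name"])  # fields are accessed by column name
```

The same file would load just as directly into spreadsheet software or a database import tool, which is what makes CSV a safe interchange format.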

When storing customer-level information, using unique identifiers helps greatly in the long run by saving data-cleaning effort.
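A short sketch of the payoff: with a stable key such as a `customer_id` (the field name and data here are hypothetical), records from two systems can be joined with a simple lookup instead of error-prone fuzzy matching on names:

```python
# Two datasets that share the same stable identifier.
orders = {"C001": 3, "C002": 1}             # order counts keyed by customer_id
profiles = {"C001": "Asha", "C002": "Ben"}  # profile data, same key

# Because both sides use the identifier, the join is a dictionary lookup;
# no cleaning or name-matching step is needed.
report = {cid: (profiles[cid], count) for cid, count in orders.items()}
print(report)
```

Without a shared identifier, the same join would require matching free-text names, which is exactly the cleaning work the identifier avoids.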

Keeping a backup of data files (on DVDs or in the cloud) every six months goes a long way toward preserving valuable information.

There comes a stage when the volume of data grows large; switching over to an RDBMS (Oracle, SAP, or others) is then a good idea, as these systems have built-in mechanisms to enforce data quality and build reports.
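To illustrate the built-in data-quality mechanisms, here is a minimal sketch using SQLite as a stand-in for any RDBMS (the table and columns are hypothetical): the engine itself rejects duplicate keys and missing values, so bad records never reach the analysis stage.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""
    CREATE TABLE customers (
        customer_id TEXT PRIMARY KEY,  -- uniqueness enforced by the engine
        name        TEXT NOT NULL      -- rows with missing names are rejected
    )
""")
con.execute("INSERT INTO customers VALUES ('C001', 'Asha')")

# A second row with the same key violates the PRIMARY KEY constraint,
# and the database refuses it rather than silently storing a duplicate.
try:
    con.execute("INSERT INTO customers VALUES ('C001', 'Ben')")
except sqlite3.IntegrityError as e:
    print("rejected:", e)
```

In a spreadsheet or plain text file, the duplicate row would simply sit there until someone found it during cleaning; in an RDBMS, the constraint catches it at write time.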

But with these advantages comes a heavy price for management and for building analyses. One of our customers, a million-dollar enterprise, still manages its data excellently without ERP software, while there are industries that cannot imagine working on plain text files. It is best to judge what the business requires and use the data systems that suit your needs the most.