Data Compression

The term data compression refers to reducing the number of bits of information that must be stored or transmitted. Compression can be lossless or lossy: lossless compression removes only redundant data, so when the data is decompressed the information and its quality are exactly the same, while lossy compression discards data judged unneeded, so the quality after decompression is lower. Different compression algorithms work better for different types of data. Compressing and decompressing data generally takes a lot of processing time, so the server performing the operation must have ample resources to process your data quickly enough. A simple example of how information can be compressed is run-length encoding: instead of storing the individual 1s and 0s of a binary sequence, you store how many consecutive positions contain a 1 and how many contain a 0.
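The run-length idea described above can be sketched in a few lines of Python. This is an illustrative toy, not the scheme any real compressor uses verbatim; the function names are made up for the example:

```python
from itertools import groupby

def rle_encode(bits: str) -> list[tuple[str, int]]:
    # Store each symbol once together with the length of its run,
    # instead of storing every repeated 1 and 0 individually.
    return [(sym, sum(1 for _ in group)) for sym, group in groupby(bits)]

def rle_decode(runs: list[tuple[str, int]]) -> str:
    # Lossless: expanding the runs restores the original bits exactly.
    return "".join(sym * count for sym, count in runs)
```

So `"11100001"` becomes `[("1", 3), ("0", 4), ("1", 1)]`, and decoding that list yields the original string back, which is what makes the scheme lossless.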

The ZFS file system used on our cloud Internet hosting platform employs a compression algorithm called LZ4. LZ4 is substantially faster than most comparable algorithms, particularly for compressing and decompressing non-binary data such as web content. LZ4 can even decompress data faster than it can be read from a hard disk drive, which improves the overall performance of websites hosted on ZFS-based platforms. Because the algorithm compresses data well and does so quickly, we are able to generate several backup copies of all the content in the cloud hosting accounts on our servers every day. Both your content and its backups take up less space, and since both ZFS and LZ4 are very fast, backup generation does not affect the performance of the web servers where your content is stored.
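The point that web content compresses well and comes back byte-for-byte identical can be demonstrated with a short sketch. LZ4 is not in Python's standard library, so this example uses the standard `zlib` module as a stand-in to illustrate the same principle; the sample HTML is invented for the demonstration:

```python
import zlib

# Repetitive markup, typical of web content, compresses very well.
html = b"<html><body>" + b"<p>Hello, world!</p>" * 100 + b"</body></html>"

compressed = zlib.compress(html)
restored = zlib.decompress(compressed)

# Lossless: the original bytes are restored exactly,
# and the compressed form needs far fewer bytes to store or transmit.
assert restored == html
assert len(compressed) < len(html)
```

The same trade-off drives the backup scheme described above: because the compressed copies are much smaller and the algorithm is fast, keeping several daily copies is cheap in both disk space and CPU time.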