Data Compression

The term data compression refers to reducing the number of bits of information that has to be stored or transmitted. Compression can be lossless or lossy: a lossless algorithm removes only redundant data, so when the data is uncompressed afterwards it is identical to the original, while a lossy algorithm also discards less important information, so some quality is lost permanently. Different compression algorithms are more efficient for different types of data. Compressing and uncompressing data takes processing time, so the server executing the operation needs sufficient resources to process your data quickly enough. A simple example of how information can be compressed is run-length encoding: instead of storing each individual 1 and 0 of the binary code, you store how many consecutive positions hold a 1 and how many hold a 0.
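As a rough illustration of the run-length idea described above, here is a minimal sketch in Python. The function names `rle_encode` and `rle_decode` are our own for this example, not part of any particular library:

```python
def rle_encode(bits: str) -> list[tuple[str, int]]:
    """Turn a bit string into (symbol, run length) pairs."""
    runs: list[tuple[str, int]] = []
    for bit in bits:
        if runs and runs[-1][0] == bit:
            # Same symbol as the previous position: extend the current run.
            runs[-1] = (bit, runs[-1][1] + 1)
        else:
            # Symbol changed: start a new run of length 1.
            runs.append((bit, 1))
    return runs

def rle_decode(runs: list[tuple[str, int]]) -> str:
    """Expand (symbol, run length) pairs back into the original bit string."""
    return "".join(bit * count for bit, count in runs)

encoded = rle_encode("0000011111111000")
print(encoded)                                      # [('0', 5), ('1', 8), ('0', 3)]
print(rle_decode(encoded) == "0000011111111000")    # True
```

Because decoding reproduces the input exactly, this is a lossless scheme: it pays off whenever the data contains long runs of repeated symbols.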

The ZFS file system that runs on our cloud hosting platform uses a compression algorithm called LZ4. It is considerably faster than most widely used algorithms, particularly when compressing and uncompressing text-based data such as web content. LZ4 can even uncompress data faster than it can be read from a hard drive, which improves the performance of sites hosted on ZFS-based platforms. Because the algorithm compresses data well and does so quickly, we are able to generate several backup copies of all the content stored in the cloud web hosting accounts on our servers every day. Both your content and its backups take up less space, and since both ZFS and LZ4 work very fast, generating the backups does not affect the performance of the hosting servers where your content is stored.
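The point about web content compressing well can be demonstrated in a few lines. LZ4 itself ships as a third-party Python package rather than in the standard library, so this sketch uses the stdlib `zlib` module as a stand-in lossless compressor; the ratio and speed differ from LZ4, but the behavior it illustrates is the same:

```python
import zlib

# Repetitive, text-based markup of the kind a web page contains.
html = b"<li>item</li>" * 1000

compressed = zlib.compress(html)
restored = zlib.decompress(compressed)

# Lossless: the round trip reproduces the original exactly.
assert restored == html

# The compressed form needs far fewer bytes than the original.
print(len(html), "->", len(compressed))
```

Since decompression is exact, a server can store only the compressed form (as ZFS does transparently) and still serve the identical content back to visitors.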