Data Compression

Data compression is the reduction of the number of bits needed to store or transmit information. Compressed content takes up less disk space than the original, so much more of it can fit in the same amount of storage. Different compression algorithms work in different ways: some remove only redundant bits, so when the data is uncompressed there is no loss of quality (lossless compression), while others discard bits judged unnecessary, so uncompressing the data afterwards yields lower quality than the original (lossy compression). Compressing and uncompressing content consumes a significant amount of system resources, particularly CPU time, so any web hosting platform that compresses data in real time must have enough processing power to support the feature. A simple example of how data can be compressed is run-length encoding: a sequence such as 111111 is replaced with 6x1, i.e. the number of consecutive 1s or 0s is recorded instead of the entire sequence.
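The 111111 → 6x1 idea above can be sketched as a tiny run-length encoder. This is an illustrative example only (the function names and the pair-based representation are choices made here, not part of any particular file system's algorithm):

```python
def rle_encode(bits: str) -> list[tuple[int, str]]:
    """Encode a string of symbols as (count, symbol) pairs."""
    runs: list[tuple[int, str]] = []
    for b in bits:
        if runs and runs[-1][1] == b:
            # Extend the current run instead of storing the symbol again.
            runs[-1] = (runs[-1][0] + 1, b)
        else:
            runs.append((1, b))
    return runs

def rle_decode(runs: list[tuple[int, str]]) -> str:
    """Rebuild the original string from (count, symbol) pairs."""
    return "".join(symbol * count for count, symbol in runs)

print(rle_encode("1111110000"))  # [(6, '1'), (4, '0')]
print(rle_decode([(6, "1"), (4, "0")]))  # 1111110000
```

Because the decoder reproduces the input exactly, this is a lossless scheme: nothing is discarded, only the redundant repetition is stored more compactly.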

The ZFS file system that runs on our cloud hosting platform uses a compression algorithm called LZ4. It can improve the performance of any website hosted in a shared website hosting account with us: not only does it compress data more efficiently than the algorithms used by other file systems, it also uncompresses data faster than a hard drive can read it. This comes at the cost of considerable CPU time, which is not a problem for our platform because it runs on clusters of powerful servers working together. A further advantage of LZ4 is that it lets us create backups much faster and with less disk space, so we keep several daily backups of your files and databases, and generating them does not affect the performance of the servers. That way, we can always restore content that you may have deleted by mistake.
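The lossless compress/decompress round trip that ZFS performs transparently can be demonstrated in a few lines. LZ4 bindings for Python require a third-party package, so as a stand-in this sketch uses the standard-library zlib module, which exposes the same round-trip idea (the sample data here is made up; real savings depend on how repetitive the content is):

```python
import zlib

# Hypothetical sample: repetitive content, which compresses well.
original = b"website content " * 1000

compressed = zlib.compress(original)
restored = zlib.decompress(compressed)

# Lossless: the round trip reproduces the data exactly.
assert restored == original
print(f"{len(original)} bytes -> {len(compressed)} bytes")
```

On a platform like the one described above, this compress-on-write and decompress-on-read cycle happens inside the file system, so applications and website scripts never see the compressed form.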