Data Compression

Data compression is the encoding of data using fewer bits than the original representation requires. Compressed content takes up considerably less disk space, so far more of it can be stored in the same amount of space. Different compression algorithms work in different ways. Many are lossless: they remove only redundant bits, so when the data is uncompressed there is no loss of quality. Others are lossy: they discard bits that cannot be recovered, so uncompressing the data later yields lower quality than the original. Compressing and uncompressing content consumes a significant amount of system resources, in particular CPU time, so any web hosting platform that applies compression in real time must have enough processing power to support the feature. A simple example of compression is substituting a binary sequence such as 111111 with 6x1, i.e. "remembering" how many consecutive 1s or 0s there are instead of storing the sequence itself.
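The 111111 → 6x1 substitution described above is known as run-length encoding. A minimal sketch in Python (the function names and the `6x1` pair format are illustrative, not part of any particular product):

```python
def rle_encode(bits: str) -> str:
    """Collapse runs of identical bits into count-x-bit pairs, e.g. '111111' -> '6x1'."""
    if not bits:
        return ""
    pairs = []
    run_char, run_len = bits[0], 1
    for ch in bits[1:]:
        if ch == run_char:
            run_len += 1
        else:
            pairs.append(f"{run_len}x{run_char}")
            run_char, run_len = ch, 1
    pairs.append(f"{run_len}x{run_char}")
    return ",".join(pairs)

def rle_decode(encoded: str) -> str:
    """Reverse the encoding: expand each count-x-bit pair back into a run."""
    return "".join(ch * int(n)
                   for n, ch in (pair.split("x") for pair in encoded.split(",")))
```

Because the decoder reconstructs the exact original sequence, this is a lossless scheme: nothing is thrown away, only the redundancy of repeated bits is exploited.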

Data Compression in Cloud Hosting

The compression algorithm used by the ZFS file system that runs on our cloud hosting platform is called LZ4. It can boost the performance of any site hosted in a cloud hosting account with us: not only does it compress data more efficiently than the algorithms used by other file systems, it also decompresses data faster than a hard drive can read it. This comes at the cost of considerable CPU time, which is not a problem for our platform, as it runs on clusters of powerful servers working together. Another advantage of LZ4 is that it allows us to create backups faster and with less disk space, so we can keep several daily backups of your files and databases without their generation affecting server performance. That way, we can always restore any content you may have deleted by accident.
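The compress-on-write, decompress-on-read cycle described above can be sketched in a few lines. LZ4 itself is not in the Python standard library (it ships as the third-party `lz4` package), so this sketch uses the standard-library zlib compressor as a stand-in; the trade-off it illustrates is the same one LZ4 makes on ZFS: spend CPU time compressing before storing, and get the exact original bytes back on read.

```python
import zlib

# Stand-in for an LZ4-style lossless codec: zlib from the standard library.
# Repetitive content (typical of text, HTML, logs) compresses especially well.
original = b"The quick brown fox jumps over the lazy dog. " * 200

compressed = zlib.compress(original, level=6)  # write path: compress before storing
restored = zlib.decompress(compressed)         # read path: decompress on access

# Lossless: the restored data is byte-for-byte identical to the original,
# while the stored form occupies far less space.
assert restored == original
print(f"{len(original)} bytes stored as {len(compressed)} bytes")
```

On a ZFS system itself, this behavior is transparent to applications: the administrator enables it per dataset, typically with `zfs set compression=lz4 <pool>/<dataset>`, and the file system compresses and decompresses blocks automatically.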