Data Compression

Data compression is the encoding of information using fewer bits than the original representation, so that it takes up less storage space or transmission bandwidth. Compressed data therefore occupies less disk space than the original, leaving room for additional content in the same amount of space. Compression algorithms work in different ways: some remove only redundant bits, so decompressing the data restores it with no loss of quality (lossless compression), while others discard bits deemed unneeded, so the decompressed data is of lower quality than the original (lossy compression). Compressing and decompressing content consumes considerable system resources, particularly CPU time, so any web hosting platform that employs compression in real time should have enough processing power to support this feature. A simple example of how data can be compressed is to substitute a binary sequence such as 111111 with 6x1, i.e. "remembering" how many consecutive 1s or 0s there are instead of storing the entire sequence.
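The run-length idea described above (storing a count instead of repeated symbols) can be sketched in a few lines of Python; the function names are illustrative, not part of any particular library:

```python
def rle_encode(bits: str) -> list[tuple[str, int]]:
    """Collapse runs of repeated symbols into (symbol, count) pairs."""
    runs = []
    for bit in bits:
        if runs and runs[-1][0] == bit:
            # Same symbol as the previous run: just bump the count.
            runs[-1] = (bit, runs[-1][1] + 1)
        else:
            # New symbol: start a new run of length 1.
            runs.append((bit, 1))
    return runs


def rle_decode(runs: list[tuple[str, int]]) -> str:
    """Expand (symbol, count) pairs back into the original string."""
    return "".join(symbol * count for symbol, count in runs)


encoded = rle_encode("111111")
print(encoded)  # [('1', 6)] -- "six ones" instead of six separate bits
print(rle_decode(encoded))  # 111111
```

Because no bits are discarded, decoding reproduces the input exactly; this is the lossless variety of compression mentioned above.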

Data Compression in Cloud Web Hosting

The ZFS file system that runs on our cloud hosting platform employs a compression algorithm known as LZ4. LZ4 is notable for its speed, especially when compressing and decompressing non-binary data such as web content. In fact, LZ4 decompresses data faster than it can be read from a hard disk drive, which improves the overall performance of sites hosted on ZFS-based platforms. Because the algorithm compresses data well and does so quickly, we can generate several backups per day of all the content stored in the cloud web hosting accounts on our servers. Both your content and its backups take up less space, and since both ZFS and LZ4 work very fast, backup generation does not affect the performance of the web hosting servers where your content is kept.
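For reference, LZ4 compression on ZFS is a per-dataset property set with the standard `zfs` administration tool. A minimal sketch, assuming a pool named `tank` with a dataset `web` (both names hypothetical):

```shell
# Enable LZ4 compression on a dataset (pool/dataset names are hypothetical)
zfs set compression=lz4 tank/web

# Verify the setting
zfs get compression tank/web

# Inspect the compression ratio ZFS is actually achieving on stored data
zfs get compressratio tank/web
```

Compression applies transparently to newly written blocks, so applications reading and writing files on the dataset need no changes.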

Data Compression in Semi-dedicated Hosting

The semi-dedicated hosting plans which we offer are built on a powerful cloud platform that runs on the ZFS file system. ZFS uses a compression algorithm called LZ4, which outperforms other widely used algorithms in both speed and compression ratio when it comes to processing web content. This is especially true for decompression: LZ4 decompresses data faster than uncompressed data can be read from a hard drive, so sites running on a platform where LZ4 is present will load quicker. Although real-time compression requires quite a lot of CPU processing time, we can take advantage of the feature because our platform uses many powerful servers working together, rather than creating all accounts on a single machine as most companies do. There is another benefit of using LZ4: since it compresses data very well and does so extremely fast, we can also make multiple daily backup copies of all accounts without affecting the performance of the servers, and keep them for a whole month. This way, you will always be able to recover any content that you delete by mistake.
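As a rough illustration of why text-based web content compresses so well, the sketch below measures a compression ratio using Python's standard-library zlib (used here only as a stand-in, since LZ4 is not in the standard library); the sample HTML snippet is made up:

```python
import zlib

# Repetitive, text-based content (typical of HTML/CSS markup) compresses well.
html = ("<div class='post'><p>Hello, world!</p></div>\n" * 200).encode()

compressed = zlib.compress(html)
ratio = len(html) / len(compressed)

print(f"original: {len(html)} bytes, compressed: {len(compressed)} bytes")
print(f"ratio: {ratio:.1f}x")

# Decompression restores the content exactly -- the compression is lossless.
assert zlib.decompress(compressed) == html
```

The exact ratio depends on the algorithm and the data, but the general point stands: markup-heavy web content shrinks substantially, which is what makes frequent full backups affordable in terms of disk space.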