Data compression is the reduction of the number of bits needed to store or transmit data. Compressed data requires less disk space than the original, so much more content can be stored in the same amount of space. There are different compression algorithms that work in different ways: with some of them, only redundant bits are removed, so once the data is uncompressed, there is no loss of quality (lossless compression). Others discard bits deemed less important, so uncompressing the data later results in lower quality compared with the original (lossy compression). Compressing and uncompressing content requires a considerable amount of system resources, in particular CPU processing time, so any hosting platform that uses compression in real time must have enough processing power to support this feature. A simple example of how information can be compressed is to substitute a binary sequence such as 111111 with 6x1, i.e. "remembering" how many consecutive 1s or 0s there are instead of storing the entire sequence.
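
The 6x1 substitution described above is essentially run-length encoding. As an illustration only, here is a minimal Python sketch of that idea; the function names rle_compress and rle_decompress are hypothetical and not part of any particular platform:

    from itertools import groupby

    def rle_compress(bits: str) -> list[tuple[int, str]]:
        # Collapse each run of repeated characters into a (count, character) pair.
        return [(len(list(group)), char) for char, group in groupby(bits)]

    def rle_decompress(pairs: list[tuple[int, str]]) -> str:
        # Expand the (count, character) pairs back into the original string.
        return "".join(char * count for count, char in pairs)

    original = "111111000011"
    compressed = rle_compress(original)   # [(6, '1'), (4, '0'), (2, '1')]
    assert rle_decompress(compressed) == original

Run-length encoding is lossless: the pairs contain everything needed to rebuild the input exactly, which is why the assertion at the end holds.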

Data Compression in Shared Website Hosting

The compression algorithm employed by the ZFS file system that runs on our cloud web hosting platform is called LZ4. It can boost the performance of any website hosted in a shared website hosting account with us, as it not only compresses data more efficiently than the algorithms used by other file systems, but also uncompresses data at speeds higher than the read speeds of a hard disk drive. This is achieved at the cost of a lot of CPU processing time, which is not a problem for our platform, since it uses clusters of powerful servers working together. An additional advantage of LZ4 is that it enables us to generate backups faster and store them on less disk space, so we can keep several daily backups of your files and databases without affecting the performance of the servers. That way, we can always restore any content that you may have deleted by mistake.
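
For illustration, the LZ4 round trip can be tried out with the third-party lz4 package for Python (an assumption here: it must be installed separately, e.g. with pip install lz4). This sketch only demonstrates compression and decompression of repetitive data and is not tied to any specific hosting platform:

    import lz4.frame  # third-party package: pip install lz4

    # Highly repetitive data compresses very well with LZ4.
    original = b"abcd" * 100_000

    compressed = lz4.frame.compress(original)
    restored = lz4.frame.decompress(compressed)

    assert restored == original
    print(f"original: {len(original)} bytes, compressed: {len(compressed)} bytes")

Because decompression is so cheap, reading LZ4-compressed data from disk and unpacking it in memory can be faster than reading the same data uncompressed, which is the effect described above.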