Data compression is the encoding of information using fewer bits than the original representation requires for storage or transmission. As a result, compressed data takes up less disk space than the original, so more content can be kept in the same amount of space. You'll find various compression algorithms that work in different ways: with some of them, only redundant bits are removed, so once the data is uncompressed, there is no loss of quality; others discard bits considered expendable, and uncompressing the data afterwards yields lower quality than the original. Compressing and uncompressing content requires a considerable amount of system resources, in particular CPU processing time, so any Internet hosting platform that uses compression in real time must have enough processing power to support this feature. A simple example of how data can be compressed is to replace a binary sequence such as 111111 with 6x1, i.e. to record how many consecutive 1s or 0s there are instead of storing the sequence itself, as the sketch below illustrates.
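
To make the 6x1 example concrete, here is a minimal run-length encoding sketch in Python. The helper names rle_encode and rle_decode are hypothetical, chosen for illustration; real compressors are considerably more elaborate, but the round trip shows how a lossless scheme restores the exact original bits.

    def rle_encode(bits: str) -> str:
        """Collapse runs of repeated characters, e.g. '111111' -> '6x1'."""
        encoded = []
        i = 0
        while i < len(bits):
            run_char = bits[i]
            run_len = 1
            while i + run_len < len(bits) and bits[i + run_len] == run_char:
                run_len += 1
            encoded.append(f"{run_len}x{run_char}")
            i += run_len
        return ",".join(encoded)

    def rle_decode(encoded: str) -> str:
        """Reverse the encoding, restoring the exact original bits (lossless)."""
        return "".join(char * int(count) for count, char in
                       (token.split("x") for token in encoded.split(",")))

    original = "111111000011"
    packed = rle_encode(original)          # '6x1,4x0,2x1'
    assert rle_decode(packed) == original  # no loss of quality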

Data Compression in Shared Hosting

The compression algorithm used by the ZFS file system that runs on our cloud web hosting platform is called LZ4. It can improve the performance of any website hosted in a shared hosting account on our end, since not only does it compress data more effectively than the algorithms employed by alternative file systems, but it also uncompresses data faster than a hard drive can read it. This comes at the cost of a great deal of CPU processing time, which is not a problem for our platform, as it consists of clusters of powerful servers working together. A further advantage of LZ4 is that it enables us to create backups faster and with less disk space, so we can keep multiple daily backups of your databases and files without their generation affecting the performance of the servers. That way, we can always restore any content you may have deleted by accident.
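
As a small illustration of the lossless round trip LZ4 performs, the sketch below uses the third-party python-lz4 package (an assumption for demonstration purposes; it is not part of our platform and must be installed separately, e.g. with pip install lz4):

    import lz4.frame

    # Repetitive data compresses well, much like typical site files and database dumps.
    data = b"<html><body>Hello, world!</body></html>" * 1000

    compressed = lz4.frame.compress(data)
    print(f"original: {len(data)} bytes, compressed: {len(compressed)} bytes")

    # Decompression is lossless: the exact original bytes come back.
    restored = lz4.frame.decompress(compressed)
    assert restored == data

Because LZ4 decompression is so fast, reading compressed data and unpacking it in memory can finish sooner than reading the uncompressed data from disk, which is the effect described above.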