The term data compression refers to reducing the number of bits needed to store or transmit data. Compression can be lossless or lossy: in the first case only redundant data is removed, so when the data is subsequently uncompressed, the information and its quality are identical to the original; in the second case unneeded data is discarded as well, so the quality of the uncompressed result is lower. Different compression algorithms are more efficient for different kinds of content. Compressing and uncompressing data usually takes considerable processing time, so the server performing the operation needs sufficient resources to handle the data quickly enough. A simple example of how information can be compressed is run-length encoding of binary data: instead of storing the individual 1s and 0s, you store how many sequential positions hold a 1 and how many hold a 0.
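To make the run-length idea concrete, here is a minimal sketch in Python; the function names (rle_encode, rle_decode) are illustrative only and are not part of any particular compression library.

def rle_encode(bits):
    """Encode a string of '0'/'1' characters as (bit, run_length) pairs."""
    runs = []
    i = 0
    while i < len(bits):
        j = i
        # Advance j to the end of the current run of identical bits.
        while j < len(bits) and bits[j] == bits[i]:
            j += 1
        runs.append((bits[i], j - i))
        i = j
    return runs

def rle_decode(runs):
    """Reverse the encoding: expand each (bit, run_length) pair."""
    return "".join(bit * length for bit, length in runs)

data = "1111100000000111"
encoded = rle_encode(data)        # [('1', 5), ('0', 8), ('1', 3)]
assert rle_decode(encoded) == data

Because decoding restores the original bit string exactly, this is an example of lossless compression: nothing but the redundant repetition is removed.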
Data Compression in Shared Hosting
The ZFS file system that runs on our cloud web hosting platform uses a compression algorithm called LZ4. LZ4 is considerably faster than most alternative algorithms, particularly at compressing and uncompressing non-binary data, i.e. web content. It even uncompresses data faster than the data can be read from a hard disk, which improves the performance of sites hosted on ZFS-based platforms. Because the algorithm compresses data well and does so very quickly, we can generate several backups of all the content kept in the shared hosting accounts on our servers every day. Both your content and its backups take up less space, and since both ZFS and LZ4 work extremely fast, the backup generation does not affect the performance of the web servers where your content is stored.
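As a rough illustration of how LZ4 handles text-like content, the sketch below uses the third-party Python lz4 package (installable with pip install lz4); it compresses a repetitive HTML-like byte string and verifies the lossless round trip. It is a standalone demonstration of the algorithm, not of our platform's internals.

import lz4.frame

# Repetitive, text-like content such as HTML compresses well with LZ4.
page = b"<div class='item'>Hello, world!</div>\n" * 500

compressed = lz4.frame.compress(page)
print(len(page), "bytes ->", len(compressed), "bytes")

# Decompression is lossless: the original bytes come back exactly.
assert lz4.frame.decompress(compressed) == page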