The term data compression refers to reducing the number of bits of information that has to be stored or transmitted. This can be done with or without loss of data: what is removed during compression is either redundant data or data that is not essential. When the data is later uncompressed, in the first case the information and its quality will be identical to the original, while in the second case the quality will be lower. Different compression algorithms are more effective for different types of data. Compressing and uncompressing data often takes considerable processing time, so the server performing the operation should have sufficient resources to process your data quickly enough. One simple example of how information can be compressed is to store how many consecutive positions in the binary code should contain 1 and how many should contain 0, instead of storing the actual 1s and 0s.
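The counting technique described above is known as run-length encoding. A minimal sketch in Python, assuming the input is a string of '0' and '1' characters (the function names here are illustrative, not part of any particular library):

```python
# Run-length encoding sketch: store (bit, run length) pairs
# instead of the raw sequence of 1s and 0s.
from itertools import groupby

def rle_encode(bits: str) -> list[tuple[str, int]]:
    """Record each bit value together with how many times it repeats."""
    return [(bit, len(list(run))) for bit, run in groupby(bits)]

def rle_decode(runs: list[tuple[str, int]]) -> str:
    """Expand the (bit, count) pairs back into the original string."""
    return "".join(bit * count for bit, count in runs)

encoded = rle_encode("0000011111111100")
print(encoded)  # [('0', 5), ('1', 9), ('0', 2)]
print(rle_decode(encoded) == "0000011111111100")  # True
```

Note that this pays off only when the data contains long runs; for data that alternates frequently, storing the counts can take more space than the original bits, which is why different algorithms suit different types of data.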

Data Compression in Cloud Hosting

The compression algorithm used by the ZFS file system that runs on our cloud hosting platform is called LZ4. It can boost the performance of any site hosted in a cloud hosting account on our end, as it not only compresses information more effectively than the algorithms used by other file systems, but also uncompresses data at speeds higher than the read speeds of a hard disk drive. The trade-off is heavier CPU usage, which is not a problem for our platform, as it uses clusters of powerful servers working together. A further advantage of LZ4 is that it allows us to create backups faster and on less disk space, so we can keep multiple daily backups of your files and databases, and their generation does not affect the performance of the servers. That way, we can always restore any content that you may have deleted by mistake.
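For reference, this is how LZ4 compression is typically enabled and inspected on a ZFS dataset. This is a general sketch of the standard ZFS administration commands, not the provider's actual configuration; the dataset name tank/web is purely illustrative:

```shell
# Enable LZ4 compression on a dataset (applies to newly written data).
zfs set compression=lz4 tank/web

# Confirm the property is active.
zfs get compression tank/web

# Inspect how much space compression is actually saving.
zfs get compressratio tank/web
```

Because compression is a per-dataset property, it can be switched on without reformatting or migrating existing data.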