The term data compression refers to reducing the number of bits that need to be stored or transmitted. Compression can be lossless or lossy: in the first case only redundant data is removed, so when the data is later uncompressed the information and the quality are identical to the original, whereas in the second case unneeded data is discarded and some quality is lost permanently. Different compression algorithms are suited to different kinds of data. Because compressing and uncompressing data takes processing time, the server performing the operation must have adequate resources to process the information quickly enough. A simple example of how information can be compressed is run-length encoding: instead of storing every individual 1 and 0 of a binary sequence, you store how many consecutive positions hold a 1 and how many hold a 0.
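As a rough illustration, here is a minimal run-length encoding sketch in Python; the function names are invented for this example and this is not the exact scheme any particular compressor uses:

    def rle_encode(bits):
        # Collapse runs of identical characters into (bit, count) pairs.
        runs = []
        count = 1
        for prev, cur in zip(bits, bits[1:]):
            if cur == prev:
                count += 1
            else:
                runs.append((prev, count))
                count = 1
        if bits:
            runs.append((bits[-1], count))
        return runs

    def rle_decode(runs):
        # Expand the (bit, count) pairs back into the original string.
        return "".join(bit * count for bit, count in runs)

For instance, rle_encode("111110001") returns [('1', 5), ('0', 3), ('1', 1)], which takes fewer symbols to record than the nine original digits, and rle_decode restores the exact input, so no information is lost.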

Data Compression in Shared Website Hosting

The ZFS file system which runs on our cloud Internet hosting platform uses a compression algorithm named LZ4. It is considerably faster than most widely used algorithms, especially at compressing and uncompressing non-binary data such as web content. LZ4 can even uncompress data faster than the data can be read from a hard disk, which improves the overall performance of websites hosted on ZFS-based platforms. Because the algorithm compresses data well and does so very quickly, we are able to generate several backups of all the content kept in the shared website hosting accounts on our servers every day. Both your content and its backups take up less space, and since both ZFS and LZ4 work very fast, generating the backups does not affect the performance of the servers where your content is kept.
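To get a feel for why LZ4 suits web content, here is a minimal sketch using the third-party python-lz4 binding (installed with pip install lz4); it demonstrates the lossless round trip on repetitive HTML, not how ZFS invokes LZ4 internally:

    # Assumes the python-lz4 package is installed; the sample page is made up.
    import lz4.frame

    page = b"<html><body>" + b"<p>Hello, world!</p>" * 500 + b"</body></html>"
    packed = lz4.frame.compress(page)

    print(len(page), "->", len(packed))          # repetitive web content shrinks sharply
    assert lz4.frame.decompress(packed) == page  # the original is restored bit for bit

On ZFS itself this happens transparently at the file system level, as compression is a property of each dataset, so websites and applications never call the algorithm directly.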