The term data compression refers to reducing the number of bits that have to be stored or transmitted. Compression can be lossless or lossy: lossless compression removes only redundant data, so the original content can be restored exactly, while lossy compression discards less important data, so the restored content is of lower quality. Different compression algorithms suit different kinds of data. Compressing and decompressing data normally takes a fair amount of processing time, so the server performing the operation must have sufficient resources to process your information quickly enough. A simple example of compression is to store how many consecutive positions in a binary sequence contain 1 and how many contain 0, instead of storing the individual 1s and 0s.
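As a rough illustration of that last idea, here is a minimal sketch of run-length encoding in Python (the function names are ours, purely for illustration): it stores the counts of consecutive identical bits instead of the bits themselves, and the original string can be rebuilt exactly, so no information is lost.

```python
def rle_encode(bits: str) -> list[tuple[str, int]]:
    """Store each run of identical bits as (bit, count) instead of the raw bits."""
    runs = []
    for bit in bits:
        if runs and runs[-1][0] == bit:
            runs[-1] = (bit, runs[-1][1] + 1)
        else:
            runs.append((bit, 1))
    return runs

def rle_decode(runs: list[tuple[str, int]]) -> str:
    """Rebuild the original bit string from the stored runs (lossless)."""
    return "".join(bit * count for bit, count in runs)

original = "1111100000000111"
encoded = rle_encode(original)          # [('1', 5), ('0', 8), ('1', 3)]
assert rle_decode(encoded) == original  # nothing is lost
```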

Data Compression in Shared Web Hosting

The ZFS file system that operates on our cloud Internet hosting platform uses a compression algorithm named LZ4. It is considerably faster than comparable algorithms, especially when compressing and decompressing non-binary data such as web content. LZ4 can even decompress data faster than the data can be read from a hard disk, which improves the performance of websites hosted on ZFS-based platforms. Because the algorithm compresses data well and does so very quickly, we can generate several backup copies of all the content kept in the shared web hosting accounts on our servers every day. Both your content and its backups take up less space, and since both ZFS and LZ4 work very fast, generating the backups does not affect the performance of the web servers where your content is stored.
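The platform-side configuration is not something you interact with directly, but the behaviour described above is easy to reproduce on the client side. The sketch below assumes the third-party lz4 Python package (pip install lz4) and simply shows a lossless round trip on a sample of repetitive web content, along with the resulting size savings.

```python
import lz4.frame  # third-party package: pip install lz4

# A sample of repetitive, non-binary web content (markup compresses very well).
html = ('<li class="item">Example entry</li>\n' * 2000).encode("utf-8")

compressed = lz4.frame.compress(html)
restored = lz4.frame.decompress(compressed)

assert restored == html  # lossless: the restored content is identical
print(f"original:   {len(html)} bytes")
print(f"compressed: {len(compressed)} bytes")
```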

Data Compression in Semi-dedicated Servers

The ZFS file system that runs on the cloud platform where your semi-dedicated server account will be created uses a powerful compression algorithm called LZ4. It is among the most efficient algorithms available for compressing and decompressing web content, as its compression ratio is very high and it can decompress data much faster than the same data could be read from a hard drive in uncompressed form. As a result, LZ4 speeds up every website that runs on a platform where the algorithm is enabled. This performance requires a great deal of CPU time, which is provided by the numerous clusters working together as part of our platform. Furthermore, LZ4 makes it possible for us to generate several backup copies of your content every day and keep them for one month, as they take up less space than standard backups and are created considerably faster, without putting extra load on the servers.
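To make the point about the compression ratio concrete, here is a small, hypothetical sketch (again using the third-party lz4 Python package, not the platform itself) that compares how well typical web content compresses versus data that is effectively incompressible. The ratio computed here is similar in spirit to the ratio a ZFS dataset reports, but it is only an illustration.

```python
import os
import lz4.frame  # third-party package: pip install lz4

def ratio(data: bytes) -> float:
    """Original size divided by compressed size."""
    return len(data) / len(lz4.frame.compress(data))

markup = ("<tr><td>row</td><td>value</td></tr>\n" * 5000).encode()  # repetitive web content
random_blob = os.urandom(200_000)                                    # incompressible data

print(f"web content ratio: {ratio(markup):.2f}x")       # high ratio, large space savings
print(f"random data ratio: {ratio(random_blob):.2f}x")  # close to 1.0x, little to gain
```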