Data compression is the process of reducing the number of bits needed to store or transmit data. The compressed data takes up considerably less disk space than the original, so more content can be stored in the same amount of space. Compression algorithms work in different ways: lossless algorithms remove only redundant bits, so no quality is lost once the data is uncompressed, while lossy algorithms discard bits deemed unneeded, so the uncompressed data is of lower quality than the original. Compressing and uncompressing content requires a considerable amount of system resources, in particular CPU time, so any hosting platform that applies compression in real time must have enough processing power to support this feature. A simple example of how data can be compressed is to replace a binary sequence such as 111111 with 6x1, i.e. to store how many consecutive 1s or 0s there are instead of the whole sequence.
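
As a rough illustration of that last idea, here is a minimal run-length encoding sketch in Python. The function names and the "6x1"-style text output are purely for illustration, not part of any real compression library; the point is that the decoded result matches the original exactly, which is what makes this kind of compression lossless.

```python
def rle_encode(bits: str) -> str:
    """Encode a string of 0s and 1s as count/value pairs, e.g. '111111' -> '6x1'."""
    if not bits:
        return ""
    pieces = []
    count = 1
    for prev, curr in zip(bits, bits[1:]):
        if curr == prev:
            count += 1          # same bit as before: extend the current run
        else:
            pieces.append(f"{count}x{prev}")  # run ended: record its length and value
            count = 1
    pieces.append(f"{count}x{bits[-1]}")      # record the final run
    return ",".join(pieces)

def rle_decode(encoded: str) -> str:
    """Reverse the encoding: '6x1,5x0' -> '11111100000'."""
    if not encoded:
        return ""
    return "".join(
        value * int(count)
        for count, value in (pair.split("x") for pair in encoded.split(","))
    )

print(rle_encode("1111110000011"))   # 6x1,5x0,2x1
print(rle_decode("6x1,5x0,2x1"))     # 1111110000011
```

Note that this scheme only pays off when the data contains long runs of identical bits; for data without such redundancy, other (including lossy) algorithms are used.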