World of Compression



Compression methods

ZCM archiver

PACKET archiver

LZA archiver

FLASHZIP archiver

RINGS archiver

Data compression benchmarks



by NANIA Francesco Antonio

Hello everyone. This site was created in 2006 to present a number of my projects (programs, if you prefer) for data compression. I started in 2006 with programs published on the website of Matt Mahoney, a legend of data compression, whom I always thank for his help and support. Today data compression is everywhere: we see it every day in almost all of the programs we use (data, video, images, etc.). Do not forget the latest innovations that arithmetic coding has brought to home video and home cinema through H.264 encoding (CABAC). Many strides have also been made in recent years, unfortunately at the price of compression speed (CM methods, PPM, etc.). With my research I try to reach and exceed the theoretical limits of data compression. I invite you to visit the different sections of the site and take advantage of the free downloads. I emphasize, however, that the programs are in demo mode and may not be used except for personal use.


In computer science and information theory, data compression, source coding,[1] or bit-rate reduction involves encoding information using fewer bits than the original representation.[2] Compression can be either lossy or lossless. Lossless compression reduces bits by identifying and eliminating statistical redundancy; no information is lost in lossless compression. Lossy compression reduces bits by identifying unnecessary information and removing it.[3] The process of reducing the size of a data file is popularly referred to as data compression, although its formal name is source coding (coding done at the source of the data before it is stored or transmitted).[4]

Compression is useful because it helps reduce resource usage, such as data storage space or transmission capacity. Because compressed data must be decompressed to be used, this extra processing imposes computational or other costs through decompression; this situation is far from being a free lunch. Data compression is subject to a space–time complexity trade-off. For instance, a compression scheme for video may require expensive hardware for the video to be decompressed fast enough to be viewed as it is being decompressed, and the option to decompress the video in full before watching it may be inconvenient or require additional storage. The design of data compression schemes involves trade-offs among various factors, including the degree of compression, the amount of distortion introduced (e.g., when using lossy data compression), and the computational resources required to compress and uncompress the data.[5]

New alternatives to traditional systems (which sample at full resolution, then compress) provide efficient resource usage based on principles of compressed sensing. Compressed sensing techniques circumvent the need for data compression by sampling on a cleverly selected basis. (From Wikipedia, the free encyclopedia)
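The lossless case described above can be seen in a few lines. This is only a minimal sketch using Python's standard zlib module (a DEFLATE-based codec, not one of the archivers offered on this site): redundant input shrinks, and decompression restores the original bytes exactly.

```python
import zlib

# Highly redundant input: the same 3-byte pattern repeated many times.
original = b"abc" * 1200

# Lossless compression removes statistical redundancy...
compressed = zlib.compress(original, level=9)

# ...and decompression recovers every original bit.
restored = zlib.decompress(compressed)

assert restored == original            # no information is lost
assert len(compressed) < len(original)  # redundancy was eliminated

print(f"{len(original)} bytes -> {len(compressed)} bytes")
```

A lossy codec (e.g., for images or audio) would instead discard information judged unnecessary, so the round trip above would not return the exact original.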








Copyright (C) 2016 Nania Francesco .