Lossless data compression benchmark

Contents:

[filled in script]


Algorithms

Column "Alg" shows the dominant algorithm used in a compression program. Compression algorithms are explained in depth in an excellent online book Data Compression Explained so I won't repeat that here. I will only list some algorithms names and their abbreviations: Please note that compression programs often combine multiple compression algorithms, filtering, transforms, etc so the dominant algorithm (or algorithms) is a subjective category and can also be a subject to interpretation and/ or discussion. Some algorithm families (like Lempel-Ziv) have multiple variants so concrete algorithm can be either named after the popular algorithm family it resembles or named with it's unique name.

Test sets:


Compression programs


Rules:


Questions & answers

  1. Q: How do I measure speed and memory usage on Microsoft Windows?

    A: Look for the latest version of ProcProfile here: encode.ru: Command Line Process Profiling Tool

  2. Q: How do I measure speed and memory usage on GNU/Linux?

    A: Invoke /usr/bin/time -v <original command>. The time to report is the value after "Elapsed (wall clock) time" and the memory usage to report is the value after "Maximum resident set size (kbytes)" (divide it by 1024 to get usage in megabytes). One way to automate the parsing is shown in the sketch after this list.
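
The following is a minimal sketch (a convenience helper, not part of the benchmark tooling) that wraps GNU time and extracts the two values mentioned above. It assumes GNU time is installed at /usr/bin/time (the shell builtin time does not accept -v) and that the report uses the usual -v labels, which may differ slightly between versions; the file name in the usage comment is a hypothetical placeholder.

    # Minimal sketch: run a command under GNU time -v and pull out the
    # wall-clock time and peak memory usage from its report.
    import subprocess

    def measure(cmd):
        proc = subprocess.run(["/usr/bin/time", "-v"] + cmd,
                              capture_output=True, text=True)
        wall, mem_mb = None, None
        for line in proc.stderr.splitlines():  # GNU time writes its report to stderr
            line = line.strip()
            if line.startswith("Elapsed (wall clock) time"):
                wall = line.split("): ", 1)[1]                 # value after "(h:mm:ss or m:ss): "
            elif line.startswith("Maximum resident set size (kbytes):"):
                mem_mb = int(line.split(":", 1)[1]) / 1024.0   # kbytes -> megabytes
        return wall, mem_mb

    # Hypothetical usage with some compressor command line:
    # wall, mem_mb = measure(["xz", "-9", "-k", "testfile"])
    # print(wall, f"{mem_mb:.1f} MB")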