Huffman coding and arithmetic coding (when they can be used) give at least as good, and often better, compression than any universal code. However, universal codes are useful when Huffman coding cannot be used, for example when one does not know the exact probability of each message, but only the ranking of their probabilities. Universal codes are also useful when Huffman codes are inconvenient. For example, when the transmitter but not the receiver knows the probabilities of the messages, Huffman coding requires an overhead of transmitting those probabilities to the receiver; using a universal code avoids that overhead. Each universal code, like every other self-delimiting (prefix) binary code, has its own "implied probability distribution" given by P(i) = 2^{-\ell(i)}, where \ell(i) is the length of the ith codeword and P(i) is the corresponding symbol's probability. If the actual message probabilities are Q(i) and the Kullback–Leibler divergence D_\text{KL}(Q \| P) is minimized by the code with lengths \ell(i), then the optimal Huffman code for that set of messages will be equivalent to that code. Likewise, how close a code is to optimal can be measured by this divergence. Since universal codes are simpler and faster to encode and decode than Huffman codes (which are, in turn, simpler and faster than arithmetic encoding), a universal code is preferable in cases where D_\text{KL}(Q \| P) is sufficiently small. For any
geometric distribution (an exponential distribution on integers), a Golomb code is optimal. With universal codes, the implicit distribution is approximately a
power law such as 1/n^2 (more precisely, a
Zipf distribution). For the
Fibonacci code, the implicit distribution is approximately 1/n^q, with q = 1/\log_2(\varphi) \simeq 1.44, where \varphi is the
golden ratio. For the ternary
comma code (i.e., encoding in base 3, represented with 2 bits per symbol), the implicit distribution is a power law with q = 1 + \log_3(4/3) \simeq 1.26. Data following such power-law distributions can thus be encoded near-optimally by the corresponding code.
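As a concrete illustration of the implied probability distribution P(i) = 2^{-\ell(i)}, the following sketch (the function name is illustrative, and Elias gamma is chosen here merely as a representative universal code) encodes a few integers and compares 2^{-\ell(n)} with the power law 1/(2n^2):

```python
def elias_gamma(n: int) -> str:
    """Elias gamma code: floor(log2 n) zeros, then n in binary."""
    if n < 1:
        raise ValueError("Elias gamma encodes positive integers only")
    binary = bin(n)[2:]                      # binary representation of n
    return "0" * (len(binary) - 1) + binary  # prefix makes the code self-delimiting

# Codeword length is 2*floor(log2 n) + 1, so the implied probability
# 2^(-length) behaves like the power law 1/(2 n^2); the two agree
# exactly when n is a power of two.
for n in [1, 2, 5, 16]:
    code = elias_gamma(n)
    print(n, code, 2 ** -len(code), 1 / (2 * n * n))
```

For n = 16 the codeword is 9 bits long, giving an implied probability of 2^{-9} = 1/512, which matches 1/(2·16²) exactly.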
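The Fibonacci code's implicit 1/n^{1.44} distribution can likewise be checked with a small sketch (a hypothetical helper, based on the Zeckendorf representation with a trailing 1 serving as the self-delimiting marker):

```python
def fibonacci_code(n: int) -> str:
    """Fibonacci (Zeckendorf) code for a positive integer n."""
    if n < 1:
        raise ValueError("Fibonacci code encodes positive integers only")
    fibs = [1, 2]                      # Fibonacci numbers F(2), F(3), ...
    while fibs[-1] < n:
        fibs.append(fibs[-1] + fibs[-2])
    bits, remainder = [], n
    for f in reversed(fibs):           # greedy Zeckendorf decomposition
        if f <= remainder:
            bits.append("1")
            remainder -= f
        elif bits:                     # skip leading zeros
            bits.append("0")
    # Least-significant Fibonacci term first, then a final 1,
    # so every codeword ends in "11" and no codeword is a prefix of another.
    return "".join(reversed(bits)) + "1"

# Codeword length grows like log_phi(n) ~ 1.44 log2(n), so the implied
# probability 2^(-length) is roughly the power law 1/n^1.44.
for n in [1, 4, 11, 100, 10000]:
    print(n, fibonacci_code(n), len(fibonacci_code(n)))
```

Because Zeckendorf representations never contain two consecutive Fibonacci numbers, the bit pattern "11" can only occur at the end of a codeword, which is what makes the code self-delimiting.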