Probability estimation in arithmetic and adaptive-Huffman entropy coders

Bibliographic Details
Published in: IEEE Transactions on Image Processing, 1995-03, Vol. 4 (3), pp. 237-246
Authors: Duttweiler, D.L., Chamzas, C.
Format: Article
Language: English
Abstract: Entropy coders, such as Huffman and arithmetic coders, achieve compression by exploiting nonuniformity in the probabilities under which a random variable to be coded takes on its possible values. Practical realizations generally require running adaptive estimates of these probabilities. An analysis of the relationship between estimation quality and the resulting coding efficiency suggests a particular scheme, dubbed scaled-count, for obtaining such estimates. It can optimally balance estimation accuracy against the need for rapid response to changing underlying statistics. When the symbols being coded are from a binary alphabet, simple hardware and software implementations requiring almost no computation are possible. A scaled-count adaptive probability estimator of the type described in this paper is used in the arithmetic coder of the JBIG and JPEG image coding standards.
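
The abstract describes the scaled-count idea only at a high level. As a rough sketch of the general technique for a binary alphabet (the halving rule and the RESCALE_AT threshold below are illustrative assumptions, not the parameters derived in the paper or the state tables of the JBIG/JPEG QM-coder), one can keep per-symbol counts and halve both whenever their sum reaches a limit, so that older observations are exponentially discounted and the estimate tracks changing statistics:

    #include <stdio.h>

    #define RESCALE_AT 64  /* assumed threshold; the paper's analysis concerns
                              choosing this accuracy/adaptivity balance optimally */

    typedef struct {
        unsigned c0;  /* scaled count of 0 symbols seen */
        unsigned c1;  /* scaled count of 1 symbols seen */
    } ScaledCount;

    /* Record one symbol; when the total count reaches the threshold,
       halve both counts (rounding up so neither ever hits zero). */
    static void sc_update(ScaledCount *sc, int bit)
    {
        if (bit) sc->c1++; else sc->c0++;
        if (sc->c0 + sc->c1 >= RESCALE_AT) {
            sc->c0 = (sc->c0 + 1) / 2;
            sc->c1 = (sc->c1 + 1) / 2;
        }
    }

    /* Current estimate of P(next bit = 1), as would be fed to an
       arithmetic coder before coding each symbol. */
    static double sc_prob1(const ScaledCount *sc)
    {
        return (double)sc->c1 / (sc->c0 + sc->c1);
    }

    int main(void)
    {
        ScaledCount sc = {1, 1};  /* start from a uniform prior */
        int sample[] = {1, 1, 0, 1, 1, 1, 0, 1};
        for (int i = 0; i < (int)(sizeof sample / sizeof sample[0]); i++)
            sc_update(&sc, sample[i]);
        printf("estimated P(1) = %.3f\n", sc_prob1(&sc));
        return 0;
    }

A smaller threshold responds faster to shifts in the source statistics but yields coarser probability estimates; a larger one does the reverse. The paper's contribution is an analysis of how to set this trade-off optimally for coding efficiency.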
ISSN: 1057-7149 (print), 1941-0042 (electronic)
DOI: 10.1109/83.366473