Probability estimation in arithmetic and adaptive-Huffman entropy coders

Entropy coders, such as Huffman and arithmetic coders, achieve compression by exploiting nonuniformity in the probabilities under which a random variable to be coded takes on its possible values. Practical realizations generally require running adaptive estimates of these probabilities. An analysis of the relationship between estimation quality and the resulting coding efficiency suggests a particular scheme, dubbed scaled-count, for obtaining such estimates. It can optimally balance estimation accuracy against a need for rapid response to changing underlying statistics. When the symbols being coded are from a binary alphabet, simple hardware and software implementations requiring almost no computation are possible. A scaled-count adaptive probability estimator of the type described in this paper is used in the arithmetic coder of the JBIG and JPEG image coding standards.
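The scaled-count idea from the abstract can be illustrated with a minimal sketch: keep per-symbol occurrence counts, estimate each probability as that symbol's share of the total, and halve (scale) the counts whenever their sum reaches a cap so that old observations are gradually forgotten and the estimate tracks changing statistics. The class name, the count_max parameter, and its default value below are illustrative assumptions, not values taken from the paper, which analyzes how such parameters should be chosen.

    # Minimal sketch of a scaled-count adaptive probability estimator for a
    # binary alphabet. Counts of the two symbols are kept; whenever their sum
    # reaches a cap they are halved, giving exponential forgetting of old data.

    class ScaledCountEstimator:
        def __init__(self, count_max=64):
            # Start with one count per symbol so no probability is ever zero.
            self.counts = [1, 1]
            self.count_max = count_max  # illustrative cap, not the paper's value

        def probability(self, symbol):
            """Current estimate of P(symbol) for symbol in {0, 1}."""
            return self.counts[symbol] / (self.counts[0] + self.counts[1])

        def update(self, symbol):
            """Record an observed symbol, scaling counts when the cap is reached."""
            self.counts[symbol] += 1
            if self.counts[0] + self.counts[1] >= self.count_max:
                # Halve both counts, rounding up so neither can drop to zero.
                self.counts[0] = (self.counts[0] + 1) // 2
                self.counts[1] = (self.counts[1] + 1) // 2

    # Example: feed a biased bit stream and watch the estimate approach the bias.
    import random
    random.seed(0)
    est = ScaledCountEstimator()
    for _ in range(10_000):
        bit = 1 if random.random() < 0.9 else 0
        est.update(bit)
    print(round(est.probability(1), 3))  # close to 0.9

Because only two small integer counts are maintained and the scaling step is a shift, this is the kind of estimator the abstract describes as requiring almost no computation in hardware or software.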

Bibliographic Details
Published in: IEEE Transactions on Image Processing, 1995-03, Vol. 4 (3), p. 237-246
Authors: Duttweiler, D.L.; Chamzas, C.
Format: Article
Language: English
Online access: Order full text
DOI: 10.1109/83.366473
ISSN: 1057-7149
EISSN: 1941-0042
Source: IEEE Electronic Library (IEL)
Subjects: Applied sciences; Arithmetic; Coding, codes; Decoding; Entropy coding; Exact sciences and technology; Hardware; Image coding; Information, signal and communications theory; Probability; Random variables; Signal and communications theory; Standards development; Statistics; Telecommunications and information theory; Transform coding