The Differential Entropy of Mixtures: New Bounds and Applications


Bibliographic Details
Published in: IEEE Transactions on Information Theory, 2022-04, Vol. 68 (4), p. 2123-2146
Authors: Melbourne, James; Talukdar, Saurav; Bhaban, Shreyas; Madiman, Mokshay; Salapaka, Murti V.
Format: Article
Language: English
Online access: Order full text
Abstract: Mixture distributions are extensively used as a modeling tool in diverse areas from machine learning to communications engineering to physics, and obtaining bounds on the entropy of mixture distributions is of fundamental importance in many of these applications. This article provides sharp bounds on the entropy concavity deficit, which is the difference between the differential entropy of the mixture and the weighted sum of differential entropies of constituent components. Toward establishing lower and upper bounds on the concavity deficit, results that are of importance in their own right are obtained. In order to obtain nontrivial upper bounds, properties of the skew-divergence are developed and notions of "skew" f-divergences are introduced; a reverse Pinsker inequality and a bound on Jensen-Shannon divergence are obtained along the way. Complementary lower bounds are derived with special attention paid to the case that corresponds to independent summation of a continuous and a discrete random variable. Several applications of the bounds are delineated, including to mutual information of additive noise channels, thermodynamics of computation, and functional inequalities.
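The abstract's central quantity, the entropy concavity deficit h(Σᵢ wᵢ fᵢ) − Σᵢ wᵢ h(fᵢ), is easy to illustrate numerically. The sketch below (an illustration under stated assumptions, not code from the paper) Monte-Carlo-estimates the deficit for a two-component Gaussian mixture and compares it with the classical upper bound given by the Shannon entropy of the mixing weights, here H(1/2, 1/2) = log 2; all function names are the author's own.

```python
# Illustrative Monte Carlo estimate of the entropy "concavity deficit"
# h(mixture) - sum_i w_i h(f_i) for a two-component Gaussian mixture.
import math
import random

def gaussian_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def gaussian_entropy(sigma):
    """Closed-form differential entropy of N(mu, sigma^2) in nats."""
    return 0.5 * math.log(2 * math.pi * math.e * sigma ** 2)

def mixture_entropy_mc(weights, mus, sigmas, n=200_000, seed=0):
    """Estimate h(mixture) = -E[log f(X)] by sampling X from the mixture."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        # Sample a component index, then a point from that component.
        i = 0 if rng.random() < weights[0] else 1
        x = rng.gauss(mus[i], sigmas[i])
        f = sum(w * gaussian_pdf(x, m, s) for w, m, s in zip(weights, mus, sigmas))
        total -= math.log(f)
    return total / n

# Two well-separated unit-variance Gaussians with equal weights.
weights, mus, sigmas = [0.5, 0.5], [-3.0, 3.0], [1.0, 1.0]
h_mix = mixture_entropy_mc(weights, mus, sigmas)
h_avg = sum(w * gaussian_entropy(s) for w, s in zip(weights, sigmas))
deficit = h_mix - h_avg
# Concavity of entropy gives deficit >= 0; the classical upper bound is
# the Shannon entropy of the weights, H(1/2, 1/2) = log 2.
print(f"deficit ~ {deficit:.3f}, upper bound log 2 ~ {math.log(2):.3f}")
```

For well-separated components the deficit approaches the weight-entropy bound log 2; as the component means move together, the mixture degenerates to a single Gaussian and the deficit shrinks toward 0 — the regime in which the paper's sharper bounds are informative.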
DOI: 10.1109/TIT.2022.3140661
ISSN: 0018-9448
EISSN: 1557-9654
Source: IEEE Electronic Library (IEL)
Subjects: f-divergence
Communications engineering
Concavity
differential entropy
Entropy
Extraterrestrial measurements
Lower bounds
Machine learning
Mixture distributions
Mixture models
Mixtures
Mutual information
Random variables
Thermodynamics
Upper bound
Upper bounds