Mixture-based estimation of entropy
The entropy is a measure of uncertainty that plays a central role in information theory. When the distribution of the data is unknown, an estimate of the entropy needs to be obtained from the data sample itself. A semi-parametric estimate is proposed based on a mixture model approximation of the distribution of interest. A Gaussian mixture model is used to illustrate the accuracy and versatility of the proposal, although the estimate can rely on any type of mixture. Performance of the proposed approach is assessed through a series of simulation studies. Two real-life data examples are also provided to illustrate its use.
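The record itself contains no code, so the following is a minimal, illustrative sketch of the general idea described in the abstract, not the estimator actually proposed in the paper: fit a Gaussian mixture to the sample (selecting the number of components by BIC) and approximate the differential entropy H(f) = -E[log f(X)] by Monte Carlo under the fitted density. It assumes Python with NumPy and scikit-learn; the helper name and defaults are illustrative.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def mixture_entropy_estimate(x, max_components=9, n_mc=100_000, seed=0):
    """Plug-in entropy estimate (in nats) from a BIC-selected Gaussian mixture."""
    x = np.asarray(x, dtype=float)
    if x.ndim == 1:
        x = x.reshape(-1, 1)          # scikit-learn expects a 2-D array
    # Fit mixtures with 1..max_components components and keep the best BIC fit.
    fits = [GaussianMixture(n_components=k, covariance_type="full",
                            random_state=seed).fit(x)
            for k in range(1, max_components + 1)]
    best = min(fits, key=lambda gm: gm.bic(x))
    # Monte Carlo approximation of H(f) = -E[log f(X)] under the fitted mixture:
    # draw z_1, ..., z_M from the fitted density f_hat and average -log f_hat(z_m).
    z, _ = best.sample(n_mc)
    return -best.score_samples(z).mean()

# Quick sanity check: a standard normal has entropy 0.5 * log(2 * pi * e) ≈ 1.4189 nats.
rng = np.random.default_rng(1)
print(mixture_entropy_estimate(rng.standard_normal(2_000)))
```

Drawing the Monte Carlo points from the fitted mixture, rather than reusing the original sample, decouples the Monte Carlo error from the sample size; the paper's own estimator and its properties are defined in the article itself.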
Published in: | Computational statistics & data analysis, 2023-01, Vol. 177, p. 107582, Article 107582 |
---|---|
Main authors: | Robin, Stéphane; Scrucca, Luca |
Format: | Article |
Language: | English |
Online access: | Full text |
DOI: | 10.1016/j.csda.2022.107582 |
ISSN: | 0167-9473 |
EISSN: | 1872-7352 |
Publisher: | Elsevier B.V. |
Source: | Elsevier ScienceDirect Journals |
Subjects: | data analysis; entropy; Entropy estimation; Gaussian mixtures; mathematical theory; Mathematics; Mixture models; Mutual information; uncertainty |