Nonparametric Maximum Entropy Estimation on Information Diagrams
Maximum entropy estimation is of broad interest for inferring properties of systems across many different disciplines. In this work, we significantly extend a technique we previously introduced for estimating the maximum entropy of a set of random discrete variables when conditioning on bivariate mutual informations and univariate entropies. Specifically, we show how to apply the concept to continuous random variables and vastly expand the types of information-theoretic quantities one can condition on. This allows us to establish a number of significant advantages of our approach over existing ones. Not only does our method perform favorably in the undersampled regime, where existing methods fail, but it also can be dramatically less computationally expensive as the cardinality of the variables increases. In addition, we propose a nonparametric formulation of connected informations and give an illustrative example showing how this agrees with the existing parametric formulation in cases of interest. We further demonstrate the applicability and advantages of our method to real world systems for the case of resting-state human brain networks. Finally, we show how our method can be used to estimate the structural network connectivity between interacting units from observed activity and establish the advantages over other approaches for the case of phase oscillator networks as a generic example.
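The constraints the abstract names, univariate entropies H(X_i) and bivariate mutual informations I(X_i; X_j), can be estimated from discrete data with simple plug-in estimators. The sketch below only illustrates those quantities on toy data; it is not the authors' maximum entropy estimator, and the function names and toy variables are hypothetical.

```python
import numpy as np
from collections import Counter

def entropy(samples):
    """Plug-in (maximum-likelihood) estimate of H(X) in bits."""
    counts = np.array(list(Counter(samples).values()), dtype=float)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def mutual_information(x, y):
    """Plug-in estimate of I(X;Y) = H(X) + H(Y) - H(X,Y) in bits."""
    joint = list(zip(x, y))  # treat each (x, y) pair as one joint symbol
    return entropy(x) + entropy(y) - entropy(joint)

# Toy data: X2 is a noisy copy of X1, X3 is independent of both.
rng = np.random.default_rng(0)
x1 = rng.integers(0, 2, 1000)
x2 = np.where(rng.random(1000) < 0.9, x1, 1 - x1)
x3 = rng.integers(0, 2, 1000)

print([entropy(v) for v in (x1, x2, x3)])   # univariate entropies
print(mutual_information(x1, x2))           # large: x2 depends on x1
print(mutual_information(x1, x3))           # near zero: independent
```

Note that such plug-in estimates are exactly the kind of quantity that degrades in the undersampled regime the abstract refers to, which is where the paper's method is claimed to perform favorably.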
Published in: | arXiv.org, 2016-01 |
---|---|
Main authors: | Martin, Elliot A; Hlinka, Jaroslav; Meinke, Alexander; Děchtěrenko, Filip; Davidsen, Jörn |
Format: | Article |
Language: | English |
Subjects: | Bivariate analysis; Brain; Continuity (mathematics); Entropy; Information theory; Maximum entropy; Random variables |
Online access: | Full text |
creator | Martin, Elliot A; Hlinka, Jaroslav; Meinke, Alexander; Děchtěrenko, Filip; Davidsen, Jörn |
---|---|
description | Maximum entropy estimation is of broad interest for inferring properties of systems across many different disciplines. In this work, we significantly extend a technique we previously introduced for estimating the maximum entropy of a set of random discrete variables when conditioning on bivariate mutual informations and univariate entropies. Specifically, we show how to apply the concept to continuous random variables and vastly expand the types of information-theoretic quantities one can condition on. This allows us to establish a number of significant advantages of our approach over existing ones. Not only does our method perform favorably in the undersampled regime, where existing methods fail, but it also can be dramatically less computationally expensive as the cardinality of the variables increases. In addition, we propose a nonparametric formulation of connected informations and give an illustrative example showing how this agrees with the existing parametric formulation in cases of interest. We further demonstrate the applicability and advantages of our method to real world systems for the case of resting-state human brain networks. Finally, we show how our method can be used to estimate the structural network connectivity between interacting units from observed activity and establish the advantages over other approaches for the case of phase oscillator networks as a generic example. |
format | Article |
fulltext | fulltext |
identifier | EISSN: 2331-8422 |
ispartof | arXiv.org, 2016-01 |
issn | 2331-8422 |
language | eng |
recordid | cdi_proquest_journals_2077176637 |
source | Free E-Journals |
subjects | Bivariate analysis; Brain; Continuity (mathematics); Entropy; Information theory; Maximum entropy; Random variables |
title | Nonparametric Maximum Entropy Estimation on Information Diagrams |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-07T01%3A17%3A49IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest&rft_val_fmt=info:ofi/fmt:kev:mtx:book&rft.genre=document&rft.atitle=Nonparametric%20Maximum%20Entropy%20Estimation%20on%20Information%20Diagrams&rft.jtitle=arXiv.org&rft.au=Martin,%20Elliot%20A&rft.date=2016-01-03&rft.eissn=2331-8422&rft_id=info:doi/&rft_dat=%3Cproquest%3E2077176637%3C/proquest%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2077176637&rft_id=info:pmid/&rfr_iscdi=true |