Information Geometry of Generalized Bayesian Prediction Using $\alpha$-Divergences as Loss Functions

In this paper, the methods of information geometry are employed to investigate a generalized Bayes rule for prediction. Taking α-divergences as the loss functions, optimality and asymptotic properties of the generalized Bayesian predictive densities are considered. We show that the Bayesian predictive densities minimize a generalized Bayes risk. We also find that the asymptotic expansions of the densities are related to the coefficients of the α-connections of a statistical manifold. In addition, we discuss the difference between two risk functions of the generalized Bayesian predictions based on different priors. Finally, using the non-informative priors (i.e., Jeffreys and reference priors), uniform prior, and conjugate prior, two examples are presented to illustrate the main results.
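As a point of reference for the abstract's loss function, here is a minimal numerical sketch of the α-divergence in Amari's parameterization, $D_\alpha(p\|q) = \frac{4}{1-\alpha^2}\bigl(1 - \sum_x p(x)^{(1-\alpha)/2} q(x)^{(1+\alpha)/2}\bigr)$. This is one common convention; the paper's exact parameterization and limit conventions may differ, so treat the function name and the α = ±1 limits below as illustrative assumptions rather than the authors' definition.

```python
import numpy as np

def alpha_divergence(p, q, alpha):
    """Amari-style alpha-divergence between two discrete distributions.

    D_alpha(p||q) = 4/(1-alpha^2) * (1 - sum_x p^((1-alpha)/2) q^((1+alpha)/2)).
    In this convention the limit alpha -> -1 recovers KL(p||q) and
    alpha -> +1 recovers KL(q||p); alpha = 0 is symmetric (Hellinger-type).
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    if np.isclose(alpha, -1.0):   # limit: KL(p || q)
        return float(np.sum(p * np.log(p / q)))
    if np.isclose(alpha, 1.0):    # limit: KL(q || p)
        return float(np.sum(q * np.log(q / p)))
    coef = 4.0 / (1.0 - alpha ** 2)
    integral = np.sum(p ** ((1 - alpha) / 2) * q ** ((1 + alpha) / 2))
    return float(coef * (1.0 - integral))
```

For identical distributions the divergence is zero for every α, and at α = 0 it is symmetric in its arguments, which is the sense in which the α-family interpolates between the two Kullback-Leibler divergences.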

Detailed description

Bibliographic details
Published in: IEEE transactions on information theory, 2018-03, Vol. 64 (3), p. 1812-1824
Main authors: Zhang, Fode, Shi, Yimin, Ng, Hon Keung Tony, Wang, Ruibing
Format: Article
Language: English
Subjects:
Online access: Full text
description In this paper, the methods of information geometry are employed to investigate a generalized Bayes rule for prediction. Taking α-divergences as the loss functions, optimality and asymptotic properties of the generalized Bayesian predictive densities are considered. We show that the Bayesian predictive densities minimize a generalized Bayes risk. We also find that the asymptotic expansions of the densities are related to the coefficients of the α-connections of a statistical manifold. In addition, we discuss the difference between two risk functions of the generalized Bayesian predictions based on different priors. Finally, using the non-informative priors (i.e., Jeffreys and reference priors), uniform prior, and conjugate prior, two examples are presented to illustrate the main results.
doi_str_mv 10.1109/TIT.2017.2774820
format Article
publisher New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
rights Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2018
peer_reviewed true
orcid 0000-0003-4685-2199; 0000-0002-4733-9752
issn 0018-9448
eissn 1557-9654
recordid cdi_proquest_journals_2006249625
source IEEE Electronic Library (IEL)
subjects Asymptotic properties
Asymptotic series
Bayesian analysis
Predictions