Information Geometry of Generalized Bayesian Prediction Using $\alpha$-Divergences as Loss Functions
Published in: IEEE Transactions on Information Theory, 2018-03, Vol. 64 (3), pp. 1812-1824
Main authors: , , ,
Format: Article
Language: English
Online access: Full text
Abstract: In this paper, the methods of information geometry are employed to investigate a generalized Bayes rule for prediction. Taking α-divergences as the loss functions, the optimality and asymptotic properties of the generalized Bayesian predictive densities are considered. We show that the Bayesian predictive densities minimize a generalized Bayes risk. We also find that the asymptotic expansions of the densities are related to the coefficients of the α-connections of a statistical manifold. In addition, we discuss the difference between the risk functions of generalized Bayesian predictions based on different priors. Finally, using non-informative priors (i.e., the Jeffreys and reference priors), the uniform prior, and a conjugate prior, two examples are presented to illustrate the main results.
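For context, and not stated in the record itself: one common convention for the α-divergence (Amari's) and the resulting generalized Bayes predictive rule is sketched below; the paper's exact sign and parameterization conventions may differ.

```latex
% Amari's alpha-divergence between densities p and q (one common
% convention; the paper's parameterization may differ):
\[
  D_\alpha(p \,\|\, q)
    = \frac{4}{1-\alpha^{2}}
      \left( 1 - \int p(y)^{\frac{1-\alpha}{2}}\, q(y)^{\frac{1+\alpha}{2}} \, dy \right),
  \qquad \alpha \neq \pm 1,
\]
% with limiting cases D_{-1}(p || q) = KL(p || q) and D_{1}(p || q) = KL(q || p).
% Under this loss, the generalized Bayes predictive density based on the
% posterior pi(theta | x) takes the Corcuera--Giummole form
\[
  \hat{p}_\alpha(y \mid x)
    \propto \left( \int p(y \mid \theta)^{\frac{1-\alpha}{2}}\,
                   \pi(\theta \mid x)\, d\theta \right)^{\frac{2}{1-\alpha}},
  \qquad \alpha < 1,
\]
% which reduces to the usual Bayesian predictive density
% \int p(y | theta) pi(theta | x) d theta at alpha = -1.
```

A minimal numerical sketch of this rule, assuming a normal model with known variance and a conjugate normal prior; the function name, grid sizes, and parameter values are illustrative choices, not the paper's examples:

```python
import numpy as np
from scipy.stats import norm
from scipy.integrate import trapezoid

def generalized_predictive(x, sigma, mu0, tau0, alpha, y_grid):
    """Generalized Bayes predictive density p_hat_alpha(y | x), alpha < 1,
    for a Normal(theta, sigma^2) model with a Normal(mu0, tau0^2) prior."""
    n = len(x)
    # Conjugate posterior: theta | x ~ Normal(mu_n, tau_n^2)
    prec = 1.0 / tau0**2 + n / sigma**2
    tau_n = np.sqrt(1.0 / prec)
    mu_n = (mu0 / tau0**2 + np.sum(x) / sigma**2) / prec

    theta = np.linspace(mu_n - 8 * tau_n, mu_n + 8 * tau_n, 2001)
    post = norm.pdf(theta, mu_n, tau_n)  # pi(theta | x) on the grid

    # Inner integral over theta: E_post[ p(y | theta)^{(1 - alpha)/2} ]
    lik_pow = norm.pdf(y_grid[:, None], theta[None, :], sigma) ** ((1 - alpha) / 2)
    inner = trapezoid(lik_pow * post[None, :], theta, axis=1)

    dens = inner ** (2.0 / (1.0 - alpha))   # unnormalized density in y
    return dens / trapezoid(dens, y_grid)   # normalize to integrate to 1

rng = np.random.default_rng(0)
x = rng.normal(1.0, 1.0, size=10)
y = np.linspace(-6.0, 8.0, 1401)

# alpha = -1 recovers the standard Bayesian predictive density, which is
# Normal(mu_n, tau_n^2 + sigma^2) in this conjugate normal setting.
p_std = generalized_predictive(x, 1.0, 0.0, 2.0, alpha=-1.0, y_grid=y)
p_half = generalized_predictive(x, 1.0, 0.0, 2.0, alpha=0.5, y_grid=y)
print("total mass:", trapezoid(p_std, y), trapezoid(p_half, y))
```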
ISSN: 0018-9448, 1557-9654
DOI: 10.1109/TIT.2017.2774820