Bayes Posterior Convergence for Loss Functions via Almost Additive Thermodynamic Formalism
Published in: Journal of statistical physics, 2022, Vol. 186 (3), p. 35-35, Article 35
Format: Article
Language: English
Online access: Full text
Abstract: Statistical inference can be seen as information processing involving input information and output information that updates belief about some unknown parameters. We consider the Bayesian framework for making inferences about dynamical systems from ergodic observations, where the Bayesian procedure is based on the Gibbs posterior inference, a decision-process generalization of standard Bayesian inference (see [7, 37]) in which the likelihood is replaced by the exponential of a loss function. In the case of direct observation and almost-additive loss functions, we prove an exponential convergence of the a posteriori measures to a limit measure. Our estimates on the Bayes posterior convergence for direct observation are related to, and extend, those in [47] to a context where loss functions are almost-additive. Our approach makes use of non-additive thermodynamic formalism and large deviation properties [39, 40, 57] instead of joinings.
ISSN: 0022-4715, 1572-9613
DOI: 10.1007/s10955-022-02885-8