Generalized mutual information and their reference priors under Csiszár f-divergence
| Main author: | |
|---|---|
| Format: | Article |
| Language: | eng |
| Subjects: | |
| Online access: | Order full text |
Abstract: | In Bayesian theory, the role of information is central. The influence exerted by prior information on posterior outcomes often jeopardizes Bayesian studies, owing to the potentially subjective nature of the prior choice. In models where a priori knowledge is lacking, reference prior theory emerges as a powerful tool. Based on the criterion of mutual information, this theory makes it possible to construct a non-informative prior whose choice can be qualified as objective. In this paper, we contribute to the enrichment of reference prior theory: we unveil an original analogy between reference prior theory and Global Sensitivity Analysis, from which we propose a natural generalization of the definition of mutual information. Leveraging dissimilarity measures between probability distributions, such as f-divergences, we provide a formalized framework for what we term generalized reference priors. Our main result establishes a limit of the generalized mutual information, which simplifies the definition of reference priors as its maximizing arguments. This approach opens a new path that facilitates the theoretical derivation of reference priors under constraints or within specific classes. In the absence of constraints, we further prove that the Jeffreys prior maximizes the generalized mutual information considered. |
DOI: | 10.48550/arxiv.2310.10530 |
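
For orientation, the constructions named in the abstract can be made explicit. The following is a minimal sketch in LaTeX, assuming the standard definitions of the Csiszár f-divergence, of mutual information as an expected divergence between posterior and prior, and of the Jeffreys prior; the paper's exact formalization may differ.

```latex
% Csiszár f-divergence between distributions P and Q with densities p and q,
% for a convex function f with f(1) = 0; the Kullback–Leibler divergence is
% recovered with f(t) = t log t.
\[
  D_f(P \,\|\, Q) = \int f\!\left(\frac{p(x)}{q(x)}\right) q(x)\,\mathrm{d}x .
\]
% Classical mutual information between the parameter theta (prior pi) and the
% data X (marginal m): the expected KL divergence from prior to posterior.
% The reference prior is defined as a maximizing argument.
\[
  I(\pi) = \mathbb{E}_{X \sim m}\!\left[\operatorname{KL}\!\left(\pi(\cdot \mid X) \,\big\|\, \pi\right)\right],
  \qquad
  \pi_{\mathrm{ref}} \in \operatorname*{arg\,max}_{\pi} I(\pi).
\]
% A natural f-divergence generalization (an assumed reading of the paper's
% "generalized mutual information"): replace the KL divergence by any D_f.
\[
  I_f(\pi) = \mathbb{E}_{X \sim m}\!\left[D_f\!\left(\pi(\cdot \mid X) \,\big\|\, \pi\right)\right].
\]
% Jeffreys prior, which per the abstract maximizes the generalized mutual
% information in the absence of constraints: proportional to the square root
% of the determinant of the Fisher information matrix.
\[
  \pi_J(\theta) \propto \sqrt{\det \mathcal{I}(\theta)},
  \qquad
  \mathcal{I}(\theta)_{ij} = \mathbb{E}\!\left[\partial_{\theta_i}\log p(X \mid \theta)\;\partial_{\theta_j}\log p(X \mid \theta)\right].
\]
```

Under this reading, the unconstrained result mirrors the classical fact that the Jeffreys prior arises as the asymptotic maximizer of Kullback–Leibler based mutual information in regular models.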