On calibration of Kullback-Leibler divergence via prediction

Detailed description

Bibliographic details
Published in: Communications in Statistics - Theory and Methods, 1999-01, Vol. 28 (1), p. 67-85
Authors: Keyes, Tim K.; Levy, Martin S.
Format: Article
Language: English
Description
Abstract: In this paper we evaluate mean Kullback-Leibler divergence via predicting densities arising from various prediction methods applied to the multivariate single-sample normal model. We demonstrate that the degrees of freedom which index Geisser-Cornfield predictive densities are helpful in divergence calibration. Alternative calibrations are derived based on sample size considerations. An application of each method to univariate prediction from the gamma model is provided. Comparisons are made with a probability-based calibration method.
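As background for the quantity the abstract discusses, the following is a minimal illustrative sketch of Kullback-Leibler divergence between two normal densities, using the standard closed form for Gaussians checked against direct numerical integration. This is a generic illustration of the divergence itself, not the paper's Geisser-Cornfield calibration method; the parameter values are arbitrary examples.

```python
import math

def kl_normal(mu1, s1, mu2, s2):
    """Closed-form KL divergence KL(N(mu1, s1^2) || N(mu2, s2^2))."""
    return math.log(s2 / s1) + (s1**2 + (mu1 - mu2)**2) / (2 * s2**2) - 0.5

def kl_numeric(mu1, s1, mu2, s2, n=200001, width=12.0):
    """Numerical check: Riemann sum of p(x) * log(p(x)/q(x)) on a grid
    wide enough (width * s1 on each side of mu1) that the tails are negligible."""
    lo, hi = mu1 - width * s1, mu1 + width * s1
    dx = (hi - lo) / (n - 1)
    total = 0.0
    for i in range(n):
        x = lo + i * dx
        p = math.exp(-0.5 * ((x - mu1) / s1) ** 2) / (s1 * math.sqrt(2 * math.pi))
        q = math.exp(-0.5 * ((x - mu2) / s2) ** 2) / (s2 * math.sqrt(2 * math.pi))
        total += p * math.log(p / q) * dx
    return total

# Example: divergence of a reference density N(0, 1) from a shifted,
# wider density N(0.5, 1.5^2); positive, and zero only when the two coincide.
exact = kl_normal(0.0, 1.0, 0.5, 1.5)
approx = kl_numeric(0.0, 1.0, 0.5, 1.5)
```

The closed form makes the asymmetry of the divergence easy to verify: `kl_normal(0, 1, 0.5, 1.5)` and `kl_normal(0.5, 1.5, 0, 1)` differ, which is why calibration, i.e. mapping divergence values onto an interpretable scale, is of interest in the first place.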
ISSN: 0361-0926
eISSN: 1532-415X
DOI: 10.1080/03610929908832283