Escort distributions minimizing the Kullback–Leibler divergence for a large deviations principle and tests of entropy level

Bibliographic details
Published in: Annals of the Institute of Statistical Mathematics 2016-04, Vol.68 (2), p.439-468
Main authors: Girardin, Valérie; Regnault, Philippe
Format: Article
Language: English
Subjects:
Online access: Full text
Description
Abstract: Kullback–Leibler divergence is minimized among distributions with finite state spaces under various Shannon entropy constraints. The minimization is closely linked to escort distributions, whose main entropy-related properties are proven. This allows a large deviations principle to be stated for the sequence of plug-in empirical estimators of the Shannon entropy of any finite distribution. Since no closed-form expression of the rate function can be obtained, an explicit approximating function is constructed. This approximation is accurate enough to provide good results in all applications. Tests of entropy level, using both the large deviations principle and the minimization results, are constructed and shown to behave well in terms of errors.
ISSN:0020-3157
1572-9052
DOI:10.1007/s10463-014-0501-x
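For orientation only, here is a minimal Python sketch of the two standard objects the abstract refers to: the escort distribution of order q (each probability raised to the power q and renormalized) and the plug-in estimator of Shannon entropy (the entropy of the empirical frequencies). This is not code from the article; the function names and the tiny example are illustrative.

```python
import math

def escort(p, q):
    """Escort distribution of order q: p_i**q, renormalized to sum to 1."""
    weights = [pi ** q for pi in p]
    total = sum(weights)
    return [w / total for w in weights]

def shannon_entropy(p):
    """Shannon entropy in nats, with the convention 0 * log 0 = 0."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def plugin_entropy(samples, states):
    """Plug-in estimator: Shannon entropy of the empirical frequencies."""
    n = len(samples)
    freqs = [samples.count(s) / n for s in states]
    return shannon_entropy(freqs)

# Illustrative example (not from the article):
p = [0.5, 0.3, 0.2]
print(escort(p, 1.0))          # q = 1 recovers p itself
print(escort(p, 0.0))          # q = 0 gives the uniform distribution
print(shannon_entropy(escort(p, 2.0)))  # larger q concentrates mass, lowering entropy
print(plugin_entropy(['a', 'a', 'b', 'b'], ['a', 'b']))
```

The escort family interpolates between the uniform distribution (q = 0) and the point mass on the most probable state (q → ∞), which is why it can trace out all attainable entropy levels in constrained minimization problems of this kind.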