Concentration of information content for convex measures

Bibliographic Details
Published in: Electronic Journal of Probability, 2020, Vol. 25, Article 20
Main authors: Fradelizi, Matthieu; Li, Jiange; Madiman, Mokshay
Format: Article
Language: English
Online access: Full text
Description
Abstract: We establish sharp exponential deviation estimates for the information content, as well as a sharp bound on the varentropy, for the class of convex measures on Euclidean spaces. This generalizes a similar development for log-concave measures in the recent work of Fradelizi, Madiman and Wang (2016). In particular, our results imply that convex measures in high dimension are concentrated in an annulus between two convex sets (as in the log-concave case), even though they may have much heavier tails. Various tools and consequences are developed, including a sharp comparison result for Rényi entropies, inequalities of Kahane-Khinchine type for convex measures that extend those of Koldobsky, Pajor and Yaskin (2008) for log-concave measures, and an extension of Berwald's inequality (1947).
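For orientation, the central quantities named in the abstract can be written out as follows. This is a minimal sketch using standard definitions; the notation ($\tilde h$, $h$, $V$) is assumed here and is not necessarily that of the paper.

% Let $X$ be a random vector in $\mathbb{R}^n$ with density $f$.

% Information content: the random variable whose concentration is studied.
\[
  \tilde h(X) = -\log f(X).
\]

% Shannon (differential) entropy: its expectation.
\[
  h(X) = \mathbb{E}\bigl[-\log f(X)\bigr].
\]

% Varentropy: its variance, for which a sharp bound is established.
\[
  V(X) = \operatorname{Var}\bigl(-\log f(X)\bigr).
\]

% Convex measures in the sense of Borell: for all compact $A, B$ and $\lambda \in (0,1)$,
\[
  \mu\bigl(\lambda A + (1-\lambda) B\bigr) \ge \min\{\mu(A),\, \mu(B)\},
\]
% a class that strictly contains the log-concave measures, which satisfy
\[
  \mu\bigl(\lambda A + (1-\lambda) B\bigr) \ge \mu(A)^{\lambda}\, \mu(B)^{1-\lambda}.
\]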
ISSN: 1083-6489
DOI: 10.1214/20-EJP416