An exponential inequality for the distribution function of the kernel density estimator, with applications to adaptive estimation

Bibliographic Details
Published in: Probability Theory and Related Fields, March 2009, Vol. 143 (3-4), pp. 569-596
Authors: Giné, Evarist; Nickl, Richard
Format: Article
Language: English
Description
Summary: It is shown that the uniform distance between the distribution function of the usual kernel density estimator (based on an i.i.d. sample from an absolutely continuous law on the real line) with bandwidth h and the empirical distribution function F_n satisfies an exponential inequality. This inequality is used to obtain sharp almost sure rates of convergence of this uniform distance under mild conditions on the range of bandwidths h_n, including the usual MISE-optimal choices. Another application is a Dvoretzky–Kiefer–Wolfowitz-type inequality for the uniform distance between the kernel distribution function estimator and the true distribution function F. The exponential bound is also applied to show that an adaptive estimator can be constructed that efficiently estimates the true distribution function F in sup-norm loss and, at the same time, estimates the density of F, if it exists (but without assuming it does), at the best possible rate of convergence over Hölder balls, again in sup-norm loss.
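For orientation, the following is a brief sketch of the objects involved; the symbols F_n^K, f_n^K, the kernel K, and the bandwidth h are notation assumed here for illustration and are not quoted from the paper. Given an i.i.d. sample X_1, ..., X_n with distribution function F,

\[
  F_n(t) = \frac{1}{n}\sum_{i=1}^{n}\mathbf{1}\{X_i \le t\},
  \qquad
  f_n^K(t,h) = \frac{1}{nh}\sum_{i=1}^{n} K\!\left(\frac{t - X_i}{h}\right),
  \qquad
  F_n^K(t,h) = \int_{-\infty}^{t} f_n^K(x,h)\,dx,
\]

so that the exponential inequality of the title controls the sup-norm distance \(\sup_{t}\,|F_n^K(t,h) - F_n(t)|\), while the Dvoretzky–Kiefer–Wolfowitz-type bound concerns \(\sup_{t}\,|F_n^K(t,h) - F(t)|\).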
ISSN: 0178-8051; 1432-2064
DOI: 10.1007/s00440-008-0137-y