Nonparametric density estimation by a self-consistent neural network
Format: Conference paper
Language: English
Abstract: An improvement to the classic adaptive kernel estimator is made by incorporating first-order dynamics in a neural network framework, yielding a fully self-consistent probability density function (pdf) estimate. The dynamics give rise to nonlinear interactions between the kernel parameters, in contrast to the adaptive kernel estimator, which is a simple three-step procedure. Adaptive kernel estimates have an asymptotic convergence rate of O(h^4) when the errors in the pilot estimate can be ignored, compared with O(h^2) for standard kernel estimators. Because it is fully self-consistent, the proposed method also approaches the theoretical O(h^4) convergence rate while providing smoother estimates of the distribution tails than the adaptive kernel estimator. A one-dimensional application to the estimation of a log-normal distribution is included as an example.
DOI: 10.1109/IJCNN.1993.717050
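
The abstract's baseline is the classic three-step adaptive kernel estimator (pilot estimate, local bandwidth factors, re-estimation with per-point bandwidths). Below is a minimal NumPy sketch of that baseline applied to a log-normal sample, mirroring the paper's one-dimensional example; it is not the paper's self-consistent neural-network method, and the sample size, bandwidth rule, and grid are illustrative assumptions.

```python
# Sketch of the classic three-step adaptive kernel estimator (Abramson/Silverman
# square-root law), i.e. the baseline procedure referred to in the abstract --
# not the paper's self-consistent neural-network method.
import numpy as np

def gaussian_kernel(u):
    return np.exp(-0.5 * u**2) / np.sqrt(2.0 * np.pi)

def fixed_kde(x, data, h):
    """Standard fixed-bandwidth kernel estimate (bias ~ O(h^2))."""
    u = (x[:, None] - data[None, :]) / h
    return gaussian_kernel(u).mean(axis=1) / h

def adaptive_kde(x, data, h, alpha=0.5):
    """Three-step adaptive kernel estimate.
    Step 1: pilot estimate at the sample points with a fixed bandwidth.
    Step 2: local factors lambda_i = (pilot_i / g)^(-alpha), g = geometric
            mean of the pilot values (alpha = 1/2 is the square-root law).
    Step 3: final estimate with per-point bandwidths h * lambda_i."""
    pilot = fixed_kde(data, data, h)                      # step 1
    g = np.exp(np.mean(np.log(pilot)))
    lam = (pilot / g) ** (-alpha)                         # step 2
    hi = h * lam                                          # step 3
    u = (x[:, None] - data[None, :]) / hi[None, :]
    return (gaussian_kernel(u) / hi[None, :]).mean(axis=1)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    data = rng.lognormal(mean=0.0, sigma=1.0, size=500)   # log-normal example
    h = 1.06 * data.std() * len(data) ** (-0.2)           # rule-of-thumb pilot bandwidth
    grid = np.linspace(0.01, 10.0, 400)
    f_fixed = fixed_kde(grid, data, h)
    f_adapt = adaptive_kde(grid, data, h)
    true_pdf = np.exp(-0.5 * np.log(grid) ** 2) / (grid * np.sqrt(2.0 * np.pi))
    print("mean abs error, fixed:   ", np.mean(np.abs(f_fixed - true_pdf)))
    print("mean abs error, adaptive:", np.mean(np.abs(f_adapt - true_pdf)))
```

With the exponent alpha = 1/2, the adaptive step is what recovers the O(h^4) bias rate cited in the abstract, provided the pilot-estimate errors are negligible; the paper's contribution is to make the kernel parameters interact self-consistently rather than fixing them after a single pilot pass.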