Continuously Differentiable Sample-Spacing Entropy Estimation

Bibliographic Details
Published in: IEEE Transactions on Neural Networks and Learning Systems, 2008-11, Vol. 19 (11), p. 1978-1984
Main Authors: Ozertem, U., Uysal, I., Erdogmus, D.
Format: Article
Language: English
Description
Abstract: The insufficiency of using only second-order statistics and the premise of exploiting higher-order statistics of the data have been well understood, and more advanced objectives involving higher-order statistics, especially those stemming from information theory such as error entropy minimization, are now being studied and applied in many contexts of machine learning and signal processing. In the adaptive system training context, the main drawback of utilizing output error entropy, as compared to correlation-estimation-based second-order statistics, is the computational load of the entropy estimation, which is usually obtained via a plug-in kernel estimator. Sample-spacing estimates offer computationally inexpensive entropy estimators; however, the resulting estimates are not differentiable and hence not suitable for gradient-based adaptation. In this brief, we propose a nonparametric entropy estimator that captures the desirable properties of both approaches. The resulting estimator yields continuously differentiable estimates with a computational complexity on the order of that of the sample-spacing techniques. The proposed estimator is compared with the kernel density estimation (KDE)-based entropy estimator in a supervised neural network training framework, with computation time and performance comparisons.
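For context, the two baseline estimator families the abstract contrasts can be illustrated with a minimal NumPy sketch: a classic m-spacing (Vasicek-style) entropy estimator, whose cost is dominated by an O(n log n) sort but which is only piecewise smooth in the samples, and an O(n^2) Gaussian Parzen-window (KDE) plug-in estimator of the kind typically used for error entropy minimization. This is not the continuously differentiable estimator proposed in the brief (its construction is not given in this record); the function names, the sqrt(n) spacing heuristic, and the kernel width are illustrative assumptions.

```python
import numpy as np

def vasicek_entropy(x, m=None):
    """Classic m-spacing (Vasicek-type) entropy estimator.

    Cost is dominated by the sort, O(n log n), but the order-statistic
    differencing makes the estimate only piecewise smooth in the samples,
    hence unsuitable for gradient-based adaptation.
    """
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    if m is None:
        m = max(1, int(round(np.sqrt(n))))   # common heuristic spacing (assumption)
    idx = np.arange(n)
    hi = x[np.minimum(idx + m, n - 1)]       # x_(i+m), clipped at the largest sample
    lo = x[np.maximum(idx - m, 0)]           # x_(i-m), clipped at the smallest sample
    return float(np.mean(np.log(n * (hi - lo) / (2.0 * m))))

def kde_plugin_entropy(x, sigma=0.25):
    """Resubstitution plug-in entropy estimate with a Gaussian Parzen window, O(n^2)."""
    x = np.asarray(x, dtype=float)
    diffs = x[:, None] - x[None, :]
    kernel = np.exp(-0.5 * (diffs / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))
    return float(-np.mean(np.log(kernel.mean(axis=1))))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    samples = rng.standard_normal(2000)
    true_h = 0.5 * np.log(2.0 * np.pi * np.e)   # differential entropy of N(0, 1), about 1.419
    print(vasicek_entropy(samples), kde_plugin_entropy(samples, sigma=0.3), true_h)
```

On a few thousand samples from a unit Gaussian, both estimates should land near the true differential entropy of about 1.419, with the KDE version noticeably slower as the sample size grows, which is the computational gap the brief targets.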
ISSN: 1045-9227, 2162-237X, 1941-0093, 2162-2388
DOI: 10.1109/TNN.2008.2006167