On the Estimation of Differential Entropy From Data Located on Embedded Manifolds


Detailed Description

Bibliographic Details
Published in: IEEE Transactions on Information Theory, 2007-07, Vol. 53 (7), p. 2330-2341
Main authors: Nilsson, M., Kleijn, W.B.
Format: Article
Language: English
Description
Abstract: Estimation of the differential entropy from observations of a random variable is of great importance for a wide range of signal processing applications such as source coding, pattern recognition, hypothesis testing, and blind source separation. In this paper, we present a method for estimation of the Shannon differential entropy that accounts for embedded manifolds. The method is based on high-rate quantization theory and forms an extension of the classical nearest-neighbor entropy estimator. The estimator is consistent in the mean square sense, and an upper bound on its rate of convergence is given. Because of the close connection between compression and Shannon entropy, the proposed method has an advantage over methods estimating the Rényi entropy. Through experiments on uniformly distributed data on known manifolds and on real-world speech data, we show the accuracy and usefulness of our proposed method.
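The abstract refers to the classical nearest-neighbor entropy estimator that the paper extends. As a point of reference, here is a minimal sketch of that classical (Kozachenko-Leonenko) estimator for samples in Euclidean space, without the paper's embedded-manifold correction; the function name `kl_entropy` is illustrative, not taken from the paper:

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import gamma, digamma

def kl_entropy(x):
    """Kozachenko-Leonenko nearest-neighbor estimate of the differential
    entropy (in nats) of samples x with shape (n, d)."""
    n, d = x.shape
    tree = cKDTree(x)
    # Query k=2 neighbors: the nearest neighbor of each point is itself,
    # so the second column holds the distance to the true nearest neighbor.
    dist, _ = tree.query(x, k=2)
    rho = dist[:, 1]
    # Volume of the d-dimensional unit ball.
    v_d = np.pi ** (d / 2) / gamma(d / 2 + 1)
    # -digamma(1) equals the Euler-Mascheroni constant (~0.5772).
    return d * np.mean(np.log(rho)) + np.log(v_d) + np.log(n - 1) - digamma(1)
```

For i.i.d. samples from a standard Gaussian, the estimate approaches the true value 0.5 ln(2πe) ≈ 1.419 nats per dimension as n grows; the paper's contribution is to correct this kind of estimator when the data lie on a lower-dimensional manifold embedded in the ambient space.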
ISSN: 0018-9448
ISSN: 1557-9654
DOI: 10.1109/TIT.2007.899533