Numerical Estimation of Information Theoretic Measures for Large Data Sets

Bibliographic Details
Main Authors: Hurley, Michael B; Kao, Edward K
Format: Report
Language: English
Description
Abstract: A problem that has plagued the tracking community for decades is the lack of a single metric to assess the overall performance of tracking systems. The authors' prior research identified total conditional entropy as a useful measure of the overall performance of multi-target trackers and classifiers. The measure can evaluate any system that can be formulated as an assignment algorithm mapping N classes of objects to M labels: the assignments from the decision system are compared against a truth data set to generate the total conditional entropy. This report focuses on generating error estimates so that the statistical significance of test results can be determined. Derivations by Wolpert and Wolf provide exact equations for calculating the first- and second-order statistics of the three fundamental entropy measures from sample data. The authors have restructured Wolpert and Wolf's equations into computer code that produces numerically stable results for sample sizes up to the billions. This code is provided as a MATLAB software package available from MIT Lincoln Laboratory.