On the Jensen-Shannon divergence and variational distance

Bibliographic details
Published in: IEEE Transactions on Information Theory, 2005-09, Vol. 51 (9), pp. 3333-3336
Main authors: TSAI, Shi-Chun; TZENG, Wen-Guey; WU, Hsin-Lung
Format: Article
Language: English
Description
Abstract: We study the distance between two probability distributions via two different metrics: a new metric induced by the Jensen-Shannon divergence, and the well-known L_1 metric. We show that several important results and constructions in computational complexity under the L_1 metric carry over to the new metric, such as Yao's next-bit predictor, the existence of extractors, the leftover hash lemma, and the construction of expander-graph-based extractors. Finally, we show that the parity lemma, which is useful in the study of pseudorandomness, does not hold in the new metric.
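As a brief illustration of the two distances compared in the abstract, the following sketch computes the Jensen-Shannon divergence and the L_1 distance between discrete distributions given as probability vectors. It assumes base-2 logarithms (so the JSD is bounded by 1) and takes the square root of the JSD as the induced metric, which is the standard way a metric is obtained from the JSD; the paper's exact definition may differ.

```python
import math

def entropy(p):
    # Shannon entropy in bits; terms with probability 0 contribute 0.
    return -sum(x * math.log2(x) for x in p if x > 0)

def jsd(p, q):
    # Jensen-Shannon divergence: entropy of the equal-weight mixture
    # minus the average of the individual entropies.
    m = [(a + b) / 2 for a, b in zip(p, q)]
    return entropy(m) - (entropy(p) + entropy(q)) / 2

def jsd_metric(p, q):
    # The square root of the JSD satisfies the triangle inequality,
    # so it is a metric on probability distributions (assumed here).
    return math.sqrt(jsd(p, q))

def l1(p, q):
    # Variational (L_1) distance between the two probability vectors.
    return sum(abs(a - b) for a, b in zip(p, q))

if __name__ == "__main__":
    p = [0.5, 0.5]
    q = [0.9, 0.1]
    print("JSD metric:", jsd_metric(p, q))
    print("L_1 distance:", l1(p, q))
```

With base-2 logarithms the JSD metric ranges over [0, 1], reaching 1 exactly when the two distributions have disjoint support, whereas the L_1 distance ranges over [0, 2].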
ISSN: 0018-9448, 1557-9654
DOI: 10.1109/TIT.2005.853308