Two Measures of Dependence
Published in: Entropy (Basel, Switzerland), 2019-08, Vol. 21 (8), p. 778
Main authors: ,
Format: Article
Language: eng
Subjects:
Online access: Full text
Abstract: Two families of dependence measures between random variables are introduced. They are based on the Rényi divergence of order α and the relative α-entropy, respectively, and both dependence measures reduce to Shannon's mutual information when their order α is one. The first measure shares many properties with the mutual information, including the data-processing inequality, and can be related to the optimal error exponents in composite hypothesis testing. The second measure does not satisfy the data-processing inequality, but appears naturally in the context of distributed task encoding.
ISSN: 1099-4300
DOI: 10.3390/e21080778
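
The record above only summarizes the paper. As a rough illustration of the first family described in the abstract, a Rényi-divergence-based dependence measure of this kind can be written as a minimization of the order-α Rényi divergence, D_α(P‖Q) = (1/(α−1)) log Σ P(x)^α Q(x)^{1−α}, between the joint distribution P_XY and product distributions Q_X × Q_Y; at α = 1 this recovers Shannon's mutual information, matching the abstract's claim. The sketch below assumes this minimization form (the paper's exact definition should be consulted); the helper names `renyi_div`, `shannon_mi`, and `j_alpha`, the toy pmf, and the Nelder-Mead search are all illustrative choices, not the authors' code.

```python
# Minimal numerical sketch, assuming the dependence measure is
# min over product pmfs Q_X x Q_Y of D_alpha(P_XY || Q_X Q_Y).
import numpy as np
from scipy.optimize import minimize


def renyi_div(p, q, alpha):
    """Rényi divergence D_alpha(p || q) in nats, for alpha > 0, alpha != 1."""
    return float(np.log(np.sum(p ** alpha * q ** (1.0 - alpha))) / (alpha - 1.0))


def shannon_mi(p_xy):
    """Shannon mutual information I(X;Y) of a joint pmf, in nats."""
    px = p_xy.sum(axis=1, keepdims=True)
    py = p_xy.sum(axis=0, keepdims=True)
    mask = p_xy > 0
    return float(np.sum(p_xy[mask] * np.log(p_xy[mask] / (px * py)[mask])))


def j_alpha(p_xy, alpha):
    """Minimize D_alpha(P_XY || Q_X x Q_Y) over product pmfs Q_X, Q_Y.

    The marginals are parameterized via softmax so the search is
    unconstrained; Nelder-Mead is crude but adequate for tiny alphabets.
    """
    nx, ny = p_xy.shape
    p = p_xy.ravel()

    def objective(theta):
        qx = np.exp(theta[:nx]); qx /= qx.sum()
        qy = np.exp(theta[nx:]); qy /= qy.sum()
        return renyi_div(p, np.outer(qx, qy).ravel(), alpha)

    res = minimize(objective, np.zeros(nx + ny), method="Nelder-Mead",
                   options={"fatol": 1e-12, "xatol": 1e-10, "maxiter": 50000})
    return res.fun


if __name__ == "__main__":
    # Toy joint pmf with visible dependence between X and Y.
    p_xy = np.array([[0.30, 0.10],
                     [0.05, 0.55]])
    print(f"I(X;Y)             = {shannon_mi(p_xy):.6f} nats")
    for a in (0.5, 0.99, 1.01, 2.0):
        print(f"measure at alpha={a:<4} ~ {j_alpha(p_xy, a):.6f} nats")
```

For α close to 1 the minimized divergence should approach I(X;Y), since the order-1 Rényi divergence is the Kullback-Leibler divergence and its minimizing product distribution is the pair of marginals; the softmax parameterization simply keeps the candidate marginals on the probability simplex during the unconstrained search.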