A Poisson Decomposition for Information and the Information-Event Diagram
Saved in:
Main Author: | |
Format: | Article |
Language: | English |
Subjects: | |
Online Access: | Order full text |
Summary: | The information diagram and the I-measure are useful mnemonics in which random variables are treated as sets, and entropy and mutual information are treated as a signed measure over those sets. Although the I-measure has been successful in machine proofs of entropy inequalities, the theoretical underpinning of the ``random variables as sets'' analogy was unclear until the recent works on mappings from random variables to sets by Ellerman (recovering order-$2$ Tsallis entropy over a general probability space) and by Down and Mediano (recovering Shannon entropy over a discrete probability space). We generalize these constructions by designing a mapping that recovers the Shannon entropy (and the information density) over a general probability space. Moreover, it has an intuitive interpretation based on the arrival time in a Poisson process, allowing us to understand the union, intersection, and difference between (the sets corresponding to) random variables and events. Cross entropy, KL divergence, and conditional entropy given an event can all be obtained as set intersections. We propose a generalization of the information diagram that also includes events, and demonstrate its usage with a diagrammatic proof of Fano's inequality. |
DOI: | 10.48550/arxiv.2307.07506 |
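The abstract rests on two standard facts that can be checked numerically: the signed-measure (inclusion-exclusion) identity $I(X;Y) = H(X) + H(Y) - H(X,Y)$ that underlies the information diagram, and the exponential-interarrival view of surprisal, where an event of probability $p$ corresponds to an interval of length $-\log p$ and a unit-rate Poisson process's first arrival $T \sim \mathrm{Exp}(1)$ satisfies $P(T \ge -\log p) = p$. The sketch below is not the paper's construction; it only verifies these textbook identities on a hypothetical toy distribution, with all numbers chosen for illustration.

```python
import math

# Toy joint distribution p(x, y) on {0,1} x {0,1} (hypothetical numbers,
# chosen only to exercise the identities below).
pxy = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

# Marginal distributions of X and Y.
px = {x: sum(p for (a, _), p in pxy.items() if a == x) for x in (0, 1)}
py = {y: sum(p for (_, b), p in pxy.items() if b == y) for y in (0, 1)}

def H(dist):
    """Shannon entropy in nats: H = -sum p log p."""
    return -sum(p * math.log(p) for p in dist.values() if p > 0)

# Signed-measure identity behind the information diagram:
# I(X;Y) corresponds to the "intersection" H(X) + H(Y) - H(X,Y).
I_xy = H(px) + H(py) - H(pxy)

# Direct definition of mutual information, for comparison.
I_direct = sum(p * math.log(p / (px[x] * py[y]))
               for (x, y), p in pxy.items() if p > 0)
assert abs(I_xy - I_direct) < 1e-12

# Poisson-process view of surprisal: associate an event of probability p
# with the interval [0, -log p).  A unit-rate Poisson process has first
# arrival T ~ Exp(1), so P(T >= -log p) = exp(log p) = p, i.e. the
# interval's length is exactly the event's information content.
p = 0.25
surprisal = -math.log(p)          # length of the event's interval, in nats
assert abs(math.exp(-surprisal) - p) < 1e-12

print(f"I(X;Y) = {I_xy:.4f} nats")
```

The same inclusion-exclusion bookkeeping extends to the other diagram regions (e.g. $H(X \mid Y) = H(X,Y) - H(Y)$ as a set difference), which is the structure the paper's Poisson construction makes rigorous.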