Bounds on information combining


Full description

Bibliographic details
Published in: IEEE Transactions on Information Theory, 2005-02, Vol. 51 (2), p. 612-619
Main authors: Land, I., Huettinger, S., Hoeher, P.A., Huber, J.B.
Format: Article
Language: English
Description
Abstract: When the same data sequence is transmitted over two independent channels, or when a data sequence is transmitted twice but independently over the same channel, the independent observations can be combined at the receiver side. From an information-theory point of view, the overall mutual information between the data sequence and the received sequences represents a combination of the mutual information of the two channels. This concept is termed information combining. A lower bound and an upper bound on the combined information are presented, and it is proved that these bounds are tight. Furthermore, this principle is extended to the computation of extrinsic information on single code bits for a repetition code and for a single parity-check code of length three. For illustration of the concept and the bounds on information combining, two applications are considered. First, bounds on the information processing characteristic (IPC) of a parallel concatenated code are derived from its extrinsic information transfer (EXIT) chart. Second, bounds on the EXIT chart for an outer repetition code and for an outer single parity-check code of a serially concatenated coding scheme are computed.
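As a small illustration of the parallel setup described in the abstract (a sketch under assumed conditions, not code from the paper), the combined mutual information can be computed numerically when the same uniform bit is observed through two independent binary symmetric channels (BSCs); all function names here are hypothetical:

```python
import math

def h2(p):
    """Binary entropy function, in bits."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_mi(p):
    """Mutual information I(X; Y) of a single BSC with crossover probability p, uniform input."""
    return 1.0 - h2(p)

def combined_mi(p1, p2):
    """I(X; Y1, Y2): one uniform bit X observed through two independent BSCs."""
    # Marginal distribution of the observation pair (Y1, Y2).
    py = {}
    for x in (0, 1):
        for y1 in (0, 1):
            for y2 in (0, 1):
                pr = 0.5 * (p1 if y1 != x else 1 - p1) * (p2 if y2 != x else 1 - p2)
                py[(y1, y2)] = py.get((y1, y2), 0.0) + pr
    h_y = -sum(p * math.log2(p) for p in py.values() if p > 0)
    # Given X, the two observations are conditionally independent.
    h_y_given_x = h2(p1) + h2(p2)
    return h_y - h_y_given_x
```

A useless second observation (crossover 0.5) contributes nothing, so `combined_mi(p, 0.5)` equals `bsc_mi(p)`; two informative observations combine to more than either channel's mutual information alone, but never to more than their sum.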
ISSN: 0018-9448, 1557-9654
DOI: 10.1109/TIT.2004.840883