A Training-Based Mutual Information Lower Bound for Large-Scale Systems

We provide a mutual information lower bound that can be used to analyze the effect of training in models with unknown parameters. For large-scale systems, we show that this bound can be calculated using the difference between two derivatives of a conditional entropy function. We provide a step-by-step process for computing the bound, and apply the steps to a quantized large-scale multiple-antenna wireless communication system with an unknown channel. Numerical results demonstrate the interplay between quantization and training.
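
As a rough sketch of the kind of quantity involved (an illustration built from standard identities, not the paper's exact derivation): for a data pair (x, y) observed after \tau training pairs (X_\tau, Y_\tau), the training-conditioned mutual information decomposes exactly as

    I(x; y \mid X_\tau, Y_\tau) = h(y \mid X_\tau, Y_\tau) - h(y \mid x, X_\tau, Y_\tau).

Writing G(n) = h(y_1, \ldots, y_n \mid x_1, \ldots, x_n) for the conditional entropy of n received symbols given their inputs, and assuming i.i.d. inputs independent of the unknown parameters, the chain rule gives

    h(y_{n+1} \mid y_1, \ldots, y_n, x_1, \ldots, x_{n+1}) = G(n+1) - G(n) \approx \frac{dG}{dn}(n)

for large n. This is one way a derivative of a conditional entropy function can arise; per the abstract, the bound itself is computed as the difference between two such derivatives.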

Bibliographic Details
Published in: IEEE Transactions on Communications, 2022-08, Vol. 70 (8), pp. 5151-5163
Main authors: Gao, Kang; Meng, Xiangbo; Laneman, J. Nicholas; Chisum, Jonathan D.; Bendlin, Ralf; Chopra, Aditya; Hochwald, Bertrand M.
Format: Article
Language: English
ISSN: 0090-6778, 1558-0857
DOI: 10.1109/TCOMM.2022.3182747