On Optimality of Gamma Approximation for Lognormal Shadowing Models
Published in: IEEE Antennas and Wireless Propagation Letters, 2023-05, Vol. 22 (5), pp. 1-5
Format: Article
Language: English
Abstract: In this letter, we study the information-theoretic optimality of the gamma approximation for lognormal shadowing models in order to provide a rigorous mathematical foundation for this useful technique. Specifically, we adopt the Kullback-Leibler (KL) divergence as the metric quantifying the distance between the original lognormal distribution and the approximating gamma distribution. The KL divergence resulting from the moment matching criterion, the minimum achievable KL divergence of the gamma approximation, and the statistical parameter mapping relations are all derived in closed form. Using these closed-form analytical expressions, we can rigorously examine the utility and optimality of the gamma approximation against a benchmark. Comparing the closed-form KL divergence under moment matching with the minimum achievable benchmark, we find that the moment matching criterion, as a heuristic method, cannot guarantee information-theoretic optimality. We also present and discuss relevant results that substantiate the information-theoretic optimality achieved by the proposed statistical parameter mapping relations, together with the corresponding analytical insights.
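For reference, the quantities named in the abstract can be written down from standard definitions; the symbols below (lognormal parameters \(\mu\), \(\sigma\) and gamma shape/scale \(k\), \(\theta\)) are assumptions made here and need not match the letter's own notation.

```latex
% Standard definitions assumed for illustration; the letter's notation may differ.
% Lognormal shadowing: X ~ LN(mu, sigma^2); approximating gamma: shape k, scale theta.
\[
  D\!\left(f_{\mathrm{LN}} \,\middle\|\, f_{\Gamma}\right)
  = \int_{0}^{\infty} f_{\mathrm{LN}}(x)\,
    \ln\frac{f_{\mathrm{LN}}(x)}{f_{\Gamma}(x)}\,\mathrm{d}x .
\]
% Moment matching equates the first two moments of the two distributions:
\[
  k\theta = e^{\mu+\sigma^{2}/2}, \qquad
  k\theta^{2} = \left(e^{\sigma^{2}}-1\right) e^{2\mu+\sigma^{2}}
  \;\Longrightarrow\;
  k = \frac{1}{e^{\sigma^{2}}-1}, \qquad
  \theta = e^{\mu+\sigma^{2}/2}\left(e^{\sigma^{2}}-1\right).
\]
```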
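The contrast drawn in the abstract, namely that the moment-matched gamma fit does not minimize the KL divergence, can also be checked numerically. The sketch below is an illustrative implementation using standard SciPy tools, not the letter's closed-form expressions: it fits a gamma distribution to a lognormal by moment matching and compares the resulting D(lognormal || gamma) with a fit obtained by minimizing that divergence directly. The parameter values mu = 0.0 and sigma = 0.5 are arbitrary examples, not taken from the letter.

```python
# Illustrative sketch only (standard SciPy tools, not the letter's closed-form
# expressions): fit a gamma distribution to a lognormal shadowing model by
# moment matching, then compare its KL divergence D(lognormal || gamma) with a
# fit obtained by minimizing that divergence numerically.
import numpy as np
from scipy import integrate, optimize, stats

mu, sigma = 0.0, 0.5                               # assumed lognormal parameters
ln_dist = stats.lognorm(s=sigma, scale=np.exp(mu))

def kl_lognormal_gamma(k, theta):
    """D(lognormal || gamma) evaluated by numerical quadrature."""
    ga = stats.gamma(a=k, scale=theta)
    integrand = lambda x: ln_dist.pdf(x) * (ln_dist.logpdf(x) - ga.logpdf(x))
    lo, hi = ln_dist.ppf(1e-15), ln_dist.ppf(1.0 - 1e-12)
    val, _ = integrate.quad(integrand, lo, hi, limit=200, epsabs=1e-12)
    return val

# Moment matching: equate the gamma mean k*theta and variance k*theta^2 to the
# lognormal mean exp(mu + sigma^2/2) and variance (exp(sigma^2)-1)*exp(2mu+sigma^2).
mean = np.exp(mu + sigma**2 / 2)
var = (np.exp(sigma**2) - 1.0) * np.exp(2 * mu + sigma**2)
k_mm, theta_mm = mean**2 / var, var / mean

# KL-optimal fit: minimize the divergence directly over (k, theta),
# starting from the moment-matched point.
res = optimize.minimize(lambda p: kl_lognormal_gamma(*p),
                        x0=[k_mm, theta_mm],
                        bounds=[(1e-3, None), (1e-3, None)])
k_kl, theta_kl = res.x

print(f"moment matching: k={k_mm:.4f}, theta={theta_mm:.4f}, "
      f"KL={kl_lognormal_gamma(k_mm, theta_mm):.6f}")
print(f"min-KL fit:      k={k_kl:.4f}, theta={theta_kl:.4f}, KL={res.fun:.6f}")
```

The printed gap between the two KL values is the kind of optimality loss of the moment matching heuristic that the letter quantifies in closed form.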
ISSN: 1536-1225, 1548-5757
DOI: 10.1109/LAWP.2022.3233522