Multisegment Mapping Network for Massive MIMO Detection

Published in: International Journal of Antennas and Propagation, 2021-09, Vol. 2021, p. 1-7
Authors: Yu, Yongzhi; Wang, Jianming; Guo, Limin
Format: Article
Language: English
Abstract: Massive multiple-input multiple-output (MIMO) is one of the core technologies of 5G and can significantly improve spectral efficiency. Because of the large number of antennas in massive MIMO systems, the computational complexity of detection increases sharply, which poses a significant challenge to traditional detection algorithms. Deep learning, however, offers a high degree of computational parallelism and has become an important approach to the signal detection problem. This paper proposes a deep neural network for massive MIMO detection, named the Multisegment Mapping Network (MsNet). MsNet is obtained by optimizing the prior detection networks DetNet and ScNet: it further simplifies their sparse connection structure, reduces network complexity, and turns the coefficients of the residual structure into trainable variables. In addition, this paper designs an activation function that improves detection performance in high-order modulation scenarios. Simulation results show that MsNet achieves better symbol error rate (SER) performance while significantly reducing both computational complexity and the number of training parameters.
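The abstract's description of an unfolded detector with a trainable residual coefficient can be illustrated with a minimal sketch. The layer below follows the generic DetNet/ScNet pattern (a gradient step on the least-squares objective followed by a constellation-oriented nonlinearity) and blends in the previous estimate through a residual coefficient `alpha`. The function names, the piecewise-linear activation, and the scalar form of the coefficient are illustrative assumptions, not the paper's exact design.

```python
import numpy as np

def soft_sign(x, t=0.5):
    # Piecewise-linear projection toward the BPSK symbols {-1, +1};
    # an illustrative stand-in for the paper's multisegment activation.
    return -1.0 + np.clip(x + t, 0.0, 2.0 * t) / t

def msnet_layer(x_k, H, y, delta, alpha):
    # One unfolded detection iteration in the DetNet/ScNet style:
    # a gradient step on ||y - H x||^2, a nonlinearity, and a residual
    # blend with the previous estimate via the coefficient alpha
    # (a trainable variable in MsNet; fixed scalars here).
    grad = H.T @ (H @ x_k - y)       # gradient of the least-squares cost
    z = x_k - delta * grad           # gradient-descent step
    x_next = soft_sign(z)            # map toward the constellation
    return alpha * x_k + (1.0 - alpha) * x_next
```

In a trained network, `delta` and `alpha` would be learnable per-layer parameters; this sketch only shows the forward pass of a single layer.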
ISSN: 1687-5869, 1687-5877
DOI: 10.1155/2021/9989634