Deep Learning-Based User Activity Detection and Channel Estimation in Grant-Free NOMA
Published in: IEEE Transactions on Wireless Communications, 2023-04, Vol. 22 (4), pp. 2202-2214
Authors: , , , ,
Format: Article
Language: English
Abstract: In uplink machine-type communication (MTC) systems, the combination of grant-free transmission and non-orthogonal multiple access (NOMA) has emerged as a way to reduce control overhead and transmission latency. In the grant-free scenario, the base station must identify the active devices and estimate their channel state information before data detection. However, due to the lack of a scheduling process, user activity detection (UAD) and channel estimation (CE) are both challenging, especially when short non-orthogonal preambles are adopted. In this paper, building on the framework of compressive sensing-based algorithms, we propose a novel deep learning architecture, the UAD and CE Neural Network (UAD-CE-NN), to effectively solve the joint UAD and CE problem for grant-free NOMA. In the proposed scheme, the user activity and channel state information hidden in the received data signals are also exploited to complement the preamble and achieve higher detection accuracy. Specifically, UAD-CE-NN consists of two stages: a preamble detection neural network first produces a tentative UAD-CE result, and a data detection neural network then refines it by exploiting the data signals. Compared with conventional schemes, the proposed scheme achieves much higher accuracy for both UAD and CE, especially when short preamble sequences are employed.
ISSN: 1536-1276, 1558-2248
DOI: 10.1109/TWC.2022.3209667
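The joint UAD-CE problem the abstract describes is classically posed as row-sparse recovery: the received preamble signal is Y = S·H + N, where S stacks the (short, non-orthogonal) device preambles, and H is nonzero only in the rows of active devices. A minimal sketch of the compressive-sensing baseline the paper builds on, using simultaneous orthogonal matching pursuit (SOMP) on synthetic data; all dimensions and the greedy solver are illustrative assumptions, not the paper's UAD-CE-NN:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (not from the paper):
# N devices, preamble length L < N, M BS antennas, K active devices.
N, L, M, K = 100, 30, 4, 5

S = rng.standard_normal((L, N)) / np.sqrt(L)        # non-orthogonal preamble matrix
active = rng.choice(N, K, replace=False)            # unknown active device set
H = np.zeros((N, M), dtype=complex)                 # row-sparse channel matrix
H[active] = (rng.standard_normal((K, M))
             + 1j * rng.standard_normal((K, M))) / np.sqrt(2)
noise = 0.01 * (rng.standard_normal((L, M)) + 1j * rng.standard_normal((L, M)))
Y = S @ H + noise                                   # received preamble signal

def somp(Y, S, K):
    """Greedy row-sparse recovery of H from Y = S H + N (simultaneous OMP)."""
    residual, support = Y.copy(), []
    for _ in range(K):
        # Correlate every preamble with the residual across all antennas.
        corr = np.linalg.norm(S.conj().T @ residual, axis=1)
        corr[support] = 0                           # don't pick a device twice
        support.append(int(np.argmax(corr)))
        Ssub = S[:, support]
        # Least-squares channel estimate on the current support.
        H_est, *_ = np.linalg.lstsq(Ssub, Y, rcond=None)
        residual = Y - Ssub @ H_est
    H_full = np.zeros((S.shape[1], Y.shape[1]), dtype=complex)
    H_full[support] = H_est
    return sorted(support), H_full

detected, H_hat = somp(Y, S, K)
print(detected)                 # estimated active set (UAD)
print(sorted(active.tolist()))  # true active set
```

With short preambles (small L) and many devices, the correlation step becomes unreliable, which is exactly the regime where the paper's two-stage neural network, additionally exploiting the data signals, is claimed to outperform such compressive-sensing baselines.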