A Soft-GJETP Based Blind Identification Algorithm for Convolutional Encoder Parameters
Saved in:
Published in: | IEEE Transactions on Communications, 2022-12, Vol. 70 (12), p. 1-1 |
Main authors: | , , , , |
Format: | Article |
Language: | English |
Subjects: | |
Online access: | Order full text |
Summary: | In an adaptive or non-cooperative communication system, the blind identification of channel coding is an indispensable procedure for recovering the message from intercepted coded data. Owing to the wide application of convolutional codes, their blind identification has also received extensive study. In this paper, we consider the blind identification of convolutional encoder parameters in the general k/n-rate case. To improve on the identification accuracy of existing hard-decision-based solutions, a novel soft-information-based blind identification algorithm is designed. Specifically, a Soft Gaussian-Jordan Elimination Through Pivoting (Soft-GJETP) algorithm is first proposed to calculate the rank of the received data matrix. In contrast to the existing hard-decision-based GJETP algorithm, Soft-GJETP formulates the soft information of the received data, defines three reliability metrics, and derives updating rules for this soft information. As a result, the diagonals of the received data matrix contain fewer errors. Furthermore, a soft-decision strategy with a weight-dependent threshold is proposed; employing this strategy, dependent columns can be distinguished with higher accuracy. Finally, simulation results and comparisons are given to illustrate the performance of the proposed methods. |
ISSN: | 0090-6778, 1558-0857 |
DOI: | 10.1109/TCOMM.2022.3215994 |
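The abstract contrasts Soft-GJETP with its hard-decision baseline, in which the rank of the received data matrix is computed by Gauss-Jordan elimination over GF(2) and rank deficiency reveals dependent columns. The sketch below illustrates only that hard-decision baseline, not the paper's soft-information algorithm; the function name `gf2_rank` and the sample matrix are invented for illustration.

```python
import numpy as np

def gf2_rank(bits: np.ndarray) -> int:
    """Rank of a binary matrix over GF(2) via Gauss-Jordan elimination.

    Hard-decision analogue of the pivoting step that Soft-GJETP
    replaces with soft-information updates (illustrative only).
    """
    m = bits.copy() % 2
    rows, cols = m.shape
    rank = 0
    for col in range(cols):
        # Find a pivot row with a 1 in this column.
        pivot = next((r for r in range(rank, rows) if m[r, col]), None)
        if pivot is None:
            continue  # dependent column: no pivot available
        m[[rank, pivot]] = m[[pivot, rank]]  # move pivot row up
        # Eliminate this column in every other row (Jordan step, mod 2).
        for r in range(rows):
            if r != rank and m[r, col]:
                m[r] ^= m[rank]
        rank += 1
        if rank == rows:
            break
    return rank

# Example: the last column is the XOR of the first two, mimicking the
# parity-induced rank deficiency exploited in blind identification.
M = np.array([[1, 0, 1, 1],
              [0, 1, 0, 1],
              [1, 1, 1, 0]], dtype=np.uint8)
print(gf2_rank(M))  # → 2
```

With noisy hard decisions, a single flipped bit can destroy this rank deficiency, which is the motivation the abstract gives for pivoting on reliability metrics instead.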