CrossMPT: Cross-attention Message-Passing Transformer for Error Correcting Codes
Saved in:
Main Authors: | |
Format: | Article |
Language: | English |
Subjects: | |
Online Access: | Request full text |
Summary: | Error correcting codes (ECCs) are indispensable for reliable transmission in communication systems. Recent advancements in deep learning have catalyzed the exploration of ECC decoders based on neural networks. Among these, transformer-based neural decoders have achieved state-of-the-art decoding performance. In this paper, we propose a novel Cross-attention Message-Passing Transformer (CrossMPT), which shares key operational principles with conventional message-passing decoders. While conventional transformer-based decoders employ a self-attention mechanism without distinguishing between the types of input vectors (i.e., magnitude and syndrome vectors), CrossMPT updates the two types of input vectors separately and iteratively using two masked cross-attention blocks. The mask matrices are determined by the code's parity-check matrix, which explicitly identifies the irrelevant relationships between the two input vectors. Our experimental results show that CrossMPT significantly outperforms existing neural network-based decoders for various code classes. Notably, CrossMPT achieves this decoding performance improvement while significantly reducing memory usage, complexity, inference time, and training time. |
DOI: | 10.48550/arxiv.2405.01033 |
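The abstract outlines the core mechanism: the magnitude and syndrome vectors are updated alternately through cross-attention blocks whose masks are derived from the code's parity-check matrix H, so that attention is permitted only between a code bit and the checks it participates in. Below is a minimal PyTorch sketch of one such masked cross-attention iteration, under stated assumptions: the function and variable names, the residual updates, and the toy (7,4) Hamming code are illustrative choices, and the projections, multi-head structure, and feed-forward layers of the actual model are omitted.

```python
import torch
import torch.nn.functional as F

def masked_cross_attention(q, k, v, mask):
    """Scaled dot-product cross-attention; positions where mask == 0
    (bit/check pairs not connected in the parity-check matrix) are
    excluded from attention. A hypothetical helper, not the paper's API."""
    d = q.size(-1)
    scores = q @ k.transpose(-2, -1) / d ** 0.5      # (B, Lq, Lk)
    scores = scores.masked_fill(mask == 0, float("-inf"))
    return F.softmax(scores, dim=-1) @ v             # (B, Lq, d)

# Toy parity-check matrix H of the (7,4) Hamming code, chosen only
# for illustration; any code's H works the same way.
H = torch.tensor([[1, 0, 1, 0, 1, 0, 1],
                  [0, 1, 1, 0, 0, 1, 1],
                  [0, 0, 0, 1, 1, 1, 1]], dtype=torch.float32)

B, d = 2, 16
mag = torch.randn(B, H.size(1), d)  # one embedding per code bit (magnitudes)
syn = torch.randn(B, H.size(0), d)  # one embedding per parity check (syndrome)

# One cross-attention "message-passing" iteration: bits attend to the
# checks they belong to (mask H^T), then checks attend to their bits (mask H).
mag = mag + masked_cross_attention(mag, syn, syn, H.t())
syn = syn + masked_cross_attention(syn, mag, mag, H)
```

Because the score for every unconnected bit/check pair is forced to negative infinity, the softmax assigns it zero weight, mirroring how messages in belief-propagation decoding flow only along the edges of the code's Tanner graph.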