Factor Graph Optimization of Error-Correcting Codes for Belief Propagation Decoding
Format: Article
Language: English
Online access: Order full text
Summary: The design of optimal linear block codes capable of being efficiently decoded is of major concern, especially for short block lengths. As near capacity-approaching codes, Low-Density Parity-Check (LDPC) codes possess several advantages over other families of codes, the most notable being their efficient decoding via Belief Propagation. While many LDPC code design methods exist, the development of efficient sparse codes that meet the constraints of modern short code lengths and accommodate new channel models remains a challenge. In this work, we propose for the first time a gradient-based data-driven approach for the design of sparse codes. We develop codes that are locally optimal with respect to Belief Propagation decoding by learning the factor graph under simulated channel noise. This is performed via a novel complete-graph tensor representation of the Belief Propagation algorithm, optimized over finite fields via backpropagation and coupled with an efficient line-search method. The proposed approach is shown to outperform the decoding performance of existing popular codes by orders of magnitude and demonstrates the power of data-driven approaches for code design.
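The abstract only outlines the method at a high level. Purely as an illustrative sketch, and not the authors' implementation, the snippet below shows how a Belief Propagation style decoder can be made differentiable with respect to a relaxed parity-check matrix and trained by backpropagation under simulated channel noise. The sigmoid relaxation, the simplified sum-product update, the code sizes, and the Adam optimizer are all assumptions introduced here; the paper's finite-field optimization and line-search step are not reproduced.

```python
# Illustrative sketch only, not the authors' code. It demonstrates the general
# idea of differentiating a Belief Propagation style decoder with respect to a
# soft (relaxed) parity-check matrix under simulated AWGN channel noise.
# All names, sizes, and the relaxation below are assumptions; the paper's
# finite-field optimization and line-search method are omitted.
import torch

n, k, bp_iters = 16, 8, 5          # code length, dimension, BP iterations
m = n - k                          # number of parity checks
W = torch.randn(m, n, requires_grad=True)   # logits of a soft factor graph

def soft_bp_decode(llr, A, iters):
    """Simplified sum-product decoding on a dense graph whose edges are
    weighted by the soft adjacency A in [0, 1] (A plays the role of H)."""
    v2c = llr.unsqueeze(1).expand(-1, A.shape[0], -1).clone()   # (B, m, n)
    c2v = torch.zeros_like(v2c)
    for _ in range(iters):
        # Check-to-variable update (tanh rule), masked by the soft adjacency:
        # edges with A ~ 0 contribute a neutral factor of 1 to the product.
        t = 1 - A * (1 - torch.tanh(0.5 * v2c))
        prod = t.prod(dim=2, keepdim=True) / (t + 1e-9)         # leave-one-out
        c2v = 2 * torch.atanh(torch.clamp(A * prod, -0.999, 0.999))
        # Variable-to-check update: channel LLR plus extrinsic messages.
        v2c = llr.unsqueeze(1) + c2v.sum(dim=1, keepdim=True) - c2v
    return llr + c2v.sum(dim=1)    # posterior LLRs (positive means bit 0)

opt = torch.optim.Adam([W], lr=1e-2)
sigma = 0.7                        # AWGN noise standard deviation
for step in range(200):
    A = torch.sigmoid(W)           # soft (relaxed) parity-check matrix
    x = torch.zeros(64, n)         # all-zero codeword; BPSK maps bit 0 -> +1
    y = (1 - 2 * x) + sigma * torch.randn_like(x)
    llr = 2 * y / sigma ** 2       # channel log-likelihood ratios
    post = soft_bp_decode(llr, A, bp_iters)
    # Recover the transmitted bits, with a penalty encouraging a sparse graph.
    loss = torch.nn.functional.binary_cross_entropy_with_logits(post, 1 - x) \
           + 1e-2 * A.mean()
    opt.zero_grad()
    loss.backward()
    opt.step()
```

In such a relaxation one would typically threshold A to a binary parity-check matrix after training; the paper instead describes optimizing directly over finite fields with a line search, which this sketch does not attempt to reproduce.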
DOI: 10.48550/arxiv.2406.12900