Using List Decoding to Improve the Finite-Length Performance of Sparse Regression Codes

We consider sparse superposition codes (SPARCs) over complex AWGN channels. Such codes can be efficiently decoded by an approximate message passing (AMP) decoder, whose performance can be predicted via so-called state evolution in the large-system limit. In this paper, we mainly focus on how to use...
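To make the encoding model concrete, here is a minimal sketch of SPARC encoding over a complex AWGN channel in Python/NumPy. It is illustrative only: it assumes a flat power allocation and an i.i.d. complex Gaussian design matrix, whereas the paper works with DFT- or circulant-based design matrices, and all dimensions and power values below are toy choices.

    import numpy as np

    # Minimal SPARC encoding sketch over a complex AWGN channel (illustrative only).
    # The paper uses DFT- or circulant-based design matrices; here A is i.i.d. complex
    # Gaussian, and all sizes/powers are toy values chosen for this example.
    L_sec, M, n = 32, 64, 256            # number of sections, section size, code length
    P, sigma2 = 1.0, 0.25                # average codeword power and noise variance

    rng = np.random.default_rng(0)

    # Design matrix A of size n x (L_sec * M), entries ~ CN(0, 1/n)
    A = (rng.normal(size=(n, L_sec * M)) + 1j * rng.normal(size=(n, L_sec * M))) / np.sqrt(2 * n)

    # Message: one column index per section -> section-sparse vector beta
    idx = rng.integers(0, M, size=L_sec)
    beta = np.zeros(L_sec * M, dtype=complex)
    beta[np.arange(L_sec) * M + idx] = np.sqrt(n * P / L_sec)   # flat power allocation

    # Codeword and channel output
    x = A @ beta
    noise = np.sqrt(sigma2 / 2) * (rng.normal(size=n) + 1j * rng.normal(size=n))
    y = x + noise

An AMP decoder would then iteratively estimate beta from y and A, with state evolution predicting the per-iteration estimation error in the large-system limit.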

Bibliographic Details
Main Authors: Cao, Haiwen, Vontobel, Pascal O
Format: Article
Language: eng
Subjects: Computer Science - Information Theory; Mathematics - Information Theory
Online Access: Order full text
container_end_page
container_issue
container_start_page
container_title
container_volume
creator Cao, Haiwen
Vontobel, Pascal O
description We consider sparse superposition codes (SPARCs) over complex AWGN channels. Such codes can be decoded efficiently by an approximate message passing (AMP) decoder, whose performance can be predicted via so-called state evolution in the large-system limit. In this paper, we focus on concatenating SPARCs with cyclic redundancy check (CRC) codes on the encoding side and using list decoding on the decoding side to improve the finite-length performance of the AMP decoder for SPARCs over complex AWGN channels. Simulation results show that this concatenated coding scheme performs much better than SPARCs with the original AMP decoder and yields a steep, waterfall-like drop in the bit-error-rate curves. Furthermore, we apply the proposed concatenated coding scheme to spatially coupled SPARCs. In addition, we introduce a novel class of design matrices, i.e., matrices that describe the encoding process, based on circulant matrices derived from Frank or from Milewski sequences. This class of design matrices has encoding and decoding complexity comparable to, and performance very close to, that of the commonly used class of design matrices based on discrete Fourier transform (DFT) matrices, but gives more degrees of freedom when designing SPARCs for various applications.
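The CRC-aided list-decoding step described above can be illustrated with the following sketch. It is not the paper's exact procedure: it assumes the AMP decoder's final estimate beta_hat is available as per-section scores, builds a small candidate list by also trying the runner-up column in the least reliable sections, and keeps the first candidate whose CRC checks out. The helper payload_from_indices (mapping section indices back to message bytes plus the transmitted CRC value) is hypothetical and stands in for the inverse of whatever encoder mapping is used; max_unreliable is likewise an assumed list-size parameter.

    import zlib
    from itertools import product
    import numpy as np

    def crc_list_decode(beta_hat, L_sec, M, payload_from_indices, max_unreliable=3):
        """CRC-aided list decoding sketch (illustrative, not the paper's exact procedure)."""
        scores = np.abs(beta_hat).reshape(L_sec, M)
        order = np.argsort(scores, axis=1)
        best, second = order[:, -1], order[:, -2]             # top-1 and runner-up per section
        rows = np.arange(L_sec)
        margin = scores[rows, best] - scores[rows, second]    # decision reliability per section

        # Candidate list: try the runner-up in every subset of the least reliable sections.
        unreliable = np.argsort(margin)[:max_unreliable]
        candidates = []
        for flips in product([False, True], repeat=len(unreliable)):
            cand = best.copy()
            for sec, flip in zip(unreliable, flips):
                if flip:
                    cand[sec] = second[sec]
            candidates.append(cand)

        # Return the first candidate whose CRC matches; otherwise fall back to the AMP decision.
        for cand in candidates:
            msg_bytes, crc = payload_from_indices(tuple(int(i) for i in cand))
            if (zlib.crc32(msg_bytes) & 0xFFFFFFFF) == crc:
                return cand
        return best

The circulant design matrices mentioned in the description can be derived from a Frank sequence: a Frank sequence of length L**2 has entries exp(2j*pi*q*r/L) for q, r = 0, ..., L-1. The sketch below (an assumed construction for illustration, not the paper's exact recipe) builds such a sequence and the corresponding circulant matrix, from which rows and columns could be sub-sampled analogously to DFT-based designs.

    def frank_sequence(L):
        """Frank sequence of length L**2: c[q*L + r] = exp(2j*pi*q*r/L)."""
        q, r = np.meshgrid(np.arange(L), np.arange(L), indexing="ij")
        return np.exp(2j * np.pi * q * r / L).reshape(-1)

    def circulant(c):
        """Circulant matrix whose first column is c, i.e. C[i, j] = c[(i - j) mod len(c)]."""
        return np.array([np.roll(c, j) for j in range(len(c))]).T

    C = circulant(frank_sequence(8))      # a 64 x 64 circulant matrix from a Frank sequence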
doi_str_mv 10.48550/arxiv.2011.00224
format Article
fulltext fulltext_linktorsrc
identifier DOI: 10.48550/arxiv.2011.00224
ispartof
issn
language eng
recordid cdi_arxiv_primary_2011_00224
source arXiv.org
subjects Computer Science - Information Theory
Mathematics - Information Theory
title Using List Decoding to Improve the Finite-Length Performance of Sparse Regression Codes
url https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2024-12-29T13%3A54%3A24IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-arxiv_GOX&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Using%20List%20Decoding%20to%20Improve%20the%20Finite-Length%20Performance%20of%20Sparse%20Regression%20Codes&rft.au=Cao,%20Haiwen&rft.date=2020-10-31&rft_id=info:doi/10.48550/arxiv.2011.00224&rft_dat=%3Carxiv_GOX%3E2011_00224%3C/arxiv_GOX%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_id=info:pmid/&rfr_iscdi=true