Neural Communication Systems with Bandwidth-limited Channel

Reliably transmitting messages despite information loss due to a noisy channel is a core problem of information theory. One of the most important aspects of real-world communication, e.g., via WiFi, is that it may happen at varying levels of information transfer. The bandwidth-limited channel models this phenomenon.

Detailed description

Saved in:
Bibliographic details
Main authors: Ullrich, Karen, Viola, Fabio, Rezende, Danilo Jimenez
Format: Article
Language: eng
Subjects:
Online access: Order full text
creator Ullrich, Karen; Viola, Fabio; Rezende, Danilo Jimenez
description Reliably transmitting messages despite information loss due to a noisy channel is a core problem of information theory. One of the most important aspects of real-world communication, e.g., via WiFi, is that it may happen at varying levels of information transfer. The bandwidth-limited channel models this phenomenon. In this study we consider learning coding with the bandwidth-limited channel (BWLC). Recently, neural communication models such as variational autoencoders have been studied for the task of source compression. We build upon this work by studying neural communication systems with the BWLC. Specifically, we find three modelling choices that are relevant under expected information loss. First, instead of separating the sub-tasks of compression (source coding) and error correction (channel coding), we propose to model both jointly. Framing the problem as a variational learning problem, we conclude that joint systems outperform their separate counterparts when coding is performed by flexible learnable function approximators such as neural networks. To facilitate learning, we introduce a differentiable and computationally efficient version of the bandwidth-limited channel. Second, we propose a design to model missing information with a prior, and incorporate this into the channel model. Finally, sampling from the joint model is improved by introducing auxiliary latent variables in the decoder. Experimental results justify the validity of our design decisions through improved distortion and FID scores.
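The bandwidth-limited channel described in the abstract can be sketched as follows: only a prefix of each latent code is transmitted, and the receiver fills the untransmitted coordinates from a prior. This is an illustrative simplification, not the paper's implementation; the function name, the fixed bandwidth, and the standard-normal prior fill are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def bandwidth_limited_channel(z, bandwidth, rng):
    """Transmit only the first `bandwidth` coordinates of each latent code z.

    Untransmitted coordinates are replaced by samples from a standard normal
    prior, standing in for the learned prior over missing information that
    the paper incorporates into its channel model.
    """
    d = z.shape[-1]
    mask = (np.arange(d) < bandwidth).astype(z.dtype)  # 1 = transmitted
    prior_fill = rng.standard_normal(z.shape)          # prior samples for lost dims
    # Masking keeps the mapping differentiable w.r.t. the transmitted part of z.
    return mask * z + (1.0 - mask) * prior_fill

z = rng.standard_normal((4, 8))           # 4 latent codes of dimension 8
y = bandwidth_limited_channel(z, 5, rng)  # only the first 5 dims survive
```

Because the channel is a simple elementwise mask, gradients flow through the transmitted coordinates, which is the property needed to train encoder and decoder jointly end to end.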
doi_str_mv 10.48550/arxiv.2003.13367
format Article
creationdate 2020-03-30
rights http://creativecommons.org/licenses/by/4.0
fulltext fulltext_linktorsrc
identifier DOI: 10.48550/arxiv.2003.13367
language eng
recordid cdi_arxiv_primary_2003_13367
source arXiv.org
subjects Computer Science - Information Theory
Computer Science - Learning
Mathematics - Information Theory
Statistics - Machine Learning
title Neural Communication Systems with Bandwidth-limited Channel
url https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-02-02T16%3A43%3A34IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-arxiv_GOX&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Neural%20Communication%20Systems%20with%20Bandwidth-limited%20Channel&rft.au=Ullrich,%20Karen&rft.date=2020-03-30&rft_id=info:doi/10.48550/arxiv.2003.13367&rft_dat=%3Carxiv_GOX%3E2003_13367%3C/arxiv_GOX%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_id=info:pmid/&rfr_iscdi=true