Distributed Deep Convolutional Compression for Massive MIMO CSI Feedback
Massive multiple-input multiple-output (MIMO) systems require downlink channel state information (CSI) at the base station (BS) to achieve spatial diversity and multiplexing gains. In a frequency division duplex (FDD) multiuser massive MIMO network, each user needs to compress and feed back its downlink CSI to the BS.
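The full description field below explains that DeepCMC couples a fully-convolutional encoder with quantization and entropy coding, uses residual layers at the decoder, and is trained on a weighted rate-distortion cost that trades CSI reconstruction quality against feedback overhead. The following PyTorch sketch is only a rough illustration of that kind of objective under assumed details: the layer sizes, the `ToyCsiCodec` name, the straight-through quantizer, the L1 rate proxy, and the weight `lam` are all hypothetical and are not the paper's actual DeepCMC design.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class ToyCsiCodec(nn.Module):
    """Illustrative CSI autoencoder sketch; not the paper's DeepCMC architecture."""

    def __init__(self, feedback_channels: int = 8):
        super().__init__()
        # Fully convolutional encoder: 2 input channels (real/imag parts of the
        # channel matrix) mapped to a small feature map that would be quantized
        # and entropy coded before being fed back to the BS.
        self.encoder = nn.Sequential(
            nn.Conv2d(2, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, feedback_channels, kernel_size=3, stride=2, padding=1),
        )
        # Decoder: upsampling followed by one residual refinement convolution,
        # standing in for the "residual layers at the decoder" the abstract mentions.
        self.upsample = nn.Sequential(
            nn.ConvTranspose2d(feedback_channels, 32, kernel_size=4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, 2, kernel_size=4, stride=2, padding=1),
        )
        self.residual = nn.Conv2d(2, 2, kernel_size=3, padding=1)

    def forward(self, csi: torch.Tensor):
        features = self.encoder(csi)
        # Uniform quantization with a straight-through gradient estimate
        # (a common trick; the abstract does not specify the paper's exact quantizer).
        quantized = features + (torch.round(features) - features).detach()
        coarse = self.upsample(quantized)
        return coarse + self.residual(coarse), quantized


def rate_distortion_loss(csi, reconstruction, quantized, lam=0.01):
    # Distortion term: CSI reconstruction error at the BS.
    distortion = F.mse_loss(reconstruction, csi)
    # Rate term: a crude proxy for the entropy-coded feedback length; a real
    # implementation would use a learned entropy model of the quantized symbols.
    rate = quantized.abs().mean()
    # lam weights feedback overhead against CSI quality.
    return distortion + lam * rate


# Hypothetical usage with an assumed batch of 2-channel 32x256 CSI matrices:
# model = ToyCsiCodec()
# csi = torch.randn(4, 2, 32, 256)
# recon, q = model(csi)
# loss = rate_distortion_loss(csi, recon, q)
```

In a sketch like this, increasing `lam` pushes training toward shorter feedback messages at the cost of CSI reconstruction quality, which is the trade-off the abstract attributes to the weighted rate-distortion cost.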
Saved in:
Published in: | IEEE transactions on wireless communications 2021-04, Vol.20 (4), p.2621-2633 |
---|---|
Main authors: | Mashhadi, Mahdi Boloursaz; Yang, Qianqian; Gunduz, Deniz |
Format: | Article |
Language: | eng |
Subjects: | |
Online access: | Order full text |
container_end_page | 2633 |
---|---|
container_issue | 4 |
container_start_page | 2621 |
container_title | IEEE transactions on wireless communications |
container_volume | 20 |
creator | Mashhadi, Mahdi Boloursaz ; Yang, Qianqian ; Gunduz, Deniz |
description | Massive multiple-input multiple-output (MIMO) systems require downlink channel state information (CSI) at the base station (BS) to achieve spatial diversity and multiplexing gains. In a frequency division duplex (FDD) multiuser massive MIMO network, each user needs to compress and feed back its downlink CSI to the BS. The CSI overhead scales with the number of antennas, users, and subcarriers, and becomes a major bottleneck for the overall spectral efficiency. In this paper, we propose a deep learning (DL)-based CSI compression scheme, called DeepCMC, composed of convolutional layers followed by quantization and entropy coding blocks. In comparison with previous DL-based CSI reduction structures, DeepCMC proposes a novel fully-convolutional neural network (NN) architecture, with residual layers at the decoder, and incorporates quantization and entropy coding blocks into its design. DeepCMC is trained to minimize a weighted rate-distortion cost, which enables a trade-off between the CSI quality and its feedback overhead. Simulation results demonstrate that DeepCMC outperforms state-of-the-art CSI compression schemes in terms of the reconstruction quality of CSI for the same compression rate. We also propose a distributed version of DeepCMC for a multi-user MIMO scenario to encode and reconstruct the CSI from multiple users in a distributed manner. Distributed DeepCMC not only utilizes the inherent CSI structures of a single MIMO user for compression, but also benefits from the correlations among the channel matrices of nearby users to further improve the performance in comparison with DeepCMC. We also propose a reduced-complexity training method for distributed DeepCMC, allowing it to scale to multiple users, and suggest a cluster-based distributed DeepCMC approach for practical implementation. |
doi_str_mv | 10.1109/TWC.2020.3043502 |
format | Article |
fulltext | fulltext_linktorsrc |
identifier | ISSN: 1536-1276 |
ispartof | IEEE transactions on wireless communications, 2021-04, Vol.20 (4), p.2621-2633 |
issn | 1536-1276 ; 1558-2248 |
language | eng |
recordid | cdi_crossref_primary_10_1109_TWC_2020_3043502 |
source | IEEE Electronic Library (IEL) |
subjects | Artificial neural networks ; Codes ; Coding ; Convolutional codes ; Correlation ; Downlink ; Downlinking ; Entropy ; Feedback ; Frequency division duplexing ; machine learning ; Massive MIMO ; Measurement ; MIMO (control systems) ; Multiple-input multiple-output (MIMO) ; Multiplexing ; Neural networks ; Quantization (signal) ; Spectral efficiency ; Subcarriers ; Training ; wireless communication |
title | Distributed Deep Convolutional Compression for Massive MIMO CSI Feedback |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-21T20%3A22%3A34IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_RIE&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Distributed%20Deep%20Convolutional%20Compression%20for%20Massive%20MIMO%20CSI%20Feedback&rft.jtitle=IEEE%20transactions%20on%20wireless%20communications&rft.au=Mashhadi,%20Mahdi%20Boloursaz&rft.date=2021-04&rft.volume=20&rft.issue=4&rft.spage=2621&rft.epage=2633&rft.pages=2621-2633&rft.issn=1536-1276&rft.eissn=1558-2248&rft.coden=ITWCAX&rft_id=info:doi/10.1109/TWC.2020.3043502&rft_dat=%3Cproquest_RIE%3E2512221090%3C/proquest_RIE%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2512221090&rft_id=info:pmid/&rft_ieee_id=9296555&rfr_iscdi=true |