Monotonic Decrease of the Non-Gaussianness of the Sum of Independent Random Variables: A Simple Proof

Artstein, Ball, Barthe, and Naor have recently shown that the non-Gaussianness (divergence with respect to a Gaussian random variable with identical first and second moments) of the sum of independent and identically distributed (i.i.d.) random variables is monotonically nonincreasing. We give a simplified proof using the relationship between non-Gaussianness and minimum mean-square error (MMSE) in Gaussian channels. As in Artstein et al., we also deal with the more general setting of nonidentically distributed random variables.
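
The abstract compresses several definitions; the following LaTeX sketch unpacks them in our own notation (the symbols D, h, mmse, and S_n are our labels chosen to match the standard I-MMSE literature, not quotations from the paper). For a random variable X with a density and variance \sigma^2, and G Gaussian with the same first and second moments:

    D(X) = D(P_X \| P_G) = h(G) - h(X) \ge 0,

and for zero-mean i.i.d. X_1, X_2, \ldots with S_n = n^{-1/2} \sum_{i=1}^{n} X_i, the theorem states that D(S_{n+1}) \le D(S_n). The I-MMSE relation of Guo, Shamai, and Verdu,

    \frac{d}{d\gamma}\, I\bigl(X; \sqrt{\gamma}\, X + N\bigr) = \tfrac{1}{2}\, \mathrm{mmse}(X, \gamma), \qquad N \sim \mathcal{N}(0,1) \text{ independent of } X,

leads to the integral representation

    D(X) = \frac{1}{2} \int_0^{\infty} \left( \frac{\sigma^2}{1 + \sigma^2 \gamma} - \mathrm{mmse}(X, \gamma) \right) d\gamma.

Since Gaussian inputs are hardest to estimate at every signal-to-noise ratio, the integrand is nonnegative; monotonicity of D(S_n) then reduces to showing that the MMSE of the normalized sums is nondecreasing in n.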

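As a quick numerical sanity check of the theorem (our own illustration, not code from the paper), the Python sketch below estimates D(S_n) = h(G) - h(S_n) for normalized sums of i.i.d. uniform random variables by convolving densities on a grid. The uniform example, the grid resolution, and the helper name diff_entropy are all our assumptions.

import numpy as np

dx = 0.001
half = np.sqrt(3.0)                       # Uniform(-sqrt(3), sqrt(3)) has unit variance
x = np.arange(-half, half, dx)
f = np.ones_like(x) / (2 * half)          # density of a single summand

def diff_entropy(density, step):
    """Differential entropy -sum f log f dx on a uniform grid, skipping zero bins."""
    p = density[density > 1e-300]
    return -np.sum(p * np.log(p)) * step

h_gauss = 0.5 * np.log(2 * np.pi * np.e)  # entropy of N(0,1), the matched Gaussian

f_sum = f.copy()
for n in range(1, 7):
    if n > 1:
        f_sum = np.convolve(f_sum, f) * dx    # density of X_1 + ... + X_n
    # S_n = (X_1 + ... + X_n)/sqrt(n): the change of variables scales the
    # density values by sqrt(n) and the grid spacing by 1/sqrt(n).
    h_Sn = diff_entropy(np.sqrt(n) * f_sum, dx / np.sqrt(n))
    print(f"n={n}: D(S_n) = {h_gauss - h_Sn:.6f} nats")

Running this prints a strictly decreasing sequence (roughly 0.176 nats at n=1, shrinking toward 0), consistent with the monotone approach to Gaussianity that the paper proves.
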
Bibliographic Details
Published in: IEEE Transactions on Information Theory, 2006-09, Vol. 52 (9), pp. 4295-4297
Main authors: Tulino, A.M.; Verdu, S.
Format: Article
Language: English
DOI: 10.1109/TIT.2006.880066
ISSN: 0018-9448; EISSN: 1557-9654
Source: IEEE Electronic Library (IEL)
Subjects:
Applied sciences
Central limit theorem
Channels
Convolution
differential entropy
Divergence
Entropy
entropy power inequality
Errors
Exact sciences and technology
Fasteners
Functional analysis
Gaussian
Gaussian channels
Information theory
Information, signal and communications theory
minimum mean-square error (MMSE)
non-Gaussianness
Power measurement
Proving
Random variables
relative entropy
Signal to noise ratio
Telecommunications and information theory