Improved lower bound for the mutual information between signal and neural spike count
The mutual information between a stimulus signal and the spike count of a stochastic neuron is in many cases difficult to determine. Therefore, it is often approximated by a lower bound formula that involves linear correlations between input and output only. Here, we improve the linear lower bound for the mutual information by incorporating nonlinear correlations. For the special case of a Gaussian output variable with nonlinear signal dependencies of mean and variance we also derive an exact integral formula for the full mutual information. In our numerical analysis, we first compare the linear and nonlinear lower bounds and the exact integral formula for two different Gaussian models and show under which conditions the nonlinear lower bound provides a significant improvement to the linear approximation. We then inspect two neuron models, the leaky integrate-and-fire model with white Gaussian noise and the Na–K model with channel noise. We show that for certain firing regimes and for intermediate signal strengths the nonlinear lower bound can provide a substantial improvement compared to the linear lower bound. Our results demonstrate the importance of nonlinear input–output correlations for neural information transmission and provide a simple nonlinear approximation for the mutual information that can be applied to more complicated neuron models as well as to experimental data.
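The correlation-based lower bound discussed in the abstract can be illustrated on a toy Gaussian channel. The sketch below is illustrative only: the quadratic channel, its coefficients, and all variable names are assumptions for the example, not the paper's actual neuron models or formulas. It estimates the classic linear bound I(x; y) ≥ -½ ln(1 - ρ²) and a simple "nonlinear" variant that augments the signal with a polynomial feature, in the spirit of incorporating nonlinear input-output correlations:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Toy channel with a quadratic nonlinearity (hypothetical example, not the
# paper's LIF or Na-K models): for zero-mean Gaussian x, corr(x, x^2) = 0,
# so a purely linear bound misses the quadratic contribution entirely.
x = rng.normal(0.0, 1.0, n)
y = x + 0.8 * x**2 + rng.normal(0.0, 1.0, n)

# Linear lower bound (in nats): I(x; y) >= -1/2 ln(1 - rho^2)
rho = np.corrcoef(x, y)[0, 1]
I_lin = -0.5 * np.log(1.0 - rho**2)

# Nonlinear variant: multiple correlation R of y on the features (x, x^2).
# Since R^2 >= rho^2 by construction, this bound is never worse.
F = np.column_stack([np.ones(n), x, x**2])
beta, *_ = np.linalg.lstsq(F, y, rcond=None)
R2 = 1.0 - np.var(y - F @ beta) / np.var(y)
I_nl = -0.5 * np.log(1.0 - R2)

print(f"linear bound: {I_lin:.3f} nats, nonlinear bound: {I_nl:.3f} nats")
```

Here the quadratic term carries signal information that is invisible to the linear correlation, so the augmented bound comes out noticeably larger; this mirrors, in highly simplified form, the regimes in which the nonlinear lower bound improves on the linear one.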
Published in: Biological cybernetics, 2018-12, Vol. 112 (6), p. 523-538
Main authors: Voronenko, Sergej O.; Lindner, Benjamin
Format: Article
Language: English
Online access: Full text
container_end_page | 538 |
---|---|
container_issue | 6 |
container_start_page | 523 |
container_title | Biological cybernetics |
container_volume | 112 |
creator | Voronenko, Sergej O.; Lindner, Benjamin |
doi_str_mv | 10.1007/s00422-018-0779-5 |
format | Article |
publisher | Springer Berlin Heidelberg (Berlin/Heidelberg) |
pmid | 30155699 |
rights | Springer-Verlag GmbH Germany, part of Springer Nature 2018 |
fulltext | fulltext |
identifier | ISSN: 0340-1200 |
ispartof | Biological cybernetics, 2018-12, Vol.112 (6), p.523-538 |
issn | 0340-1200; 1432-0770 |
language | eng |
recordid | cdi_proquest_miscellaneous_2096550919 |
source | SpringerLink Journals - AutoHoldings |
subjects | Approximation; Bioinformatics; Biomedical and Life Sciences; Biomedicine; Channel noise; Communication channels; Complex Systems; Computer Appl. in Life Sciences; Correlation; Firing pattern; Information processing; Integrals; Lower bounds; Mathematical analysis; Mathematical models; Neurobiology; Neurosciences; Numerical analysis; Original Article; Random noise |
title | Improved lower bound for the mutual information between signal and neural spike count |