Redistribution of Synaptic Efficacy Supports Stable Pattern Learning in Neural Networks

Markram and Tsodyks, by showing that the elevated synaptic efficacy observed with single-pulse long-term potentiation (LTP) measurements disappears with higher-frequency test pulses, have critically challenged the conventional assumption that LTP reflects a general gain increase. This observed change in frequency dependence during synaptic potentiation is called redistribution of synaptic efficacy (RSE). RSE is here seen as the local realization of a global design principle in a neural network for pattern coding. The underlying computational model posits an adaptive threshold rather than a multiplicative weight as the elementary unit of long-term memory. A distributed instar learning law allows thresholds to increase only monotonically, but adaptation has a bidirectional effect on the model postsynaptic potential. At each synapse, threshold increases implement pattern selectivity via a frequency-dependent signal component, while a complementary frequency-independent component nonspecifically strengthens the path. This synaptic balance produces changes in frequency dependence that are robustly similar to those observed by Markram and Tsodyks. The network design therefore suggests a functional purpose for RSE, which, by helping to bound total memory change, supports a distributed coding scheme that is stable with fast as well as slow learning. Multiplicative weights have served as a cornerstone for models of physiological data and neural systems for decades. Although the model discussed here does not implement detailed physiology of synaptic transmission, its new learning laws operate in a network architecture that suggests how recently discovered synaptic computations such as RSE may help produce new network capabilities such as learning that is fast, stable, and distributed.
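The abstract's central mechanism — an adaptive threshold that may only increase, feeding a frequency-independent strengthening component and a frequency-dependent selective component — can be caricatured in a few lines of Python. The functional forms and constants below are invented purely for illustration; they are not the paper's actual learning laws:

```python
def epsp(rate, threshold, base_gain=1.0):
    """Toy postsynaptic response to a test-pulse train at the given rate.

    Two components, echoing the abstract's description:
      - a frequency-independent term that grows with the adaptive
        threshold (nonspecific path strengthening);
      - a frequency-dependent term that attenuates at high pulse rates,
        so threshold growth sharpens low-frequency selectivity.
    These expressions are an illustrative sketch, not the model's
    published equations.
    """
    nonspecific = base_gain + 0.5 * threshold    # strengthens with learning
    depletion = 1.0 / (1.0 + rate * threshold)   # attenuates with pulse rate
    return nonspecific * depletion

# "Learning" raises the threshold, which changes only monotonically.
theta_before, theta_after = 0.2, 1.0

# Single-pulse (near-zero-rate) efficacy is potentiated by learning...
assert epsp(0.0, theta_after) > epsp(0.0, theta_before)

# ...but the high-rate response is not: the apparent gain increase is
# redistributed toward low frequencies, qualitatively as in RSE.
assert epsp(5.0, theta_after) < epsp(5.0, theta_before)
```

The two assertions mirror the Markram–Tsodyks observation the abstract starts from: potentiation measured with single pulses does not carry over to high-frequency test trains, because the change is a redistribution of frequency dependence rather than a uniform gain increase.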


Bibliographic Details
Published in: Neural computation, 2002-04, Vol. 14 (4), p. 873-888
Main authors: Carpenter, Gail A.; Milenova, Boriana L.
Format: Article
Language: English
Online access: Full text
DOI: 10.1162/089976602317318992
Publisher: MIT Press, Cambridge, MA
PMID: 11936965
ISSN: 0899-7667
EISSN: 1530-888X
Source: MEDLINE; MIT Press Journals
Subjects:
Algorithms
Applied sciences
Artificial Intelligence
Biological and medical sciences
Computer science; control theory; systems
Exact sciences and technology
Fundamental and applied biological sciences. Psychology
General aspects
Learning and adaptive systems
Long-Term Potentiation
Mathematics in biology. Statistical analysis. Models. Metrology. Data processing in biology (general aspects)
Models, Neurological
Neural Networks (Computer)
Pattern Recognition, Automated
Receptors, Presynaptic - physiology
Synapses - physiology