Recurrent predictive coding models for associative memory employing covariance learning
The computational principles adopted by the hippocampus in associative memory (AM) tasks have been one of the most studied topics in computational and theoretical neuroscience. Recent theories suggested that AM and the predictive activities of the hippocampus could be described within a unitary account, and that predictive coding underlies the computations supporting AM in the hippocampus. Following this theory, a computational model based on classical hierarchical predictive networks was proposed and was shown to perform well in various AM tasks. However, this fully hierarchical model did not incorporate recurrent connections, an architectural component of the CA3 region of the hippocampus that is crucial for AM. This makes the structure of the model inconsistent with the known connectivity of CA3 and classical recurrent models such as Hopfield Networks, which learn the covariance of inputs through their recurrent connections to perform AM. Earlier PC models that learn the covariance information of inputs explicitly via recurrent connections seem to be a solution to these issues. Here, we show that although these models can perform AM, they do it in an implausible and numerically unstable way. Instead, we propose alternatives to these earlier covariance-learning predictive coding networks, which learn the covariance information implicitly and plausibly, and can use dendritic structures to encode prediction errors. We show analytically that our proposed models are perfectly equivalent to the earlier predictive coding model learning covariance explicitly, and encounter no numerical issues when performing AM tasks in practice. We further show that our models can be combined with hierarchical predictive coding networks to model the hippocampo-neocortical interactions. Our models provide a biologically plausible approach to modelling the hippocampal network, pointing to a potential computational mechanism during hippocampal memory formation and recall, which employs both predictive coding and covariance learning based on the recurrent network structure of the hippocampus.
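As context for the covariance-based recall mechanism the abstract refers to, the following is a minimal sketch of a classical Hopfield-style associative memory: patterns are stored in a covariance (outer-product) weight matrix over recurrent connections and retrieved by iterating the recurrent dynamics from a corrupted cue. This illustrates only the classical mechanism the abstract contrasts with, not the paper's predictive coding model; all function names here are illustrative.

```python
import numpy as np

def store(patterns):
    """Learn recurrent weights as the covariance (outer product) of stored +/-1 patterns."""
    n = patterns.shape[1]
    W = patterns.T @ patterns / n   # Hebbian/covariance learning rule
    np.fill_diagonal(W, 0)          # no self-connections
    return W

def recall(W, probe, steps=20):
    """Iteratively pass the state through the recurrent connections until it settles."""
    s = probe.copy()
    for _ in range(steps):
        s = np.sign(W @ s)
        s[s == 0] = 1               # break ties toward +1
    return s

# Store two random +/-1 patterns, then recall the first from a corrupted cue.
rng = np.random.default_rng(0)
patterns = rng.choice([-1, 1], size=(2, 64))
W = store(patterns)

cue = patterns[0].copy()
cue[:10] *= -1                      # flip 10 of 64 bits to corrupt the cue
recovered = recall(W, cue)
print((recovered == patterns[0]).mean())  # fraction of bits recovered
```

With only two stored patterns the network operates far below Hopfield capacity, so the recurrent dynamics typically restore the corrupted bits to the stored pattern.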
Saved in:
Published in: | PLoS computational biology 2023-04, Vol.19 (4), p.e1010719 |
---|---|
Main authors: | Tang, Mufeng ; Salvatori, Tommaso ; Millidge, Beren ; Song, Yuhang ; Lukasiewicz, Thomas ; Bogacz, Rafal |
Format: | Article |
Language: | eng |
Subjects: | |
Online access: | Full text |
container_end_page | |
---|---|
container_issue | 4 |
container_start_page | e1010719 |
container_title | PLoS computational biology |
container_volume | 19 |
creator | Tang, Mufeng ; Salvatori, Tommaso ; Millidge, Beren ; Song, Yuhang ; Lukasiewicz, Thomas ; Bogacz, Rafal |
description | The computational principles adopted by the hippocampus in associative memory (AM) tasks have been one of the most studied topics in computational and theoretical neuroscience. Recent theories suggested that AM and the predictive activities of the hippocampus could be described within a unitary account, and that predictive coding underlies the computations supporting AM in the hippocampus. Following this theory, a computational model based on classical hierarchical predictive networks was proposed and was shown to perform well in various AM tasks. However, this fully hierarchical model did not incorporate recurrent connections, an architectural component of the CA3 region of the hippocampus that is crucial for AM. This makes the structure of the model inconsistent with the known connectivity of CA3 and classical recurrent models such as Hopfield Networks, which learn the covariance of inputs through their recurrent connections to perform AM. Earlier PC models that learn the covariance information of inputs explicitly via recurrent connections seem to be a solution to these issues. Here, we show that although these models can perform AM, they do it in an implausible and numerically unstable way. Instead, we propose alternatives to these earlier covariance-learning predictive coding networks, which learn the covariance information implicitly and plausibly, and can use dendritic structures to encode prediction errors. We show analytically that our proposed models are perfectly equivalent to the earlier predictive coding model learning covariance explicitly, and encounter no numerical issues when performing AM tasks in practice. We further show that our models can be combined with hierarchical predictive coding networks to model the hippocampo-neocortical interactions. 
Our models provide a biologically plausible approach to modelling the hippocampal network, pointing to a potential computational mechanism during hippocampal memory formation and recall, which employs both predictive coding and covariance learning based on the recurrent network structure of the hippocampus. |
doi_str_mv | 10.1371/journal.pcbi.1010719 |
format | Article |
contributor | Wei, Xue-Xin |
publisher | United States: Public Library of Science |
pmid | 37058541 |
eissn | 1553-7358 |
rights | © 2023 Tang et al. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited. |
orcidid | 0000-0002-7999-0291 ; 0000-0002-8994-1661 |
oa | free_for_read |
fulltext | fulltext |
identifier | ISSN: 1553-7358 |
ispartof | PLoS computational biology, 2023-04, Vol.19 (4), p.e1010719 |
issn | 1553-7358 1553-734X 1553-7358 |
language | eng |
recordid | cdi_plos_journals_2814443995 |
source | MEDLINE; DOAJ Directory of Open Access Journals; Public Library of Science (PLoS); EZB-FREE-00999 freely available EZB journals; PubMed Central |
subjects | Analysis ; Associative learning ; Associative memory ; Associative networks (Memory) ; Biology and Life Sciences ; Coding ; Computational neuroscience ; Computer and Information Sciences ; Conditioning, Classical ; Covariance ; Dendritic structure ; Hippocampus ; Hippocampus (Brain) ; Learning ; Mathematical models ; Medicine and Health Sciences ; Memory ; Memory tasks ; Mental Recall ; Models, Neurological ; Nervous system ; Networks ; Neural coding ; Neural networks ; Neurons ; Neurosciences ; Physical Sciences ; Physiological aspects ; Social Sciences |
title | Recurrent predictive coding models for associative memory employing covariance learning |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-16T21%3A25%3A06IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-gale_plos_&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Recurrent%20predictive%20coding%20models%20for%20associative%20memory%20employing%20covariance%20learning&rft.jtitle=PLoS%20computational%20biology&rft.au=Tang,%20Mufeng&rft.date=2023-04-01&rft.volume=19&rft.issue=4&rft.spage=e1010719&rft.pages=e1010719-&rft.issn=1553-7358&rft.eissn=1553-7358&rft_id=info:doi/10.1371/journal.pcbi.1010719&rft_dat=%3Cgale_plos_%3EA748729406%3C/gale_plos_%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2814443995&rft_id=info:pmid/37058541&rft_galeid=A748729406&rft_doaj_id=oai_doaj_org_article_5b1279552e3d47a79088c231a52f2ace&rfr_iscdi=true |