Gated spiking neural network using Iterative Free-Energy Optimization and rank-order coding for structure learning in memory sequences (INFERNO GATE)
We present a framework based on iterative free-energy optimization with spiking neural networks for modeling the fronto-striatal system (PFC-BG) for the generation and recall of audio memory sequences. In line with neuroimaging studies carried out in the PFC, we propose a genuine coding strategy using the gain-modulation mechanism to represent abstract sequences based solely on the rank and location of items within them…
Saved in:
Published in: | Neural networks 2020-01, Vol.121, p.242-258 |
---|---|
Main authors: | Pitti, Alexandre; Quoy, Mathias; Lavandier, Catherine; Boucenna, Sofiane |
Format: | Article |
Language: | eng |
Subjects: | |
Online access: | Full text |
container_end_page | 258 |
---|---|
container_issue | |
container_start_page | 242 |
container_title | Neural networks |
container_volume | 121 |
creator | Pitti, Alexandre; Quoy, Mathias; Lavandier, Catherine; Boucenna, Sofiane |
description | We present a framework based on iterative free-energy optimization with spiking neural networks for modeling the fronto-striatal system (PFC-BG) for the generation and recall of audio memory sequences. In line with neuroimaging studies carried out in the PFC, we propose a genuine coding strategy using the gain-modulation mechanism to represent abstract sequences based solely on the rank and location of items within them. Based on this mechanism, we show that we can construct a repertoire of neurons sensitive to the temporal structure in sequences from which we can represent any novel sequences. Free-energy optimization is then used to explore and to retrieve the missing indices of the items in the correct order for executive control and compositionality. We show that the gain-modulation mechanism permits the network to be robust to variabilities and to have long-term dependencies as it implements a gated recurrent neural network. This model, called Inferno Gate, is an extension of the neural architecture Inferno standing for Iterative Free-Energy Optimization of Recurrent Neural Networks with Gating or Gain-modulation. In experiments performed with an audio database of ten thousand MFCC vectors, Inferno Gate is capable of efficiently encoding and retrieving chunks fifty items in length. We then discuss the potential of our network to model the features of working memory in the PFC-BG loop for structural learning, goal-direction and hierarchical reinforcement learning.
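The abstract's retrieval step — free-energy optimization used to "explore and retrieve the missing indices of the items in the correct order" — can be caricatured as a stochastic search that keeps only proposals lowering a prediction-error score. A minimal toy sketch, not the paper's spiking implementation; the function name, parameters, and the mismatch score used as a free-energy stand-in are all illustrative:

```python
import numpy as np

def retrieve(target, n_items, iters=2000, seed=0):
    """Greedy stochastic descent on a mismatch score, standing in for
    free-energy minimization: perturb one index of the current guess
    and keep the proposal only if it lowers the error."""
    rng = np.random.default_rng(seed)
    length = len(target)
    guess = rng.integers(n_items, size=length)      # random initial sequence
    energy = np.mean(guess != target)               # fraction of wrong items
    for _ in range(iters):
        cand = guess.copy()
        cand[rng.integers(length)] = rng.integers(n_items)  # perturb one slot
        e = np.mean(cand != target)
        if e < energy:                              # ratchet: accept improvements only
            guess, energy = cand, e
    return guess, energy

# Recover a 50-item target drawn from 10 possible items, mirroring
# the chunk length used in the paper's experiments.
target = np.arange(50) % 10
guess, energy = retrieve(target, n_items=10)
print(energy)   # residual mismatch after optimization
```

The ratchet (accept only error-decreasing proposals) is what makes the iterative loop converge; the actual model scores candidates with a learned predictor rather than direct comparison to a known target.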
•We present a neural architecture based on Iterative Free-Energy Optimization and on a gating mechanism for sequence encoding, generation and retrieval in spiking recurrent networks (Inferno Gate).
•We present a novel gating mechanism based on rank-order coding for structure learning.
•The gated network extracts temporal structure from features to encode memory sequences.
•It represents sequences based solely on the rank and location of items, not the items themselves.
•Experiments show fast learning and efficient retrieval of long memory sequences.
•Its features are compositionality, robustness to noise and long-term dependencies. |
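Rank-order coding, the basis of the gating mechanism in the highlights above, can be illustrated with a small sketch. This is a generic Thorpe-style rank-order neuron, not the paper's exact model; the afferent count, preferred order, and modulation factor are made-up toy values:

```python
import numpy as np

def rank_order_response(spike_order, weights, mod=0.8):
    """Response of a rank-order-coding neuron: the weight of each
    afferent is attenuated by mod**rank of its spike, so the response
    peaks when spikes arrive in the neuron's preferred order."""
    return sum((mod ** rank) * weights[aff] for rank, aff in enumerate(spike_order))

# Tune a toy 3-afferent neuron to the firing order [2, 0, 1]:
preferred = [2, 0, 1]
weights = np.zeros(3)
for rank, aff in enumerate(preferred):
    weights[aff] = 0.8 ** rank        # earliest spike gets the largest weight

best = rank_order_response(preferred, weights)   # matching order
worse = rank_order_response([1, 0, 2], weights)  # shuffled order
assert best > worse   # the code carries order, not item identity
```

Because the response depends only on the rank at which each input fires, the same mechanism can represent a sequence by the rank and location of its items, independently of what the items are.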
doi_str_mv | 10.1016/j.neunet.2019.09.023 |
format | Article |
fullrecord | Publisher: Elsevier Ltd, United States. PMID: 31581065. EISSN: 1879-2782. Copyright © 2019 Elsevier Ltd. All rights reserved. Distributed under a Creative Commons Attribution 4.0 International License. ORCID: 0000-0001-6049-5613; 0000-0002-6541-578X. Peer reviewed; open access (free for read). |
fulltext | fulltext |
identifier | ISSN: 0893-6080 |
ispartof | Neural networks, 2020-01, Vol.121, p.242-258 |
issn | 0893-6080; 1879-2782 |
language | eng |
recordid | cdi_hal_primary_oai_HAL_hal_02309379v2 |
source | MEDLINE; ScienceDirect Journals (5 years ago - present) |
subjects | Action Potentials - physiology; Artificial Intelligence; Cognitive science; Computer Science; Free-energy; Gating mechanism; Humans; Language development; Learning - physiology; Memory, Short-Term - physiology; Mental Recall - physiology; Neural and Evolutionary Computing; Neural Networks, Computer; Neurons - physiology; Neuroscience; Prefrontal cortex; Prefrontal Cortex - physiology; Rank-order coding; Reinforcement, Psychology; Robotics; Structural learning |
title | Gated spiking neural network using Iterative Free-Energy Optimization and rank-order coding for structure learning in memory sequences (INFERNO GATE) |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-11T07%3A24%3A31IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_hal_p&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Gated%20spiking%20neural%20network%20using%20Iterative%20Free-Energy%20Optimization%20and%20rank-order%20coding%20for%20structure%20learning%20in%20memory%20sequences%20(INFERNO%20GATE)&rft.jtitle=Neural%20networks&rft.au=Pitti,%20Alexandre&rft.date=2020-01&rft.volume=121&rft.spage=242&rft.epage=258&rft.pages=242-258&rft.issn=0893-6080&rft.eissn=1879-2782&rft_id=info:doi/10.1016/j.neunet.2019.09.023&rft_dat=%3Cproquest_hal_p%3E2301432992%3C/proquest_hal_p%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2301432992&rft_id=info:pmid/31581065&rft_els_id=S089360801930303X&rfr_iscdi=true |