A programmable neural virtual machine based on a fast store-erase learning rule

We present a neural architecture that uses a novel local learning rule to represent and execute arbitrary, symbolic programs written in a conventional assembly-like language. This Neural Virtual Machine (NVM) is purely neurocomputational but supports all of the key functionality of a traditional computer architecture. Unlike other programmable neural networks, the NVM uses principles such as fast non-iterative local learning, distributed representation of information, program-independent circuitry, itinerant attractor dynamics, and multiplicative gating for both activity and plasticity. We present the NVM in detail, theoretically analyze its properties, and conduct empirical computer experiments that quantify its performance and demonstrate that it works effectively.
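The record does not reproduce the paper's actual store-erase rule, but the abstract's key ideas (fast, non-iterative, local learning with multiplicative gating of plasticity) can be illustrated with a minimal sketch. The following Python/NumPy code is an assumption-laden illustration only: the functions store and erase, the single-step outer-product correction, and the gate parameter g are invented here for clarity and are not the NVM's published rule.

import numpy as np

# Illustrative only: a one-shot, local "store-erase" update for a linear
# associator W, with a multiplicative gate g on plasticity (g = 0 freezes W).
# Storing is a single outer-product correction; erasing stores a zero pattern.

def store(W, key, value, g=1.0):
    """One-shot store: with g = 1, W @ key equals value after the update."""
    key = np.asarray(key, dtype=float)
    correction = np.outer(value - W @ key, key) / np.dot(key, key)
    return W + g * correction

def erase(W, key, g=1.0):
    """One-shot erase: overwrite whatever is stored at key with zeros."""
    return store(W, key, np.zeros(W.shape[0]), g)

rng = np.random.default_rng(0)
n = 64
W = np.zeros((n, n))
k, v = rng.standard_normal(n), rng.standard_normal(n)
W = store(W, k, v)
assert np.allclose(W @ k, v)      # association recalled in a single step
W = erase(W, k)
assert np.allclose(W @ k, 0.0)    # association removed in a single step

The point of the sketch is that the update needs no iterative training: one locally computable correction stores or erases an association, and the gate lets program-independent circuitry switch plasticity on and off.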

Bibliographic details

Published in: Neural networks, 2019-11, Vol. 119, p. 10-30
Main authors: Katz, Garrett E.; Davis, Gregory P.; Gentili, Rodolphe J.; Reggia, James A.
Format: Article
Language: English
Subjects: Computers; Humans; Itinerant attractor dynamics; Learning; Local learning; Machine Learning; Multiplicative gating; Neural Networks, Computer; Programmable neural networks; Symbolic processing
Online access: Full text
DOI: 10.1016/j.neunet.2019.07.017
Publisher: Elsevier Ltd
PMID: 31376635
ISSN: 0893-6080
EISSN: 1879-2782
Source: MEDLINE; ScienceDirect Journals