Internal dynamics of recurrent neural networks trained to generate complex spatiotemporal patterns
How complex patterns generated by neural systems are represented in individual neuronal activity is an essential problem in computational neuroscience as well as in the machine learning community. Here, based on recurrent neural networks in the form of feedback reservoir computers, we show microscopic features resulting in the generation of spatiotemporal patterns, including multicluster and chimera states. We show the effect of individual neural trajectories, as well as whole-network activity distributions, on exhibiting particular regimes. In addition, we address the question of how trained output weights contribute to the autonomous multidimensional dynamics.
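The architecture named in the abstract — a feedback reservoir computer, i.e. a fixed random recurrent network whose trained linear readout is fed back as input — can be sketched minimally as follows. This is an illustrative toy (echo-state-style network, ridge-regression readout, sine target, all parameters chosen here for demonstration), not the authors' exact model or training procedure:

```python
import numpy as np

rng = np.random.default_rng(0)

# -- Illustrative toy parameters (not the paper's setup) --
N = 300                       # reservoir size
T_train, T_free = 1000, 300   # teacher-forced / autonomous steps
target = np.sin(0.3 * np.arange(T_train + 1))  # 1-D pattern to generate

# Fixed random recurrent weights, rescaled to spectral radius 0.9
W = rng.normal(0.0, 1.0 / np.sqrt(N), (N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))
w_fb = rng.uniform(-1.0, 1.0, N)  # feedback weights from the readout

def step(r, z):
    """One reservoir update driven by the (fed-back) output z."""
    return np.tanh(W @ r + w_fb * z)

# Teacher forcing: drive the loop with the target signal itself
r = np.zeros(N)
states = np.empty((T_train, N))
for t in range(T_train):
    r = step(r, target[t])
    states[t] = r

# Ridge-regression readout: map reservoir state -> next target value
lam = 1e-6
y = target[1:T_train + 1]
W_out = np.linalg.solve(states.T @ states + lam * np.eye(N), states.T @ y)
train_mse = np.mean((states @ W_out - y) ** 2)

# Close the loop: the readout output becomes the feedback input,
# and the network generates the pattern autonomously
z = states[-1] @ W_out
out = np.empty(T_free)
for t in range(T_free):
    r = step(r, z)
    z = r @ W_out
    out[t] = z
```

After training, `out` is produced with no external drive; the paper's analysis concerns how individual reservoir units and the trained `W_out` shape such autonomous dynamics, for far richer targets (multicluster and chimera patterns) than this sine wave.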
Saved in:
Published in: | Chaos (Woodbury, N.Y.), 2023-09, Vol.33 (9) |
---|---|
Main authors: | Maslennikov, Oleg V.; Gao, Chao; Nekorkin, Vladimir I. |
Format: | Article |
Language: | English |
Subjects: | Machine learning; Recurrent neural networks |
Online access: | Full text |
container_end_page | |
---|---|
container_issue | 9 |
container_start_page | |
container_title | Chaos (Woodbury, N.Y.) |
container_volume | 33 |
creator | Maslennikov, Oleg V.; Gao, Chao; Nekorkin, Vladimir I. |
description | How complex patterns generated by neural systems are represented in individual neuronal activity is an essential problem in computational neuroscience as well as machine learning communities. Here, based on recurrent neural networks in the form of feedback reservoir computers, we show microscopic features resulting in generating spatiotemporal patterns including multicluster and chimera states. We show the effect of individual neural trajectories as well as whole-network activity distributions on exhibiting particular regimes. In addition, we address the question how trained output weights contribute to the autonomous multidimensional dynamics. |
doi_str_mv | 10.1063/5.0166359 |
format | Article |
publisher | Melville: American Institute of Physics |
coden | CHAOEH |
pages | 8 |
orcid | 0000-0003-0173-587X; 0000-0002-5865-2285; 0000-0002-8909-321X |
fulltext | fulltext |
identifier | ISSN: 1054-1500 |
ispartof | Chaos (Woodbury, N.Y.), 2023-09, Vol.33 (9) |
issn | 1054-1500 (ISSN); 1089-7682 (EISSN) |
language | eng |
recordid | cdi_crossref_primary_10_1063_5_0166359 |
source | AIP Journals Complete; Alma/SFX Local Collection |
subjects | Machine learning; Recurrent neural networks |
title | Internal dynamics of recurrent neural networks trained to generate complex spatiotemporal patterns |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2024-12-23T17%3A54%3A07IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_cross&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Internal%20dynamics%20of%20recurrent%20neural%20networks%20trained%20to%20generate%20complex%20spatiotemporal%20patterns&rft.jtitle=Chaos%20(Woodbury,%20N.Y.)&rft.au=Maslennikov,%20Oleg%20V.&rft.date=2023-09&rft.volume=33&rft.issue=9&rft.issn=1054-1500&rft.eissn=1089-7682&rft.coden=CHAOEH&rft_id=info:doi/10.1063/5.0166359&rft_dat=%3Cproquest_cross%3E2865905635%3C/proquest_cross%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2865905635&rft_id=info:pmid/&rfr_iscdi=true |