Supervised Learning in a Multilayer, Nonlinear Chemical Neural Network
The development of programmable or trainable molecular circuits is an important goal in the field of molecular programming. Multilayer, nonlinear, artificial neural networks are a powerful framework for implementing such functionality in a molecular system, as they are provably universal function approximators. Here, we present a design for multilayer chemical neural networks with a nonlinear hyperbolic tangent transfer function. We use a weight perturbation algorithm to train the neural network which uses a simple construction to directly approximate the loss derivatives required for training. We demonstrate the training of this system to learn all 16 two-input binary functions from a common starting point. This work thus introduces new capabilities in the field of adaptive and trainable chemical reaction network (CRN) design. It also opens the door to potential future experimental implementations, including DNA strand displacement reactions.
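The abstract describes training by weight perturbation, i.e., estimating each loss derivative from a finite difference rather than by backpropagation. The sketch below illustrates that training rule on an ordinary floating-point tanh network learning one of the 16 two-input binary functions (XOR as an example). It is not the paper's chemical reaction network implementation; the 2-3-1 architecture, squared-error loss, learning rate, and perturbation size are illustrative assumptions.

```python
# Minimal numerical sketch of weight-perturbation training of a small tanh
# network, illustrating the training rule described in the abstract. This is an
# ordinary floating-point model, not a chemical reaction network; the 2-3-1
# architecture, loss, and hyperparameters are assumptions for illustration.
import numpy as np

rng = np.random.default_rng(0)

def forward(params, x):
    """Two-input, three-hidden-unit, one-output network with tanh transfer functions."""
    W1, b1, W2, b2 = params
    h = np.tanh(x @ W1 + b1)        # hidden layer
    return np.tanh(h @ W2 + b2)     # output layer

def loss(params, X, y):
    """Mean squared error over the four input patterns."""
    return np.mean((forward(params, X).ravel() - y) ** 2)

def train_weight_perturbation(X, y, steps=3000, lr=0.5, delta=1e-3):
    """Estimate each loss derivative by perturbing one weight at a time."""
    params = [rng.normal(0.0, 0.5, (2, 3)), rng.normal(0.0, 0.5, 3),
              rng.normal(0.0, 0.5, (3, 1)), rng.normal(0.0, 0.5, 1)]
    for _ in range(steps):
        base = loss(params, X, y)
        grads = []
        for p in params:
            g = np.zeros_like(p)
            for idx in np.ndindex(p.shape):
                p[idx] += delta                                # perturb one weight
                g[idx] = (loss(params, X, y) - base) / delta   # finite-difference derivative
                p[idx] -= delta                                # restore the weight
            grads.append(g)
        for p, g in zip(params, grads):
            p -= lr * g                                        # gradient-descent step
    return params

# Example: learn XOR, one of the 16 two-input binary functions.
# Targets use {-1, +1} to match the range of the tanh output.
X = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 0.0], [1.0, 1.0]])
y = np.array([-1.0, 1.0, 1.0, -1.0])
trained = train_weight_perturbation(X, y)
print(np.round(forward(trained, X).ravel(), 2))
```

The forward-difference estimate (L(w + δ) − L(w)) / δ stands in for the loss derivative that the paper's construction approximates directly in chemistry; a central difference would be more accurate at the cost of one extra loss evaluation per weight.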
Saved in:
Published in: | IEEE Transactions on Neural Networks and Learning Systems, 2023-10, Vol. 34 (10), p. 7734-7745 |
---|---|
Main authors: | Arredondo, David ; Lakin, Matthew R. |
Format: | Article |
Language: | eng |
Subjects: | |
Online access: | Order full text |
container_end_page | 7745 |
---|---|
container_issue | 10 |
container_start_page | 7734 |
container_title | IEEE transactions on neural networks and learning systems |
container_volume | 34 |
creator | Arredondo, David ; Lakin, Matthew R. |
description | The development of programmable or trainable molecular circuits is an important goal in the field of molecular programming. Multilayer, nonlinear, artificial neural networks are a powerful framework for implementing such functionality in a molecular system, as they are provably universal function approximators. Here, we present a design for multilayer chemical neural networks with a nonlinear hyperbolic tangent transfer function. We use a weight perturbation algorithm to train the neural network which uses a simple construction to directly approximate the loss derivatives required for training. We demonstrate the training of this system to learn all 16 two-input binary functions from a common starting point. This work thus introduces new capabilities in the field of adaptive and trainable chemical reaction network (CRN) design. It also opens the door to potential future experimental implementations, including DNA strand displacement reactions. |
doi_str_mv | 10.1109/TNNLS.2022.3146057 |
format | Article |
eissn | 2162-2388 |
pmid | 35133970 |
coden | ITNNAL |
publisher | United States: IEEE |
rights | Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2023 |
orcidid | 0000-0003-2477-9162 ; 0000-0002-8516-4789 |
fulltext | fulltext_linktorsrc |
identifier | ISSN: 2162-237X |
ispartof | IEEE transactions on neural networks and learning systems, 2023-10, Vol.34 (10), p.7734-7745 |
issn | 2162-237X ; 2162-2388 |
language | eng |
recordid | cdi_ieee_primary_9707599 |
source | IEEE |
subjects | Algorithms ; Artificial neural networks ; Biological neural networks ; Chemical reaction networks (CRNs) ; Chemical reactions ; Chemicals ; Clocks ; Computer architecture ; Deoxyribonucleic acid ; DNA ; Hyperbolic functions ; hyperbolic tangent ; Machine learning ; Multilayers ; Neural networks ; Neurons ; nonlinearity ; Perturbation ; Supervised learning ; Training ; Transfer functions |
title | Supervised Learning in a Multilayer, Nonlinear Chemical Neural Network |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-29T08%3A08%3A05IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_RIE&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Supervised%20Learning%20in%20a%20Multilayer,%20Nonlinear%20Chemical%20Neural%20Network&rft.jtitle=IEEE%20transaction%20on%20neural%20networks%20and%20learning%20systems&rft.au=Arredondo,%20David&rft.date=2023-10-01&rft.volume=34&rft.issue=10&rft.spage=7734&rft.epage=7745&rft.pages=7734-7745&rft.issn=2162-237X&rft.eissn=2162-2388&rft.coden=ITNNAL&rft_id=info:doi/10.1109/TNNLS.2022.3146057&rft_dat=%3Cproquest_RIE%3E2873588429%3C/proquest_RIE%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2873588429&rft_id=info:pmid/35133970&rft_ieee_id=9707599&rfr_iscdi=true |