Back propagation through adjoints for the identification of nonlinear dynamic systems using recurrent neural models
In this paper, back propagation is reinvestigated for an efficient evaluation of the gradient in arbitrary interconnections of recurrent subsystems. It is shown that the error has to be back-propagated through the adjoint model of the system and that the gradient can only be obtained after a delay....
Saved in:
Published in: | IEEE transactions on neural networks 1994-03, Vol.5 (2), p.213-228 |
---|---|
Main authors: | Srinivasan, B.; Prasad, U.R.; Rao, N.J. |
Format: | Article |
Language: | eng |
Keywords: | |
Online access: | Order full text |
container_end_page | 228 |
---|---|
container_issue | 2 |
container_start_page | 213 |
container_title | IEEE transactions on neural networks |
container_volume | 5 |
creator | Srinivasan, B.; Prasad, U.R.; Rao, N.J. |
description | In this paper, back propagation is reinvestigated for an efficient evaluation of the gradient in arbitrary interconnections of recurrent subsystems. It is shown that the error has to be back-propagated through the adjoint model of the system and that the gradient can only be obtained after a delay. A faster version, accelerated back propagation, which eliminates this delay, is also developed. Various schemes including the sensitivity method are studied to update the weights of the network using these gradients. Motivated by the Lyapunov approach and the adjoint model, the predictive back propagation and its variant, targeted back propagation, are proposed. A further refinement, predictive back propagation with filtering, is then developed, where the states of the model are also updated. The convergence of this scheme is assured. It is shown that it is sufficient to back propagate as many time steps as the order of the system for convergence. As a preamble, convergence of online batch and sample-wise updates in feedforward models is analyzed using the Lyapunov approach. (A sketch of the adjoint sweep follows this table.) |
doi_str_mv | 10.1109/72.279186 |
format | Article |
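The description hinges on back-propagating the output error through the adjoint (linearized, time-reversed) model of the system along the stored forward trajectory. Below is a minimal NumPy sketch for a single recurrent block, assuming a tanh state recursion x[k+1] = tanh(W x[k] + B u[k]) with a fixed linear readout; the model, variable names, and step size are illustrative assumptions, not the paper's notation or its refined update schemes.

```python
import numpy as np

# Hypothetical single recurrent block; the paper treats arbitrary
# interconnections of such subsystems.
rng = np.random.default_rng(0)
n, m, T = 3, 1, 20                       # state dim, input dim, horizon
W = 0.1 * rng.standard_normal((n, n))    # recurrent weights to identify
B = 0.1 * rng.standard_normal((n, m))    # input weights
c = rng.standard_normal(n)               # fixed linear readout

u = rng.standard_normal((T, m))          # input sequence
d = rng.standard_normal(T)               # desired outputs for x[1..T]

# Forward pass: simulate and store the trajectory; the adjoint sweep needs it.
x = np.zeros((T + 1, n))
for k in range(T):
    x[k + 1] = np.tanh(W @ x[k] + B @ u[k])
e = x[1:] @ c - d                        # output errors e[k] = y[k+1] - d[k]

# Backward sweep through the adjoint model: the costate lam accumulates
# the injected output error and is pulled back through the linearized
# dynamics evaluated on the stored forward states.
lam = np.zeros(n)
grad_W, grad_B = np.zeros_like(W), np.zeros_like(B)
for k in reversed(range(T)):
    lam = lam + c * e[k]                 # inject error at x[k+1]
    delta = (1.0 - x[k + 1] ** 2) * lam  # tanh' expressed via stored state
    grad_W += np.outer(delta, x[k])      # dJ/dW contribution at step k
    grad_B += np.outer(delta, u[k])
    lam = W.T @ delta                    # propagate costate back to x[k]

W -= 0.05 * grad_W                       # one plain gradient step
B -= 0.05 * grad_B
```

Note that grad_W is only complete once the sweep reaches k = 0, which is the delay the abstract refers to; the accelerated and predictive variants proposed in the paper restructure this computation so the weights can be updated without waiting for the full backward pass.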
fulltext | fulltext_linktorsrc |
identifier | ISSN: 1045-9227; EISSN: 1941-0093; DOI: 10.1109/72.279186; PMID: 18267792; CODEN: ITNNEP |
ispartof | IEEE transactions on neural networks, 1994-03, Vol.5 (2), p.213-228 |
issn | 1045-9227; 1941-0093 |
language | eng |
recordid | cdi_ieee_primary_279186 |
source | IEEE Electronic Library (IEL) |
subjects | Acceleration; Computational efficiency; Convergence; Delay; Difference equations; Multilayer perceptrons; Neural networks; Nonlinear dynamical systems; Nonlinear systems; Predictive models |
title | Back propagation through adjoints for the identification of nonlinear dynamic systems using recurrent neural models |