Synthesis of cascade recurrent neural networks using feedforward generalization properties

This paper presents a new analysis and synthesis technique for a class of recurrent networks known as a Cascade Recurrent Network (CRN). In this technique, a feedforward (FF) sub-network is used with synchronous feedback to implement associative memory (AM). FF network mapping properties are shown to determine CRN stability, and, on the basis of this stability-mapping relation, a new synthesis technique is given. This technique utilizes the optimization of the FF mapping sub-network generalization as a synthesis procedure for CRN. Sample results are shown.
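
For readers unfamiliar with this class of network, the following minimal Python sketch illustrates the synchronous-feedback recall loop the abstract describes: a feedforward mapping is iterated on its own output until it settles on a fixed point, which serves as the recalled associative-memory pattern. The single tanh layer, the placeholder weights, and the function names (ff_map, crn_recall) are illustrative assumptions, not the paper's method; the paper obtains the FF sub-network by optimizing its generalization, which is what determines whether this loop is stable.

import numpy as np

def ff_map(x, W, b):
    # One pass through the feedforward (FF) sub-network; a single tanh
    # layer stands in here for whatever trained FF mapping the CRN wraps.
    return np.tanh(W @ x + b)

def crn_recall(x0, W, b, max_iters=100, tol=1e-6):
    # Synchronous feedback: the FF output becomes the next input, and the
    # loop stops when the state reaches a fixed point (a stable recall).
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iters):
        x_next = ff_map(x, W, b)
        if np.linalg.norm(x_next - x) < tol:
            return x_next  # converged to an attractor
        x = x_next
    return x  # no fixed point found within max_iters

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 8
    W = 0.5 * rng.standard_normal((n, n))  # placeholder FF weights
    b = np.zeros(n)
    probe = rng.standard_normal(n)         # noisy probe pattern
    print(crn_recall(probe, W, b))

Whether such a loop converges depends entirely on the FF mapping; the paper's contribution is the stability-mapping relation that lets this dependence be exploited as a synthesis procedure.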

Bibliographic Details
Published in: Information sciences, 1998-07, Vol. 108 (1), p. 207-217
Main authors: Shaaban, Khaled M.; Schalkoff, Robert J.
Format: Article
Language: English
Subjects: Associative memory; Generalization; Recurrent networks
Publisher: Elsevier Inc
Source: Elsevier ScienceDirect Journals
ISSN: 0020-0255
EISSN: 1872-6291
DOI: 10.1016/S0020-0255(97)10060-3
Online access: Full text