Structural stability of unsupervised learning in feedback neural networks

Structural stability is proved for a large class of unsupervised nonlinear feedback neural networks, adaptive bidirectional associative memory (ABAM) models. The approach extends the ABAM models to the random-process domain as systems of stochastic differential equations and appends scaled Brownian diffusions. It is also proved that this much larger family of models, random ABAM (RABAM) models, is globally stable. Intuitively, RABAM equilibria equal ABAM equilibria that vibrate randomly. The ABAM family includes many unsupervised feedback and feedforward neural models. All RABAM models permit Brownian annealing. The RABAM noise suppression theorem characterizes RABAM system vibration. The mean-squared activation and synaptic velocities decrease exponentially to their lower bounds, the respective temperature-scaled noise variances. The many neuronal and synaptic parameters missing from such neural network models are included, but as net random unmodeled effects. They do not affect the structure of real-time global computations.
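
The abstract outlines the core construction: deterministic ABAM activation and learning dynamics are extended to stochastic differential equations by appending scaled Brownian diffusions, and the RABAM noise suppression theorem bounds the mean-squared activation and synaptic velocities below by temperature-scaled noise variances. As an illustration only, here is a minimal Euler-Maruyama sketch of a RABAM-style simulation; the additive activation drift, the logistic signal function, the signal-Hebbian learning law, and every parameter (n, p, dt, sigma, steps) are hypothetical choices made for this sketch, not specifics taken from the paper.

import numpy as np

rng = np.random.default_rng(0)

# Illustrative field sizes for the two neuronal fields F_X and F_Y (assumed).
n, p = 8, 5
dt = 1e-3        # Euler-Maruyama step size (assumed)
sigma = 0.05     # scale of the appended Brownian diffusions (assumed "temperature")
steps = 20_000

def S(z):
    # Bounded monotone signal function; a logistic sigmoid is one common choice.
    return 1.0 / (1.0 + np.exp(-z))

# Activations of the two fields and the synaptic matrix, randomly initialized.
x = rng.normal(size=n)
y = rng.normal(size=p)
M = rng.normal(scale=0.1, size=(n, p))

for _ in range(steps):
    # Deterministic drifts: simple additive activation dynamics plus a
    # signal-Hebbian learning law (illustrative stand-ins for ABAM dynamics).
    dx = -x + M @ S(y)
    dy = -y + M.T @ S(x)
    dM = -M + np.outer(S(x), S(y))

    # Stochastic step: drift * dt plus scaled Brownian increments.
    x = x + dx * dt + sigma * np.sqrt(dt) * rng.normal(size=n)
    y = y + dy * dt + sigma * np.sqrt(dt) * rng.normal(size=p)
    M = M + dM * dt + sigma * np.sqrt(dt) * rng.normal(size=(n, p))

# Near equilibrium the deterministic drift is small while the noise keeps the
# state vibrating; smaller sigma lowers that residual vibration.
print("mean-squared activation drift (F_X):", float(np.mean(dx ** 2)))
print("mean-squared synaptic drift:        ", float(np.mean(dM ** 2)))

With smaller sigma the printed mean-squared drifts settle closer to zero, while larger sigma raises the floor at which the state keeps fluctuating, which is the qualitative picture of equilibria that "vibrate randomly" described in the abstract.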


Bibliographic Details
Published in: IEEE transactions on automatic control, 1991-07, Vol. 36 (7), p. 785-792
Author: Kosko, B.A.
Format: Article
Language: English
DOI: 10.1109/9.85058
ISSN: 0018-9286
EISSN: 1558-2523
Source: IEEE Electronic Library (IEL)
Subjects:
Applied sciences
Biological system modeling
Calculus
Differential equations
Exact sciences and technology
Information, signal and communications theory
Intelligent networks
Miscellaneous
Neural networks
Neurofeedback
Signal processing
Stability
Stochastic processes
Structural engineering
Telecommunications and information theory
Unsupervised learning