Continual neural networks
When constructing open-loop structures of multilayer neural networks, many parameters describing the structure and the input signal of a pattern recognition system must be introduced to achieve the maximum probability of correct recognition in practice. A large number of parameters, on the order of hundreds or thousands, poses difficulties both for learning and for the technical implementation of such networks. A transition to an attribute continuum and to a continuum of neurons in a layer is considered for some specific neural network structures.
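The abstract only sketches the continuum idea at a high level. As a loose illustration (not drawn from the paper itself), the Python sketch below contrasts an ordinary discrete hidden layer with a hypothetical "continuum" layer whose weights are a function of a continuous neuron index, so the sum over neurons is replaced by a numerical integral over that index. Every function name and functional form here is an assumption made purely for illustration.

```python
# Illustrative sketch only (not from Galushkin's paper): it contrasts an
# ordinary discrete hidden layer with a hypothetical "continuum of neurons"
# layer whose weights are a function w(s) of a continuous neuron index s,
# so the sum over neurons becomes an integral over s, approximated numerically.
import numpy as np

def discrete_layer(x, W, v):
    """Ordinary hidden layer: y = sum_j v_j * tanh(W_j . x)."""
    return np.sum(v * np.tanh(W @ x))

def continuum_layer(x, w_fn, v_fn, n_points=200):
    """'Continuum' layer: y = integral_0^1 v(s) * tanh(w(s) . x) ds,
    approximated by a simple Riemann sum over the neuron index s."""
    s = np.linspace(0.0, 1.0, n_points)              # continuous neuron index
    pre = np.array([w_fn(si) @ x for si in s])       # pre-activations w(s) . x
    integrand = np.array([v_fn(si) for si in s]) * np.tanh(pre)
    return integrand.mean()                          # mean over [0, 1] approximates the integral

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.normal(size=3)

    # Discrete layer with 5 neurons.
    W = rng.normal(size=(5, 3))
    v = rng.normal(size=5)
    print("discrete :", discrete_layer(x, W, v))

    # Continuum layer: weights vary smoothly with the neuron index s.
    w_fn = lambda s: np.array([np.sin(2 * np.pi * s), np.cos(2 * np.pi * s), s])
    v_fn = lambda s: 1.0 - s
    print("continuum:", continuum_layer(x, w_fn, v_fn))
```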
Saved in:
Main author: | Galushkin, A.I. |
---|---|
Format: | Conference Proceeding |
Language: | English |
Subjects: | Artificial neural networks; Concrete; Image sampling; Multi-layer neural network; Neural networks; Neurons; Optical fiber networks; Particle beam optics; Pattern recognition; Signal generators |
Online access: | Order full text |
container_end_page | 1067 vol.2 |
---|---|
container_issue | |
container_start_page | 1056 |
container_title | [Proceedings] 1992 RNNS/IEEE Symposium on Neuroinformatics and Neurocomputers |
container_volume | |
creator | Galushkin, A.I. |
description | It is necessary to introduce many parameters describing the structure and input signal of a pattern recognition system during the construction of open-loop structures of multilayer neural networks in order to provide maximum probability of correct recognition in practice. The availability of a large number of parameters, i.e., hundreds and thousands, poses some difficulties for learning and for the technical implementation of such networks. A transition to an attributes continuum and a continuum of neurons in the layer is considered for some specific neural network structures. |
doi_str_mv | 10.1109/RNNS.1992.268523 |
format | Conference Proceeding |
identifier | ISBN: 0780308093; ISBN: 9780780308091 |
ispartof | [Proceedings] 1992 RNNS/IEEE Symposium on Neuroinformatics and Neurocomputers, 1992, p.1056-1067 vol.2 |
issn | |
language | eng |
recordid | cdi_ieee_primary_268523 |
source | IEEE Electronic Library (IEL) Conference Proceedings |
subjects | Artificial neural networks; Concrete; Image sampling; Multi-layer neural network; Neural networks; Neurons; Optical fiber networks; Particle beam optics; Pattern recognition; Signal generators |
title | Continual neural networks |