The Kernel Hopfield Memory Network
The kernel theory drawn from the work on learning machines is applied to the Hopfield neural network. This provides a new insight into the workings of the neural network as associative memory. The kernel “trick” defines an embedding of memory patterns into (higher or infinite dimensional) memory feature vectors and the training of the network is carried out in this feature space. The generalization of the network by using the kernel theory improves its performance in three aspects. First, an adequate kernel selection enables the satisfaction of the condition that any set of memory patterns be attractors of the network dynamics. Second, the basins of attraction of the memory patterns are enhanced improving the recall capacity. Third, since the memory patterns are mapped into a higher dimensional feature space the memory capacity density is effectively increased. These aspects are experimentally demonstrated on sets of random memory patterns.
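The kernelized recall described in the abstract can be illustrated with a minimal sketch: each stored pattern votes for its bits, weighted by a kernel similarity to the current state, so inner products are replaced by kernel evaluations. This is not the authors' exact algorithm; the RBF kernel, the `gamma` value, and the similarity-weighted voting rule are assumptions chosen for illustration only.

```python
import numpy as np

def rbf_kernel(x, y, gamma=0.1):
    """Gaussian (RBF) kernel between two +/-1 pattern vectors."""
    return np.exp(-gamma * np.sum((x - y) ** 2))

def kernel_hopfield_recall(probe, patterns, gamma=0.1, max_iter=50):
    """Iterate a recall rule in which every stored pattern votes for its
    bits, weighted by its kernel similarity to the current state."""
    s = probe.copy()
    for _ in range(max_iter):
        weights = np.array([rbf_kernel(p, s, gamma) for p in patterns])
        s_new = np.sign(weights @ patterns)
        s_new[s_new == 0] = 1  # break exact ties deterministically
        if np.array_equal(s_new, s):
            break  # reached a fixed point of the dynamics
        s = s_new
    return s

# Store three random +/-1 patterns of length 32, then recall from a
# probe obtained by flipping 4 bits of the first stored pattern.
rng = np.random.default_rng(0)
patterns = rng.choice([-1, 1], size=(3, 32))
probe = patterns[0].copy()
probe[:4] *= -1
recalled = kernel_hopfield_recall(probe, patterns)
print(np.array_equal(recalled, patterns[0]))
```

With random patterns this well separated, the kernel weight of the nearest stored pattern dominates the vote, so the noisy probe is typically driven back to the stored pattern in one or two updates; the paper's point is that an adequate kernel choice makes such attractor behavior hold for arbitrary pattern sets.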
Saved in:
Main authors: | García, Cristina; Moreno, José Alí |
---|---|
Format: | Conference Proceeding |
Language: | eng |
Subjects: | |
Online access: | Full text |
container_end_page | 764 |
---|---|
container_issue | |
container_start_page | 755 |
container_title | Cellular Automata |
container_volume | |
creator | García, Cristina; Moreno, José Alí |
description | The kernel theory drawn from the work on learning machines is applied to the Hopfield neural network. This provides a new insight into the workings of the neural network as associative memory. The kernel “trick” defines an embedding of memory patterns into (higher or infinite dimensional) memory feature vectors and the training of the network is carried out in this feature space. The generalization of the network by using the kernel theory improves its performance in three aspects. First, an adequate kernel selection enables the satisfaction of the condition that any set of memory patterns be attractors of the network dynamics. Second, the basins of attraction of the memory patterns are enhanced improving the recall capacity. Third, since the memory patterns are mapped into a higher dimensional feature space the memory capacity density is effectively increased. These aspects are experimentally demonstrated on sets of random memory patterns. |
doi_str_mv | 10.1007/978-3-540-30479-1_78 |
format | Conference Proceeding |
contributor | Chopard, Bastien; Sloot, Peter M. A.; Hoekstra, Alfons G. |
publisher | Berlin, Heidelberg: Springer Berlin Heidelberg |
isbn | 9783540235965; 3540235965 |
eisbn | 3540304797; 9783540304791 |
rights | Springer-Verlag Berlin Heidelberg 2004; 2005 INIST-CNRS |
fulltext | fulltext |
identifier | ISSN: 0302-9743 |
ispartof | Cellular Automata, 2004, p.755-764 |
issn | 0302-9743; 1611-3349 |
language | eng |
recordid | cdi_pascalfrancis_primary_16334395 |
source | Springer Books |
subjects | Applied sciences; Automata. Abstract machines. Turing machines; Cellular Automaton; Computer science; control theory; systems; Exact sciences and technology; Feature Space; High Dimensional Feature Space; Processing Unit; Synaptic Weight; Theoretical computing |
title | The Kernel Hopfield Memory Network |