Online Symbolic-Sequence Prediction with Discrete-Time Recurrent Neural Networks
This paper studies the use of discrete-time recurrent neural networks for predicting the next symbol in a sequence. The focus is on online prediction, a task much harder than the classical offline grammatical inference with neural networks. The results obtained show that the performance of recurrent networks working online is acceptable when sequences come from finite-state machines or even from some chaotic sources. When predicting texts in human language, however, dynamics seem to be too complex to be correctly learned in real-time by the net. Two algorithms are considered for network training: real-time recurrent learning and the decoupled extended Kalman filter.
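The abstract describes a discrete-time recurrent network that, at each time step, reads the current symbol and emits a probability distribution over the next one, with the weights updated online as the sequence unfolds. The sketch below is not the authors' code: it assumes a hypothetical two-symbol alphabet and uses a one-step-truncated gradient update as a simplified stand-in for the real-time recurrent learning (RTRL) and decoupled extended Kalman filter (DEKF) procedures the paper actually evaluates.

```python
# Minimal sketch (not the paper's code): online next-symbol prediction with a
# discrete-time recurrent network over a toy two-symbol alphabet.  The update
# below truncates the gradient to the current time step, a simplification of
# the RTRL and DEKF training algorithms discussed in the paper.
import numpy as np

rng = np.random.default_rng(0)

alphabet = ['a', 'b']                      # hypothetical toy alphabet
K, H = len(alphabet), 8                    # number of symbols, hidden units
sym2idx = {s: i for i, s in enumerate(alphabet)}

# Parameters of a simple Elman-style recurrent network.
W_xh = rng.normal(0, 0.3, (H, K))          # input-to-hidden
W_hh = rng.normal(0, 0.3, (H, H))          # hidden-to-hidden (recurrence)
W_hy = rng.normal(0, 0.3, (K, H))          # hidden-to-output
b_h, b_y = np.zeros(H), np.zeros(K)

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

# Toy sequence from a finite-state source, (ab)* -- the kind of regular
# language the paper reports online networks handle acceptably.
sequence = 'ab' * 500
lr = 0.1
h = np.zeros(H)
correct = 0

for t in range(len(sequence) - 1):
    x = np.zeros(K); x[sym2idx[sequence[t]]] = 1.0        # current symbol, one-hot
    target = sym2idx[sequence[t + 1]]                      # next symbol to predict

    h_prev = h
    h = np.tanh(W_xh @ x + W_hh @ h_prev + b_h)            # discrete-time state update
    y = softmax(W_hy @ h + b_y)                            # next-symbol distribution
    correct += int(np.argmax(y) == target)

    # One-step-truncated gradient of the cross-entropy loss (simplified online update).
    dy = y.copy(); dy[target] -= 1.0                       # dL/d(logits)
    dh = (W_hy.T @ dy) * (1.0 - h ** 2)                    # backprop through tanh, one step only
    W_hy -= lr * np.outer(dy, h);  b_y -= lr * dy
    W_xh -= lr * np.outer(dh, x);  b_h -= lr * dh
    W_hh -= lr * np.outer(dh, h_prev)

print(f"online accuracy over the run: {correct / (len(sequence) - 1):.3f}")
```

On a regular source like this (ab)* toy sequence, the running accuracy should approach 1, consistent with the abstract's claim that finite-state sources are handled acceptably online, whereas human-language text is reported to be too complex to learn in real time.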
Saved in:
Main authors: | Pérez-Ortiz, Juan Antonio; Calera-Rubio, Jorge; Forcada, Mikel L. |
---|---|
Format: | Book Chapter |
Language: | eng |
Subjects: | Applied sciences; Artificial intelligence; Computer science, control theory, systems; Connectionism, neural networks; Exact sciences and technology |
Online access: | Full text |
container_end_page | 724 |
---|---|
container_issue | |
container_start_page | 719 |
container_title | Artificial Neural Networks - ICANN 2001 |
container_volume | 2130 |
creator | Pérez-Ortiz, Juan Antonio; Calera-Rubio, Jorge; Forcada, Mikel L. |
description | This paper studies the use of discrete-time recurrent neural networks for predicting the next symbol in a sequence. The focus is on online prediction, a task much harder than the classical offline grammatical inference with neural networks. The results obtained show that the performance of recurrent networks working online is acceptable when sequences come from finite-state machines or even from some chaotic sources. When predicting texts in human language, however, dynamics seem to be too complex to be correctly learned in real-time by the net. Two algorithms are considered for network training: real-time recurrent learning and the decoupled extended Kalman filter. |
doi_str_mv | 10.1007/3-540-44668-0_100 |
format | Book Chapter |
fullrecord | Book chapter in Artificial Neural Networks - ICANN 2001 (Lecture Notes in Computer Science, Vol. 2130), edited by Georg Dorffner, Horst Bischof, and Kurt Hornik; Springer Berlin / Heidelberg, Germany, 2001, pp. 719-724. ISBN 3540424865, 9783540424864; eISBN 3540446680, 9783540446682; ISSN 0302-9743; eISSN 1611-3349; DOI 10.1007/3-540-44668-0_100; OCLC 958524338. |
fulltext | fulltext |
identifier | ISSN: 0302-9743 |
ispartof | Artificial Neural Networks - ICANN 2001, 2001, Vol.2130, p.719-724 |
issn | 0302-9743; 1611-3349 |
language | eng |
recordid | cdi_pascalfrancis_primary_14045253 |
source | Springer Books |
subjects | Applied sciences; Artificial intelligence; Computer science, control theory, systems; Connectionism, neural networks; Exact sciences and technology |
title | Online Symbolic-Sequence Prediction with Discrete-Time Recurrent Neural Networks |