Comparative evaluation of genetic algorithm and backpropagation for training neural networks
In view of several limitations of gradient search techniques (e.g. backpropagation), global search techniques, including evolutionary programming and genetic algorithms (GAs), have been proposed for training neural networks (NNs). However, the effectiveness, ease-of-use, and efficiency of these global search techniques have not been compared extensively with gradient search techniques. Using five chaotic time series functions, this paper empirically compares a genetic algorithm with backpropagation for training NNs. The chaotic series are interesting because of their similarity to economic and financial series found in financial markets.
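As an illustrative aside that is not taken from the paper, the sketch below shows one way such a comparison can be set up: a small one-hidden-layer network fitted to a toy target, with its weights found either by backpropagation (analytic gradient descent) or by a simple genetic algorithm using tournament selection, uniform crossover, and Gaussian mutation. The network size, GA operators, and every parameter value here are assumptions made for the example; the paper's actual settings are not given in this record. A companion sketch showing how a chaotic benchmark series can be generated follows the detailed record below.

```python
# Illustrative sketch only: a tiny one-hidden-layer network trained two ways,
# by backpropagation (gradient descent) and by a simple genetic algorithm.
# All sizes and parameter values are assumptions made for this example.
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data: y = sin(3x) plus a little noise.
X = np.linspace(-1.0, 1.0, 200).reshape(-1, 1)
y = np.sin(3.0 * X) + 0.05 * rng.normal(size=X.shape)

N_HIDDEN = 8
N_WEIGHTS = N_HIDDEN + N_HIDDEN + N_HIDDEN + 1  # W1, b1, W2, b2 flattened

def unpack(w):
    """Split a flat weight vector into the layer parameters."""
    W1 = w[:N_HIDDEN].reshape(1, N_HIDDEN)
    b1 = w[N_HIDDEN:2 * N_HIDDEN]
    W2 = w[2 * N_HIDDEN:3 * N_HIDDEN].reshape(N_HIDDEN, 1)
    b2 = w[3 * N_HIDDEN:]
    return W1, b1, W2, b2

def forward(w, X):
    W1, b1, W2, b2 = unpack(w)
    return np.tanh(X @ W1 + b1) @ W2 + b2

def mse(w):
    return float(np.mean((forward(w, X) - y) ** 2))

def train_backprop(steps=5000, lr=0.05):
    """Gradient search: analytic backpropagation with plain gradient descent."""
    W1 = 0.5 * rng.normal(size=(1, N_HIDDEN)); b1 = np.zeros(N_HIDDEN)
    W2 = 0.5 * rng.normal(size=(N_HIDDEN, 1)); b2 = np.zeros(1)
    n = X.shape[0]
    for _ in range(steps):
        h = np.tanh(X @ W1 + b1)              # forward pass
        out = h @ W2 + b2
        dout = 2.0 * (out - y) / n            # d(MSE)/d(out)
        dW2 = h.T @ dout; db2 = dout.sum(axis=0)
        dz = (dout @ W2.T) * (1.0 - h ** 2)   # back through tanh
        dW1 = X.T @ dz; db1 = dz.sum(axis=0)
        W1 -= lr * dW1; b1 -= lr * db1
        W2 -= lr * dW2; b2 -= lr * db2
    return np.concatenate([W1.ravel(), b1, W2.ravel(), b2])

def train_ga(pop_size=60, generations=300, sigma=0.1, elite=2):
    """Global search: GA with tournament selection, crossover, and mutation."""
    pop = 0.5 * rng.normal(size=(pop_size, N_WEIGHTS))
    for _ in range(generations):
        fit = np.array([mse(ind) for ind in pop])
        new_pop = [pop[i].copy() for i in np.argsort(fit)[:elite]]  # elitism
        while len(new_pop) < pop_size:
            picks = rng.integers(pop_size, size=4)                  # two tournaments
            p1 = pop[picks[0]] if fit[picks[0]] < fit[picks[1]] else pop[picks[1]]
            p2 = pop[picks[2]] if fit[picks[2]] < fit[picks[3]] else pop[picks[3]]
            child = np.where(rng.random(N_WEIGHTS) < 0.5, p1, p2)   # uniform crossover
            child = child + sigma * rng.normal(size=N_WEIGHTS)      # Gaussian mutation
            new_pop.append(child)
        pop = np.array(new_pop)
    fit = np.array([mse(ind) for ind in pop])
    return pop[int(np.argmin(fit))]

if __name__ == "__main__":
    print(f"backpropagation   MSE: {mse(train_backprop()):.5f}")
    print(f"genetic algorithm MSE: {mse(train_ga()):.5f}")
```

Running the script prints the final in-sample mean squared error reached by each method; the comparison the paper carries out additionally weighs effectiveness, ease-of-use, and efficiency, as noted in the abstract.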
Published in: | Information sciences 2000-11, Vol.129 (1), p.45-59 |
---|---|
Main authors: | Sexton, Randall S.; Gupta, Jatinder N.D. |
Format: | Article |
Language: | eng |
Subjects: | Backpropagation; Epoch; Genetic algorithms; Global search algorithms; Interpolation; Neural network training |
Online access: | Full text |
container_end_page | 59 |
---|---|
container_issue | 1 |
container_start_page | 45 |
container_title | Information sciences |
container_volume | 129 |
creator | Sexton, Randall S.; Gupta, Jatinder N.D. |
description | In view of several limitations of gradient search techniques (e.g. backpropagation), global search techniques, including evolutionary programming and genetic algorithms (GAs), have been proposed for training neural networks (NNs). However, the effectiveness, ease-of-use, and efficiency of these global search techniques have not been compared extensively with gradient search techniques. Using five chaotic time series functions, this paper empirically compares a genetic algorithm with backpropagation for training NNs. The chaotic series are interesting because of their similarity to economic and financial series found in financial markets. |
doi_str_mv | 10.1016/S0020-0255(00)00068-2 |
format | Article |
publisher | Elsevier Inc |
fulltext | fulltext |
identifier | ISSN: 0020-0255 |
ispartof | Information sciences, 2000-11, Vol.129 (1), p.45-59 |
issn | 0020-0255; 1872-6291 |
language | eng |
recordid | cdi_proquest_miscellaneous_27232563 |
source | Elsevier ScienceDirect Journals Complete |
subjects | Backpropagation; Epoch; Genetic algorithms; Global search algorithms; Interpolation; Neural network training |
title | Comparative evaluation of genetic algorithm and backpropagation for training neural networks |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-02-13T16%3A30%3A57IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_cross&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Comparative%20evaluation%20of%20genetic%20algorithm%20and%20backpropagation%20for%20training%20neural%20networks&rft.jtitle=Information%20sciences&rft.au=Sexton,%20Randall%20S.&rft.date=2000-11-01&rft.volume=129&rft.issue=1&rft.spage=45&rft.epage=59&rft.pages=45-59&rft.issn=0020-0255&rft.eissn=1872-6291&rft_id=info:doi/10.1016/S0020-0255(00)00068-2&rft_dat=%3Cproquest_cross%3E27232563%3C/proquest_cross%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=27232563&rft_id=info:pmid/&rft_els_id=S0020025500000682&rfr_iscdi=true |
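The abstract refers to five chaotic time series functions, but the record does not name them. Purely as an assumption labelled for illustration, the sketch below generates a logistic-map series, a standard example of a chaotic series, and turns it into a one-step-ahead prediction dataset of the kind a network trained as in the earlier sketch could be fitted to; it is not the paper's actual benchmark set.

```python
# Assumption for illustration only: the paper's five chaotic series are not
# listed in this record. The logistic map is a standard chaotic series and
# shows how such a function becomes a one-step-ahead prediction dataset.
import numpy as np

def logistic_map_series(n, r=4.0, x0=0.2):
    """Generate n points of the logistic map x_{t+1} = r * x_t * (1 - x_t)."""
    x = np.empty(n)
    x[0] = x0
    for t in range(n - 1):
        x[t + 1] = r * x[t] * (1.0 - x[t])
    return x

def make_windows(series, lag=4):
    """Predict x_t from the previous `lag` values of the series."""
    inputs = np.array([series[t - lag:t] for t in range(lag, len(series))])
    targets = series[lag:]
    return inputs, targets

if __name__ == "__main__":
    series = logistic_map_series(1000)
    inputs, targets = make_windows(series)
    print(inputs.shape, targets.shape)  # (996, 4) (996,)
```

The windowed inputs and targets produced here could be substituted for the toy sin(3x) data in the earlier sketch to mimic the kind of chaotic-series prediction task the abstract describes.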