A constructive algorithm to synthesize arbitrarily connected feedforward neural networks

In this work we present a constructive algorithm capable of producing arbitrarily connected feedforward neural network architectures for classification problems. Both the architecture and the synaptic weights of the neural network are defined by the learning procedure. The main purpose is to obtain a parsimonious neural network, in the form of a hybrid and dedicated linear/nonlinear classification model, which can lead to high levels of generalization performance. Though it is neither a global optimization algorithm nor a population-based metaheuristic, the constructive approach has mechanisms to avoid premature convergence, by mixing growing and pruning processes and by implementing a relaxation strategy for the learning error. The synaptic weights of the neural networks produced by the constructive mechanism are adjusted by a quasi-Newton method, and the decision to grow or prune the current network is based on a mutual information criterion. A set of benchmark experiments, including artificial and real datasets, indicates that the new proposal performs favorably when compared with alternative approaches from the literature, such as the traditional MLP, mixtures of heterogeneous experts, cascade correlation networks and an evolutionary programming system, in terms of both classification accuracy and parsimony of the obtained classifier.
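The abstract only sketches the growing/pruning strategy, so the short Python sketch below illustrates the general pattern it describes: network capacity is added step by step, each candidate model is trained with a quasi-Newton optimizer, and growth stops when a mutual-information score no longer improves. This is not the authors' algorithm; the use of scikit-learn's MLPClassifier and mutual_info_classif, the hidden-activation proxy, and the thresholds are illustrative assumptions only.

import numpy as np
from sklearn.feature_selection import mutual_info_classif
from sklearn.neural_network import MLPClassifier

def constructive_fit(X, y, max_hidden=10, mi_threshold=1e-3, patience=2):
    # Grow hidden units one at a time and keep the smallest network whose
    # hidden representation carries the most mutual information about y.
    best_model, best_mi, stalls = None, -np.inf, 0
    for n_hidden in range(1, max_hidden + 1):
        # Quasi-Newton weight adjustment via the L-BFGS solver.
        model = MLPClassifier(hidden_layer_sizes=(n_hidden,), solver="lbfgs",
                              max_iter=500, random_state=0).fit(X, y)
        # ReLU hidden activations of the fitted network (MLPClassifier default).
        hidden = np.maximum(0.0, X @ model.coefs_[0] + model.intercepts_[0])
        # Total mutual information between hidden units and the class labels.
        mi = mutual_info_classif(hidden, y, random_state=0).sum()
        if mi > best_mi + mi_threshold:
            best_model, best_mi, stalls = model, mi, 0  # growing helped: keep it
        else:
            stalls += 1                                 # growing did not help
            if stalls >= patience:
                break                                   # stop growing ("prune back")
    return best_model

A toy usage would be, for example, X, y = sklearn.datasets.make_moons(200, noise=0.2) followed by clf = constructive_fit(X, y). Note that the paper's method searches over arbitrarily connected (not strictly layered) feedforward topologies, so this sketch only captures the grow/evaluate/stop pattern, not the actual architecture search.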


Bibliographic Details
Published in: Neurocomputing (Amsterdam), 2012, Vol. 75 (1), p. 14-32
Main authors: Puma-Villanueva, Wilfredo J., dos Santos, Eurípedes P., Von Zuben, Fernando J.
Format: Article
Language: eng
Subjects: Algorithms; Arbitrary architectures; Cascades; Classification; Construction; Constructive learning; Feedforward; Learning; Networks; Neural networks
Online access: Full text
container_end_page 32
container_issue 1
container_start_page 14
container_title Neurocomputing (Amsterdam)
container_volume 75
creator Puma-Villanueva, Wilfredo J.
dos Santos, Eurípedes P.
Von Zuben, Fernando J.
description In this work we present a constructive algorithm capable of producing arbitrarily connected feedforward neural network architectures for classification problems. Both the architecture and the synaptic weights of the neural network are defined by the learning procedure. The main purpose is to obtain a parsimonious neural network, in the form of a hybrid and dedicated linear/nonlinear classification model, which can lead to high levels of generalization performance. Though it is neither a global optimization algorithm nor a population-based metaheuristic, the constructive approach has mechanisms to avoid premature convergence, by mixing growing and pruning processes and by implementing a relaxation strategy for the learning error. The synaptic weights of the neural networks produced by the constructive mechanism are adjusted by a quasi-Newton method, and the decision to grow or prune the current network is based on a mutual information criterion. A set of benchmark experiments, including artificial and real datasets, indicates that the new proposal performs favorably when compared with alternative approaches from the literature, such as the traditional MLP, mixtures of heterogeneous experts, cascade correlation networks and an evolutionary programming system, in terms of both classification accuracy and parsimony of the obtained classifier.
doi_str_mv 10.1016/j.neucom.2011.05.025
format Article
publisher Elsevier B.V
rights 2011 Elsevier B.V.
fulltext fulltext
identifier ISSN: 0925-2312
ispartof Neurocomputing (Amsterdam), 2012, Vol.75 (1), p.14-32
issn 0925-2312
1872-8286
language eng
recordid cdi_proquest_miscellaneous_1671371150
source Elsevier ScienceDirect Journals
subjects Algorithms
Arbitrary architectures
Cascades
Classification
Construction
Constructive learning
Feedforward
Learning
Networks
Neural networks
title A constructive algorithm to synthesize arbitrarily connected feedforward neural networks
url https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-02-10T18%3A01%3A36IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_cross&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=A%20constructive%20algorithm%20to%20synthesize%20arbitrarily%20connected%20feedforward%20neural%20networks&rft.jtitle=Neurocomputing%20(Amsterdam)&rft.au=Puma-Villanueva,%20Wilfredo%20J.&rft.date=2012&rft.volume=75&rft.issue=1&rft.spage=14&rft.epage=32&rft.pages=14-32&rft.issn=0925-2312&rft.eissn=1872-8286&rft_id=info:doi/10.1016/j.neucom.2011.05.025&rft_dat=%3Cproquest_cross%3E1671371150%3C/proquest_cross%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=1671371150&rft_id=info:pmid/&rft_els_id=S0925231211004061&rfr_iscdi=true