Evolution of functional link networks


Bibliographic Details
Published in: IEEE Transactions on Evolutionary Computation, 2001-02, Vol. 5 (1), pp. 54-65
Authors: Sierra, A., Macias, J.A., Corbacho, F.
Format: Article
Language: English
Abstract
This paper addresses the genetic design of functional link networks (FLN). FLN are high-order perceptrons (HOP) without hidden units. Despite their linear nature, FLN can capture nonlinear input-output relationships, provided that they are fed with an adequate set of polynomial inputs, which are constructed out of the original input attributes. Given this set, the network is very simple to train compared with a multilayer perceptron (MLP). However, finding the optimal subset of units is a difficult problem because of its nongradient nature and the large number of available units, especially for high degrees. Some constructive growing methods have been proposed to address this issue. Here, we rely on the global search capabilities of a genetic algorithm to scan the space of subsets of polynomial units, which is plagued by a host of local minima. By contrast, the quadratic error function of each individual FLN has only one minimum, which makes fitness evaluation practically noiseless. We find that surprisingly simple FLN compare favorably with other more complex architectures derived by means of constructive and evolutionary algorithms on some UCI benchmark data sets. Moreover, our models are especially amenable to interpretation, due to an incremental approach that penalizes complex architectures and starts with a pool of single-attribute FLN.
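The abstract outlines the core procedure: expand the original attributes into polynomial units, fit the resulting linear model (the FLN has no hidden layer, so its quadratic error has a single minimum), and let a genetic algorithm select which units to keep. The sketch below illustrates that idea in Python; it is not the authors' implementation, and the population size, mutation rate, and complexity penalty are illustrative assumptions.

```python
# Minimal sketch (not the paper's code): an FLN over a subset of polynomial
# units, trained by linear least squares, with a toy GA over unit subsets.
import itertools
import numpy as np

def polynomial_units(X, degree):
    """Return all monomial columns of X's attributes up to `degree`
    (including the degree-0 bias), plus the list of exponent tuples."""
    n_attrs = X.shape[1]
    exponents = [e for e in itertools.product(range(degree + 1), repeat=n_attrs)
                 if sum(e) <= degree]
    cols = [np.prod(X ** np.array(e), axis=1) for e in exponents]
    return np.column_stack(cols), exponents

def train_fln(H, y):
    """No hidden units: training is a linear least-squares fit, so the
    quadratic error surface has a single minimum."""
    w, *_ = np.linalg.lstsq(H, y, rcond=None)
    return w

def fitness(mask, H, y):
    """Inverse MSE of the FLN built from the selected units, with a small
    (illustrative) penalty on the number of units."""
    idx = np.flatnonzero(mask)
    if idx.size == 0:
        return 0.0
    Hs = H[:, idx]
    w = train_fln(Hs, y)
    mse = np.mean((Hs @ w - y) ** 2)
    return 1.0 / (mse + 1e-9) - 0.01 * idx.size

def genetic_search(H, y, pop_size=30, generations=50, seed=0):
    """Tiny GA over binary masks selecting subsets of polynomial units."""
    rng = np.random.default_rng(seed)
    n_units = H.shape[1]
    pop = rng.integers(0, 2, size=(pop_size, n_units))
    for _ in range(generations):
        scores = np.array([fitness(ind, H, y) for ind in pop])
        parents = pop[np.argsort(scores)][-pop_size // 2:]   # truncation selection
        cuts = rng.integers(1, n_units, size=pop_size // 2)
        kids = np.array([np.concatenate([parents[i % len(parents)][:c],
                                         parents[(i + 1) % len(parents)][c:]])
                         for i, c in enumerate(cuts)])        # one-point crossover
        flip = rng.random(kids.shape) < 0.05                  # bit-flip mutation
        kids = np.where(flip, 1 - kids, kids)
        pop = np.vstack([parents, kids])
    scores = np.array([fitness(ind, H, y) for ind in pop])
    return pop[np.argmax(scores)]

# Toy usage: recover y = x0*x1 + x0^2 from degree-2 polynomial units.
X = np.random.default_rng(1).uniform(-1, 1, size=(200, 2))
y = X[:, 0] * X[:, 1] + X[:, 0] ** 2
H, exps = polynomial_units(X, degree=2)
best = genetic_search(H, y)
print("selected units:", [exps[i] for i in np.flatnonzero(best)])
```

Because each candidate FLN is fit exactly by least squares, the GA's fitness values are essentially noise-free, which is the property the abstract highlights.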
ISSN: 1089-778X, 1941-0026
DOI: 10.1109/4235.910465