Feature Combiners With Gate-Generated Weights for Classification


Detailed Description

Bibliographic Details
Published in: IEEE Transactions on Neural Networks and Learning Systems, 2013-01, Vol. 24 (1), p. 158-163
Main Authors: Omari, A., Figueiras-Vidal, A. R.
Format: Article
Language: English
Description
Abstract: Using functional weights in a conventional linear combination architecture is a way of obtaining expressive power and represents an alternative to classical trainable and implicit nonlinear transformations. In this brief, we explore this way of constructing binary classifiers, taking advantage of the possibility of generating functional weights by means of a gate with fixed radial basis functions. This particular form of the gate permits training the machine directly with maximal margin algorithms. We call the resulting scheme "feature combiners with gate-generated weights for classification." Experimental results show that these architectures outperform support vector machines (SVMs) and Real AdaBoost ensembles in most of the considered benchmark examples. An increase in the computational design effort due to cross-validation demands is the price to be paid for this advantage. Nevertheless, the operational effort is usually lower than that needed by SVMs.
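The key observation in the abstract is that if the functional weights are produced by a gate of *fixed* radial basis functions, the combiner output f(x) = Σ_k g_k(x) · (v_k · [x, 1]) is linear in the trainable parameters v_k, so the whole machine can be trained with any linear maximal-margin solver on an expanded feature set. The following minimal sketch illustrates that idea on a hypothetical XOR-like toy problem; the data, RBF centres, width `gamma`, and the regularised least-squares fit (a simple stand-in for the maximal-margin algorithms actually used in the paper) are all illustrative assumptions, not the authors' setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy data: XOR-like binary problem, labels in {-1, +1}
X = rng.uniform(-1.0, 1.0, size=(200, 2))
y = np.sign(X[:, 0] * X[:, 1])

# Gate: fixed (untrained) Gaussian radial basis functions
centres = np.array([[-0.5, -0.5], [-0.5, 0.5], [0.5, -0.5], [0.5, 0.5]])
gamma = 2.0  # assumed RBF width

def gate(X):
    """g_k(x) = exp(-gamma * ||x - c_k||^2) for each fixed centre c_k."""
    d2 = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

def combined_features(X):
    """Expand x into the products g_k(x) * [x, 1].

    Since f(x) = sum_k g_k(x) * (v_k . [x, 1]) is linear in the v_k,
    training reduces to fitting a linear model on these features.
    """
    Xa = np.hstack([X, np.ones((len(X), 1))])   # append bias term
    G = gate(X)
    return (G[:, :, None] * Xa[:, None, :]).reshape(len(X), -1)

Z = combined_features(X)

# Regularised least squares as a simple stand-in for maximal-margin
# training (in practice a linear SVM solver would be applied to Z)
lam = 1e-3
w = np.linalg.solve(Z.T @ Z + lam * np.eye(Z.shape[1]), Z.T @ y)

accuracy = np.mean(np.sign(Z @ w) == y)
```

Because the gate is fixed, no nonconvex optimisation is involved; the design-time cost mentioned in the abstract comes from cross-validating the gate parameters (number of centres, widths) rather than from training itself.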
ISSN: 2162-237X, 2162-2388
DOI: 10.1109/TNNLS.2012.2223232