Generalized neural trees for pattern classification
Published in: | IEEE Transactions on Neural Networks and Learning Systems 2002-11, Vol.13 (6), p.1540-1547 |
---|---|
Main authors: | , |
Format: | Article |
Language: | English |
Subjects: | |
Online access: | Order full text |
Abstract: | In this paper, a new neural tree (NT) model, the generalized NT (GNT), is presented. The main novelty of the GNT consists in the definition of a new training rule that performs an overall optimization of the tree. Each time the tree is increased by a new level, the whole tree is reevaluated. The training rule uses a weight correction strategy that takes into account the entire tree structure, and it applies a normalization procedure to the activation values of each node such that these values can be interpreted as probabilities. The weight updating is calculated by minimizing a cost function that represents a measure of the overall probability of correct classification. Significant results on both synthetic and real data have been obtained by comparing the classification performances of multilayer perceptrons (MLPs), NTs, and GNTs. In particular, the GNT model displays good classification performance on training sets with complex distributions. Moreover, its structure lends itself to a straightforward probabilistic interpretation of the pattern classification task and allows small neural trees with good generalization properties to be grown. |
---|---|
ISSN: | 1045-9227 2162-237X 1941-0093 2162-2388 |
DOI: | 10.1109/TNN.2002.804290 |
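The abstract's key idea, normalizing each node's activation values so that they can be read as probabilities and minimizing a cost tied to the overall probability of correct classification, can be illustrated with a minimal sketch. This is not the paper's actual algorithm: the softmax normalization, the `TreeNode` structure, and all weights below are assumptions chosen for illustration.

```python
import math

def softmax(activations):
    # Shift by the max for numerical stability, then normalize so the
    # branch activations sum to 1 and can be read as probabilities.
    m = max(activations)
    exps = [math.exp(a - m) for a in activations]
    total = sum(exps)
    return [e / total for e in exps]

class TreeNode:
    """A splitting node holds one weight vector per child; a leaf holds a label.
    (Hypothetical structure, not the GNT paper's exact formulation.)"""
    def __init__(self, weights=None, children=None, label=None):
        self.weights = weights
        self.children = children
        self.label = label

    def branch_probs(self, x):
        # Linear activation per child branch, normalized to probabilities.
        acts = [sum(wi * xi for wi, xi in zip(w, x)) for w in self.weights]
        return softmax(acts)

def class_prob(node, x, target):
    """Probability that pattern x is routed to a leaf labelled `target`,
    summed over every root-to-leaf path (soft routing)."""
    if node.children is None:
        return 1.0 if node.label == target else 0.0
    return sum(p * class_prob(child, x, target)
               for p, child in zip(node.branch_probs(x), node.children))

def cost(node, x, target):
    # Negative log-probability of the correct class: minimizing this
    # maximizes the overall probability of correct classification.
    return -math.log(class_prob(node, x, target))

# Tiny two-class example with made-up weights.
root = TreeNode(weights=[[1.0, 0.0], [0.0, 1.0]],
                children=[TreeNode(label=0), TreeNode(label=1)])
x = [2.0, 1.0]
p0, p1 = class_prob(root, x, 0), class_prob(root, x, 1)
```

Because every node's branch values are normalized, the class probabilities at the root sum to one, which is what makes the tree's output directly interpretable as a probability distribution over classes.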