Parallel distributed processing with multiple one-output back-propagation neural networks
Saved in:
Main Authors: | , , , , |
Format: | Conference Paper |
Language: | eng |
Subjects: | |
Online Access: | Order full text |
Abstract: | A novel architecture of neural networks with a distributed structure is presented, in which each class in the application has its own one-output backpropagation subnetwork. This one-net-one-class architecture overcomes the drawbacks of conventional backpropagation architectures, which must be completely retrained whenever a class is added. The architecture features complete parallel distributed processing: the network is composed of subnetworks, each a single-output, two-layer backpropagation network that can be trained and retrieved in parallel and independently. The proposed architecture also converges rapidly in both the training phase and the retrieving phase. |
DOI: | 10.1109/ISCAS.1991.176636 |
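
The abstract describes a one-net-one-class organization: one single-output, two-layer backpropagation subnetwork per class, trained and queried independently. The sketch below is a minimal illustration of that idea in NumPy, not the paper's code; all names (`OneOutputSubnet`, `OneNetOneClass`, the hidden-layer size, and the learning rate) are illustrative assumptions.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class OneOutputSubnet:
    """Single-output, two-layer (one hidden layer) backpropagation net."""
    def __init__(self, n_in, n_hidden=8, lr=0.5, rng=None):
        rng = rng or np.random.default_rng(0)
        self.W1 = rng.normal(scale=0.5, size=(n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.w2 = rng.normal(scale=0.5, size=n_hidden)
        self.b2 = 0.0
        self.lr = lr

    def forward(self, X):
        self.h = sigmoid(X @ self.W1 + self.b1)      # hidden activations
        return sigmoid(self.h @ self.w2 + self.b2)   # single output in (0, 1)

    def train(self, X, t, epochs=200):
        """Plain gradient-descent backpropagation on squared error."""
        for _ in range(epochs):
            y = self.forward(X)
            delta_out = (y - t) * y * (1 - y)                              # output delta
            delta_hid = np.outer(delta_out, self.w2) * self.h * (1 - self.h)
            self.w2 -= self.lr * self.h.T @ delta_out / len(X)
            self.b2 -= self.lr * delta_out.mean()
            self.W1 -= self.lr * X.T @ delta_hid / len(X)
            self.b1 -= self.lr * delta_hid.mean(axis=0)

class OneNetOneClass:
    """One subnetwork per class; subnets are trained and queried independently."""
    def __init__(self, n_in):
        self.n_in = n_in
        self.subnets = {}

    def add_class(self, label, X, y):
        """Train only the new class's subnet; existing subnets are untouched."""
        net = OneOutputSubnet(self.n_in)
        net.train(X, (y == label).astype(float))
        self.subnets[label] = net

    def predict(self, X):
        labels = list(self.subnets)
        scores = np.stack([self.subnets[c].forward(X) for c in labels], axis=1)
        return [labels[i] for i in scores.argmax(axis=1)]
```

Under these assumptions, adding a new class means one call to `add_class`, which trains only the new subnetwork; none of the existing subnetworks are retrained, which is the property the abstract contrasts with conventional monolithic backpropagation networks.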