A SOFT-backpropagation algorithm for training neural networks
Main authors: , , ,
Format: Conference proceedings
Language: English
Subjects:
Online access: Order full text
Summary: The backpropagation (BP) algorithm is one of the most common algorithms used to train neural networks. The single offspring technique (SOFT algorithm) is a newer technique (see Likartsis, A. et al., Proc. 9th Int. Conf. on Tools with Artificial Intelligence, p. 32-6, 1997; Yao, X., Proc. IEEE, vol. 87, p. 1425-47, 1999) that applies a genetic algorithm to the training of neural networks and reduces training time compared with the backpropagation algorithm. We introduce a new hybrid SOFT-BP algorithm: the SOFT algorithm is applied first to obtain a good initial weight vector, which is then passed to the backpropagation algorithm to refine the precision of the weight vector until an acceptable error limit is reached. The results show a clear improvement in training speed for the hybrid technique compared with either the backpropagation or the SOFT algorithm alone. We also study the success ratio (the fraction of trials in which the algorithm finds a solution) of the new hybrid algorithm, and suggest a recommended range for the switching error limit at which to switch from the SOFT algorithm to the BP algorithm.
DOI: 10.1109/NRSC.2002.1022647
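The hybrid scheme the abstract describes (evolve a weight vector with a single-offspring genetic step until a switching error limit is reached, then hand that vector to backpropagation for refinement) can be sketched as below. This is a minimal illustration on a toy XOR network, not the paper's implementation: the network size, mutation scale, learning rate, and the two error limits are all assumed values chosen only to make the sketch run.

```python
# Hypothetical sketch of the hybrid SOFT-BP idea; names and parameters
# are illustrative, not taken from the paper.
import numpy as np

rng = np.random.default_rng(0)

# Tiny 2-2-1 network for XOR, all weights packed into one flat vector of 9.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def unpack(w):
    W1 = w[:4].reshape(2, 2); b1 = w[4:6]
    W2 = w[6:8].reshape(2, 1); b2 = w[8:9]
    return W1, b1, W2, b2

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(w):
    W1, b1, W2, b2 = unpack(w)
    h = sigmoid(X @ W1 + b1)      # hidden layer activations
    out = sigmoid(h @ W2 + b2)    # network output
    return h, out

def mse(w):
    _, out = forward(w)
    return float(np.mean((out - y) ** 2))

def soft_phase(w, switch_error, max_gens=5000, sigma=0.3):
    """Single-offspring evolution: mutate the parent once per generation
    and keep the offspring only if it has lower error."""
    err = mse(w)
    for _ in range(max_gens):
        if err <= switch_error:
            break
        child = w + rng.normal(0.0, sigma, size=w.shape)
        child_err = mse(child)
        if child_err < err:
            w, err = child, child_err
    return w, err

def bp_phase(w, target_error, lr=0.5, max_epochs=10000):
    """Plain batch gradient descent (backpropagation) on the MSE loss."""
    for _ in range(max_epochs):
        W1, b1, W2, b2 = unpack(w)
        h, out = forward(w)
        if np.mean((out - y) ** 2) <= target_error:
            break
        # Backpropagate the error through both sigmoid layers.
        d_out = (out - y) * out * (1 - out)
        d_h = (d_out @ W2.T) * h * (1 - h)
        grad = np.concatenate([
            (X.T @ d_h).ravel(), d_h.sum(0),
            (h.T @ d_out).ravel(), d_out.sum(0),
        ]) / len(X)
        w = w - lr * grad
    return w, mse(w)

w0 = rng.normal(0.0, 0.5, size=9)
w1, err1 = soft_phase(w0, switch_error=0.15)  # coarse global search
w2, err2 = bp_phase(w1, target_error=0.01)    # local gradient refinement
```

The `switch_error=0.15` value plays the role of the switching error limit the abstract recommends tuning: too high and BP starts from a poor region, too low and the slow evolutionary phase runs longer than necessary.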