Comparative Study of High Speed Back-Propagation Learning Algorithms
Saved in:
Published in: International Journal of Modern Education and Computer Science, 2014-12, Vol. 6 (12), p. 34-40
Main author:
Format: Article
Language: English
Subjects:
Online access: Full text
Summary: Back-propagation is one of the best-known training algorithms for the multilayer perceptron. However, the rate of convergence in back-propagation learning tends to be relatively slow, which in turn makes it computationally expensive. Over the years many modifications have been proposed to improve the efficiency and convergence speed of the back-propagation algorithm. The main emphasis of this paper is on investigating the performance of improved versions of the back-propagation algorithm in training neural networks. All of them are assessed on different training sets and a comparative analysis is made. Results of computer simulations with standard benchmark problems such as XOR, 3-bit parity, modified XOR and IRIS are presented. The training performance of these algorithms is evaluated in terms of percentage of accuracy and convergence speed.
ISSN: 2075-0161, 2075-017X
DOI: 10.5815/ijmecs.2014.12.05
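As a brief illustration of the baseline algorithm the paper's improved variants are compared against, below is a minimal sketch of plain gradient-descent back-propagation training a small multilayer perceptron on the XOR benchmark mentioned in the abstract. The network size, learning rate, and epoch count are illustrative assumptions, not the configurations used in the paper.

```python
import numpy as np

# Minimal sketch: standard back-propagation on the XOR problem.
# The 2-4-1 architecture, learning rate, and epoch count are
# illustrative assumptions, not taken from the paper.

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Weights and biases for a 2-4-1 multilayer perceptron
W1 = rng.normal(scale=1.0, size=(2, 4))
b1 = np.zeros((1, 4))
W2 = rng.normal(scale=1.0, size=(4, 1))
b2 = np.zeros((1, 1))

lr = 0.5
for epoch in range(20000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Backward pass (squared-error loss, sigmoid derivatives)
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)

    # Gradient-descent weight updates
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0, keepdims=True)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0, keepdims=True)

print(np.round(out, 3))  # outputs should approach [0, 1, 1, 0]
```

The slow convergence noted in the abstract is what the compared high-speed variants (e.g. adaptive learning rates or momentum-based updates) aim to address relative to this plain gradient-descent update rule.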