The research on the relation of self-learning ratio and the convergence speed in BP networks


Detailed description

Saved in:
Bibliographic details
Main authors: Weining Wen, Sixing Liu, Zhaoying Zhou
Format: Conference paper
Language: English
Subjects:
Description
Summary: This paper examines the relation between the self-learning ratio and the convergence speed in a BP network. In theory, true gradient descent is obtained only when the self-learning ratio μ → 0, and the computation then converges to a certain local minimum. However, too small a μ causes slow convergence, while too large a μ may cause divergence. Based on mathematical analysis and computer simulations, the relation formula is given as n = ln[ε/|W(0) − W*|] / ln(1 − μa), where n is the number of iterations, μ is the self-learning ratio, W(0) is the initial weight, W* is the optimal weight, ε is the precision requirement, and a is the slope of the straight line fitted to the gradient. A method for determining a better self-learning ratio is also proposed.
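The iteration-count formula in the abstract can be illustrated on the simplest case it covers: gradient descent on a one-dimensional quadratic error surface whose gradient has constant slope a. This is a hedged sketch (function names and parameter values are mine, not from the paper) comparing the paper's predicted n against the actual number of weight updates:

```python
import math

def iterations_to_converge(w0, w_star, mu, a, eps):
    """Run gradient descent on E(w) = (a/2)*(w - w_star)**2, whose
    gradient is a*(w - w_star), and count steps until |w - w_star| <= eps."""
    w, n = w0, 0
    while abs(w - w_star) > eps:
        w -= mu * a * (w - w_star)   # weight update with self-learning ratio mu
        n += 1
    return n

def predicted_iterations(w0, w_star, mu, a, eps):
    """Paper's estimate: n = ln[eps / |W(0) - W*|] / ln(1 - mu*a),
    rounded up to a whole number of iterations."""
    return math.ceil(math.log(eps / abs(w0 - w_star)) / math.log(1 - mu * a))

# Illustrative values (not from the paper): the error shrinks by a factor
# (1 - mu*a) = 0.8 each step, so both counts should match.
mu, a, eps = 0.1, 2.0, 1e-3
actual = iterations_to_converge(0.0, 1.0, mu, a, eps)
predicted = predicted_iterations(0.0, 1.0, mu, a, eps)
print(actual, predicted)  # prints 31 31
```

On this linear model the prediction is exact, because the error contracts by exactly (1 − μa) per step; for a real BP network, a is only a local fit to the gradient, so the formula is an estimate.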
DOI:10.1109/IMTC.1994.352107