A Kind of Second-Order Learning Algorithm Based on Generalized Cost Criteria in Multi-Layer Feed-Forward Neural Networks

Bibliographic Details
Published in: 北京理工大学学报(英文版), 2003-06, Vol. 12 (2), p. 119-124
Authors: 张长江, 付梦印, 金梅
Format: Article
Language: English
Abstract: A second-order learning algorithm, the recursive approximate Newton algorithm, was given by Karayiannis. That algorithm was simplified during its formulation; in particular, the simplification applied to the Hessian matrix was rather crude, which discarded valuable information and degraded the algorithm's performance to a certain extent. For multi-layer feed-forward neural networks, a second-order back-propagation recursive algorithm based on generalized cost criteria is proposed. It is proved to be equivalent to the recursive Newton algorithm and to have a second-order convergence rate. Its performance and application prospects are analyzed. Extensive simulation experiments indicate that the computational cost of the new algorithm is nearly the same as that of the recursive least squares algorithm, that the algorithm and the selection of network parameters are significant, and that its performance is better than that of the BP algorithm and of the second-order learning algorithm given by Karayiannis.
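The abstract does not spell out the recursive formulation, so the following is only a minimal illustrative sketch of an approximate-Newton (Gauss-Newton style) weight update for a one-hidden-layer feed-forward network under a squared-error cost; the function names, the numerical Jacobian, and the damping term are assumptions introduced for illustration, not the paper's actual algorithm.

```python
# Illustrative sketch only: a damped Gauss-Newton / approximate-Newton update
# for a one-hidden-layer feed-forward network with squared-error cost.
# This is NOT the paper's recursive formulation; all helpers are hypothetical.
import numpy as np

def forward(w, x, n_hidden):
    """Unpack the flat weight vector w and run the network on inputs x."""
    n_in = x.shape[1]
    W1 = w[: n_in * n_hidden].reshape(n_in, n_hidden)
    b1 = w[n_in * n_hidden : n_in * n_hidden + n_hidden]
    W2 = w[n_in * n_hidden + n_hidden : -1].reshape(n_hidden, 1)
    b2 = w[-1]
    h = np.tanh(x @ W1 + b1)           # hidden-layer activations
    return (h @ W2 + b2).ravel()       # one scalar output per sample

def jacobian(w, x, n_hidden, eps=1e-6):
    """Numerical Jacobian of the network outputs with respect to the weights."""
    y0 = forward(w, x, n_hidden)
    J = np.zeros((x.shape[0], w.size))
    for j in range(w.size):
        wp = w.copy()
        wp[j] += eps
        J[:, j] = (forward(wp, x, n_hidden) - y0) / eps
    return J

def newton_step(w, x, t, n_hidden, damping=1e-3):
    """One damped approximate-Newton update: the Hessian is approximated by J^T J."""
    e = forward(w, x, n_hidden) - t            # residuals on the training set
    J = jacobian(w, x, n_hidden)
    H = J.T @ J + damping * np.eye(w.size)     # approximate (damped) Hessian
    return w - np.linalg.solve(H, J.T @ e)

# Toy usage: fit y = sin(x) with a 5-unit hidden layer.
rng = np.random.default_rng(0)
x = rng.uniform(-2, 2, size=(50, 1))
t = np.sin(x).ravel()
n_hidden = 5
w = rng.normal(scale=0.5, size=1 * n_hidden + n_hidden + n_hidden + 1)
for _ in range(30):
    w = newton_step(w, x, t, n_hidden)
print("final MSE:", np.mean((forward(w, x, n_hidden) - t) ** 2))
```

The damping term is a standard Levenberg-Marquardt-style stabilization added here so the linear solve stays well conditioned; how the paper handles the Hessian approximation and its recursive update is exactly the point on which it claims to improve over Karayiannis's simplification.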
ISSN:1004-0579
DOI:10.3969/j.issn.1004-0579.2003.02.002