Perlustration of error surfaces for nonlinear stochastic gradient descent algorithms
Saved in:
Main authors: | , , |
Format: | Conference paper |
Language: | English |
Subjects: | |
Online access: | Order full text |
Summary: | We attempt to explain in more detail the performance of several novel algorithms for nonlinear neural adaptive filtering. Weight trajectories, together with the error surface, give a clear and understandable representation of the family of least mean square (LMS) based, nonlinear gradient descent (NGD), search-then-converge (STC) learning algorithms and the real-time recurrent learning (RTRL) algorithm. Performance is measured on prediction of coloured and nonlinear input. The results provide an alternative qualitative representation of different performance measures for the analysed algorithms. Error surfaces and the corresponding instantaneous prediction errors support the analysis. |
DOI: | 10.1109/NEUREL.2002.1057958 |
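The abstract compares LMS-based linear adaptation with nonlinear gradient descent (NGD) for one-step prediction of coloured input. As an illustrative sketch only (not the paper's implementation; the AR(1) input model, `tanh` nonlinearity, filter order, and step size are assumptions chosen for demonstration), the two weight updates and the instantaneous prediction error they produce can be written as:

```python
import numpy as np

rng = np.random.default_rng(0)

# Coloured input: white noise passed through a simple AR(1) filter
# (an assumed stand-in for the "coloured input" in the abstract).
n = 2000
x = np.zeros(n)
for k in range(1, n):
    x[k] = 0.9 * x[k - 1] + 0.1 * rng.standard_normal()

p = 4     # tap-delay line length (filter order), assumed
mu = 0.1  # step size, assumed

def run(nonlinear):
    """One-step prediction; returns steady-state mean squared error."""
    w = np.zeros(p)
    sq_errs = []
    for k in range(p, n):
        u = x[k - p:k][::-1]            # input vector, most recent tap first
        net = w @ u
        y = np.tanh(net) if nonlinear else net
        e = x[k] - y                    # instantaneous prediction error
        if nonlinear:
            # NGD: gradient descent through the tanh nonlinearity,
            # w <- w + mu * e * tanh'(net) * u
            w += mu * e * (1.0 - np.tanh(net) ** 2) * u
        else:
            # LMS: linear stochastic gradient descent, w <- w + mu * e * u
            w += mu * e * u
        sq_errs.append(e ** 2)
    return float(np.mean(sq_errs[-500:]))

mse_lms = run(nonlinear=False)
mse_ngd = run(nonlinear=True)
print(f"LMS steady-state MSE: {mse_lms:.4f}")
print(f"NGD steady-state MSE: {mse_ngd:.4f}")
```

Plotting `e^2` over time, or the weight trajectory `w` against the error surface, gives exactly the kind of qualitative picture the abstract describes for comparing these algorithm families.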