Estimations of error bounds for neural-network function approximators

Full description

Bibliographic details
Published in: IEEE Transactions on Neural Networks, 1999-03, Vol. 10 (2), p. 217-230
Main authors: Townsend, N.W., Tarassenko, L.
Format: Article
Language: English
Subjects:
Online access: Order full text
Description
Abstract: Neural networks are increasingly used for problems involving function approximation. However, a key limitation of neural methods is the lack of a measure of how much confidence can be placed in output estimates. In the last few years many authors have addressed this shortcoming from various angles, focusing primarily on predicting output bounds as a function of the trained network's characteristics, typically as defined by the Hessian matrix. In this paper the effect of errors or noise in the presented input vector is examined, and a method based on perturbation analysis is presented and demonstrated for determining output bounds from the error in the input vector and from the imperfections in the weight values after training.
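The abstract's idea of propagating input error and weight imperfection through a trained network can be illustrated with a simple first-order bound. The sketch below is not the authors' exact method; it is a minimal worst-case estimate that assumes a feedforward network with 1-Lipschitz activations fixing zero (e.g. tanh), and propagates an input-error bound `eps_x` and per-layer weight-error bounds `eps_w` using spectral norms:

```python
import numpy as np

def perturbation_bound(weights, x_norm, eps_x, eps_w):
    """First-order worst-case bound on the output perturbation of an MLP.

    Illustrative sketch only (assumptions: 1-Lipschitz activations with
    f(0) = 0; bounds measured in the Euclidean / spectral norm).

    weights -- list of layer weight matrices W_1, ..., W_L
    x_norm  -- bound on the input norm  ||x||
    eps_x   -- bound on the input error ||dx||
    eps_w   -- list of bounds on each layer's weight error ||dW_l||
    """
    a, e = x_norm, eps_x
    for W, ew in zip(weights, eps_w):
        s = np.linalg.norm(W, 2)   # spectral norm of the layer matrix
        e = s * e + ew * a         # error growth: ||d(Wx)|| <= ||W||*||dx|| + ||dW||*||x||
        a = s * a                  # activation-norm bound carried to the next layer
    return e
```

For a single identity layer with exact weights, the bound simply returns the input error, e.g. `perturbation_bound([np.eye(2)], 1.0, 0.1, [0.0])` gives 0.1; each imperfect layer then adds a term proportional to its weight error and the incoming activation norm.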
ISSN: 1045-9227, 1941-0093
DOI: 10.1109/72.750542