Extrapolating from neural network models: a cautionary tale

Bibliographic Details
Published in: Journal of Physics G: Nuclear and Particle Physics, 2021-08, Vol. 48 (8), p. 84001
Main authors: Pastore, A, Carnini, M
Format: Article
Language: English
Online access: Full text
Description
Abstract: We present three different methods to estimate error bars on the predictions made using a neural network (NN). All of them represent lower bounds for the extrapolation errors. First, we illustrate the methods with a simple toy model; then we apply them to a realistic case related to nuclear masses. By using theoretical data simulated either with a liquid-drop model or a Skyrme energy density functional, we benchmark the extrapolation performance of the NN in regions of the Segrè chart far from those used for training and validation. Finally, we discuss how error bars can help identify when the extrapolation becomes too uncertain and thus unreliable.
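The record does not spell out the paper's three methods. One common technique in the same spirit is an ensemble of independently initialised networks, whose prediction spread acts as a lower-bound-style error bar that typically widens in the extrapolation region. The sketch below is illustrative only (all hyperparameters, the toy target `sin(x)`, and the ensemble-variance choice are assumptions, not the paper's actual procedure):

```python
import numpy as np

# Toy problem: train on x in [-2, 2], then probe an extrapolation point.
x_train = np.linspace(-2, 2, 40).reshape(-1, 1)
y_train = np.sin(x_train)

def train_net(seed, hidden=20, lr=0.02, epochs=4000):
    """Fit one small one-hidden-layer tanh network by full-batch gradient descent on MSE."""
    r = np.random.default_rng(seed)
    W1 = r.normal(0.0, 1.0, (1, hidden)); b1 = np.zeros(hidden)
    W2 = r.normal(0.0, 1.0, (hidden, 1)); b2 = np.zeros(1)
    for _ in range(epochs):
        h = np.tanh(x_train @ W1 + b1)               # forward pass
        pred = h @ W2 + b2
        grad = 2.0 * (pred - y_train) / len(x_train)  # dMSE/dpred
        # backpropagation through the two layers
        gW2 = h.T @ grad; gb2 = grad.sum(0)
        gz = (grad @ W2.T) * (1.0 - h ** 2)
        gW1 = x_train.T @ gz; gb1 = gz.sum(0)
        W1 -= lr * gW1; b1 -= lr * gb1
        W2 -= lr * gW2; b2 -= lr * gb2
    return W1, b1, W2, b2

def predict(params, x):
    W1, b1, W2, b2 = params
    return np.tanh(x @ W1 + b1) @ W2 + b2

# Ensemble: same data, different random initialisations.
ensemble = [train_net(s) for s in range(10)]

x_test = np.array([[0.0], [4.0]])   # inside vs. outside the training interval
preds = np.stack([predict(p, x_test) for p in ensemble])
mean = preds.mean(axis=0).ravel()
std = preds.std(axis=0).ravel()
# The ensemble spread `std` is small where the data constrain the fit
# and grows at the extrapolation point, flagging unreliable predictions.
```

Because all ensemble members see the same data, their spread underestimates the true extrapolation error, which is consistent with the abstract's remark that such estimates are lower bounds.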
ISSN:0954-3899
1361-6471
DOI:10.1088/1361-6471/abf08a