Optimization of the generalization capability for rainfall–runoff modeling by neural networks: the case of the Lez aquifer (southern France)

Bibliographic Details
Published in: Environmental Earth Sciences, 2012-04, Vol. 65 (8), p. 2365-2375
Main authors: Kong A Siou, Line; Johannet, Anne; Borrell Estupina, Valérie; Pistre, Séverin
Format: Article
Language: English
Keywords:
Online access: Full text
Description
Abstract: Neural networks are increasingly used in the field of hydrology due to their properties of parsimony and universal approximation with regard to nonlinear systems. Nevertheless, as a result of the existence of noise and approximations in hydrological data, which are very significant in some cases, such systems are particularly sensitive to increased model complexity. This dilemma is known in machine learning as bias–variance and can be avoided by suitable regularization methods. Following a presentation of the bias–variance dilemma along with regularization methods such as cross-validation, early stopping and weight decay, an application is provided for simulating and forecasting karst aquifer outflows at the Lez site. The efficiency of this regularization process is thus demonstrated on a nonlinear, partially unknown basin. As a last step, results are presented over the most intense rainfall event found in the database, which allows assessing the capability of neural networks to generalize with rare or extreme events.
ISSN: 1866-6280; 1866-6299
DOI: 10.1007/s12665-011-1450-9
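
The abstract names two of the regularization methods studied, weight decay and early stopping, without showing how they are typically configured. The sketch below is not the paper's model of the Lez aquifer: the synthetic rainfall–runoff data, the single 10-unit hidden layer, and all hyperparameter values are illustrative assumptions, using scikit-learn's MLPRegressor only to make the two techniques concrete.

```python
# Minimal sketch of weight decay and early stopping on a toy rainfall-runoff
# regression. NOT the authors' model: data, architecture, and hyperparameters
# below are assumptions for illustration only.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Toy data: runoff responds nonlinearly to the last few rainfall values,
# with additive noise standing in for measurement error.
n, lags = 2000, 5
rain = rng.gamma(shape=1.5, scale=2.0, size=n + lags)
X = np.column_stack([rain[i:n + i] for i in range(lags)])
runoff = np.tanh(0.3 * X @ np.linspace(1.0, 0.2, lags)) + 0.05 * rng.normal(size=n)

X_train, X_test, y_train, y_test = train_test_split(
    X, runoff, test_size=0.3, random_state=0
)

model = MLPRegressor(
    hidden_layer_sizes=(10,),   # small hidden layer: parsimony limits variance
    alpha=1e-3,                 # L2 penalty on the weights, i.e. weight decay
    early_stopping=True,        # halt training when the validation score stalls
    validation_fraction=0.2,    # held-out split used for early stopping
    max_iter=2000,
    random_state=0,
)
model.fit(X_train, y_train)
print(f"test R^2: {model.score(X_test, y_test):.3f}")
```

Here `alpha` acts as the weight-decay coefficient, penalizing large weights, while `early_stopping` monitors a held-out validation fraction and stops training once its score stops improving; both are standard ways of trading bias against variance when the training data are noisy.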