Deep Learning with Long Short Term Memory Based Sequence-to-Sequence Model for Rainfall-Runoff Simulation
Saved in:
Published in: Water (Basel), 2021-02, Vol. 13 (4), p. 437
Main Authors:
Format: Article
Language: English
Subjects:
Online Access: Full text
Abstract: Accurate runoff prediction is an important task in fields such as agriculture, hydrology, and environmental studies. Recently, with major improvements in computational systems and hardware, deep learning-based approaches have been applied for more accurate runoff prediction. In this study, a long short-term memory model with a sequence-to-sequence structure was applied to hourly runoff prediction from 2015 to 2019 in the Russian River basin, California, USA. The proposed model was used to predict hourly runoff with lead times of 1–6 h using runoff data observed at upstream stations. The model was evaluated in terms of event-based performance using statistical metrics including root mean square error, Nash-Sutcliffe Efficiency, peak runoff error, and peak time error. The results show that the proposed model outperforms support vector machine and conventional long short-term memory models. In addition, the model has the best predictive ability for runoff events, which means that it can be effective for developing short-term flood forecasting and warning systems. The results of this study demonstrate that the deep learning-based approach to hourly runoff forecasting has high predictive power and that the sequence-to-sequence structure is an effective way to improve prediction results.
ISSN: 2073-4441
DOI: 10.3390/w13040437
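To make the approach described in the abstract concrete, the sketch below shows a minimal LSTM encoder-decoder (sequence-to-sequence) model for multi-step runoff forecasting in PyTorch. It is not the authors' implementation: the hidden size, the 72-hour input window, the four upstream gauges, and the autoregressive decoding scheme are illustrative assumptions; only the 1–6 h forecast horizon and the RMSE/NSE evaluation metrics come from the abstract.

```python
# Minimal sketch (not the paper's code) of an LSTM sequence-to-sequence
# model for hourly runoff forecasting with a 6-hour lead time.
import torch
import torch.nn as nn


class Seq2SeqRunoff(nn.Module):
    """Encoder LSTM reads a window of past observations; a decoder LSTM
    unrolls one step per forecast lead time (here 6 hourly steps)."""

    def __init__(self, n_features: int, hidden_size: int = 64, horizon: int = 6):
        super().__init__()
        self.horizon = horizon
        self.encoder = nn.LSTM(n_features, hidden_size, batch_first=True)
        self.decoder = nn.LSTM(1, hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, past_steps, n_features) -> returns (batch, horizon)
        _, (h, c) = self.encoder(x)
        # Start decoding from the most recent observed runoff value
        # (assumed here to be the first input feature).
        step_in = x[:, -1:, :1]
        outputs = []
        for _ in range(self.horizon):
            out, (h, c) = self.decoder(step_in, (h, c))
            pred = self.head(out)             # (batch, 1, 1)
            outputs.append(pred.squeeze(-1))  # (batch, 1)
            step_in = pred                    # feed prediction back in
        return torch.cat(outputs, dim=1)


def rmse(obs: torch.Tensor, sim: torch.Tensor) -> torch.Tensor:
    """Root mean square error between observed and simulated runoff."""
    return torch.sqrt(torch.mean((obs - sim) ** 2))


def nse(obs: torch.Tensor, sim: torch.Tensor) -> torch.Tensor:
    """Nash-Sutcliffe Efficiency: 1 is a perfect fit, 0 is no better
    than predicting the mean of the observations."""
    return 1.0 - torch.sum((obs - sim) ** 2) / torch.sum((obs - obs.mean()) ** 2)


# Example shapes: 72 past hours from 4 upstream gauges -> 6-hour forecast.
model = Seq2SeqRunoff(n_features=4)
past = torch.randn(8, 72, 4)       # batch of 8 input windows
forecast = model(past)             # (8, 6)
```

Feeding the decoder its own previous prediction, as in the loop above, is one common way to unroll a multi-step forecast; teacher forcing during training or a direct multi-output head are equally plausible design choices that the abstract does not specify.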