Developing a Discharge Estimation Model for Ungauged Watershed Using CNN and Hydrological Image

Bibliographic Details
Published in: Water (Basel) 2020-12, Vol. 12 (12), p. 3534
Main authors: Kim, Da Ye; Song, Chul Min
Format: Article
Language: English
Abstract: This study aimed to estimate the discharge in ungauged watersheds. To this end, we departed from the model development methodology of previous studies and used a convolutional neural network (CNN), a deep learning algorithm, together with hydrological images. Because the CNN model was developed to solve classification problems, it is unsuitable in its standard form for simulating discharge, which is a continuous variable. Therefore, the fully connected layer of the CNN model was modified. Moreover, images reflecting hydrological conditions, rather than general photographs, were used as input data for the CNN model. Three study areas with gauged discharge data were selected for model training and testing. Data from two of the three study areas were used to train the CNN model, and data from the third were used to evaluate its prediction performance. The results demonstrate moderately successful prediction of the discharge of an ungauged watershed using the CNN model and hydrological images, so the approach is suitable as a methodology for discharge estimation in ungauged watersheds. We also expect this methodology to be applicable to remote sensing and to real-time discharge simulation using satellite imagery on a global scale or across wide areas.
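
The architectural change the abstract describes, replacing the CNN's classification head with a continuous output, can be illustrated with a short sketch. The PyTorch example below is a minimal, hypothetical illustration, not the authors' published configuration: the layer sizes, input shape, channel count, and MSE loss are all assumptions made for clarity.

```python
# Hypothetical sketch: a CNN whose fully connected head outputs one
# continuous value (discharge) instead of class probabilities.
# All layer sizes and the input shape are illustrative assumptions.
import torch
import torch.nn as nn

class DischargeCNN(nn.Module):
    def __init__(self, in_channels: int = 3):
        super().__init__()
        # Convolutional feature extractor for the hydrological image.
        self.features = nn.Sequential(
            nn.Conv2d(in_channels, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d((4, 4)),
        )
        # Regression head: a single linear output with no softmax,
        # so the network predicts a continuous discharge value.
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 4 * 4, 64),
            nn.ReLU(),
            nn.Linear(64, 1),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.head(self.features(x))

model = DischargeCNN()
loss_fn = nn.MSELoss()            # regression loss, not cross-entropy
x = torch.randn(8, 3, 64, 64)     # batch of hydrological images (assumed shape)
y = torch.randn(8, 1)             # gauged discharge values (placeholder data)
loss = loss_fn(model(x), y)
```

The essential point, per the abstract, is that the final layer emits a single unbounded scalar trained with a regression loss rather than class probabilities.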
ISSN: 2073-4441
DOI: 10.3390/w12123534