A deep learning model with spatio-temporal graph convolutional networks for river water quality prediction

Bibliographic Details
Published in: Water Science & Technology: Water Supply, 2023-07, Vol. 23 (7), pp. 2940-2957
Main Authors: Huan, Juan; Liao, Wenjie; Zheng, Yongchun; Xu, Xiangen; Zhang, Hao; Shi, Bing
Format: Article
Language: English
Online Access: Full text
Abstract: High-precision water quality prediction plays a vital role in preventing and controlling river pollution. However, the highly nonlinear and complex spatio-temporal dependencies of river water pose significant challenges to water quality prediction tasks. To capture the spatial and temporal characteristics of water quality data simultaneously, this paper combines deep learning algorithms for river water quality prediction in the river network area of the Jiangnan Plain, China. A water quality prediction method based on a graph convolutional network (GCN) and a long short-term memory neural network (LSTM), namely the spatio-temporal graph convolutional network model (ST-GCN), is proposed. Specifically, a spatio-temporal graph is constructed from the spatio-temporal correlations between river stations, spatial features in the river network are extracted using the GCN, and the temporal correlation of the water quality data is captured by integrating the LSTM. The model was evaluated using R², MAE, and RMSE, and the experimental results were 0.977, 0.238, and 0.291, respectively. Compared with traditional water quality prediction models, the ST-GCN model shows significantly improved prediction accuracy, better stability, and better generalization ability.
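
The abstract describes a GCN applied over a graph of monitoring stations, followed by an LSTM over the resulting feature sequence. The paper's implementation details are not given in this record, so the following is only a minimal PyTorch sketch of such a GCN-plus-LSTM combination; the class names, parameter names (num_stations, gcn_dim, lstm_dim), and the identity adjacency placeholder are illustrative assumptions, not the authors' code.

import torch
import torch.nn as nn

class GCNLayer(nn.Module):
    # One graph convolution step: H' = ReLU(A_hat @ H @ W), where A_hat is a
    # normalized adjacency matrix over the river monitoring stations.
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)

    def forward(self, x, a_hat):
        # x: (batch, num_stations, in_dim); a_hat: (num_stations, num_stations)
        return torch.relu(self.linear(a_hat @ x))

class STGCNSketch(nn.Module):
    # The GCN extracts spatial features at each time step; an LSTM then models
    # the feature sequence and a linear head predicts the next value per station.
    def __init__(self, num_stations, in_dim, gcn_dim=32, lstm_dim=64):
        super().__init__()
        self.gcn = GCNLayer(in_dim, gcn_dim)
        self.lstm = nn.LSTM(num_stations * gcn_dim, lstm_dim, batch_first=True)
        self.head = nn.Linear(lstm_dim, num_stations)

    def forward(self, x, a_hat):
        # x: (batch, time_steps, num_stations, in_dim)
        b, t, n, f = x.shape
        spatial = self.gcn(x.reshape(b * t, n, f), a_hat)  # spatial features per step
        seq = spatial.reshape(b, t, -1)                     # flatten stations for the LSTM
        out, _ = self.lstm(seq)
        return self.head(out[:, -1])                        # next-step prediction per station

# Example with random data: 8 stations, 12 past time steps, 5 water quality indicators.
model = STGCNSketch(num_stations=8, in_dim=5)
x = torch.randn(4, 12, 8, 5)
a_hat = torch.eye(8)  # placeholder for the normalized station adjacency
print(model(x, a_hat).shape)  # torch.Size([4, 8])

In such a setup, the R², MAE, and RMSE values reported in the abstract (0.977, 0.238, 0.291) would be computed on the held-out test predictions of the trained model.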
ISSN: 1606-9749, 1607-0798
DOI: 10.2166/ws.2023.164