PredRANN: The spatiotemporal attention Convolution Recurrent Neural Network for precipitation nowcasting
Saved in:
Published in: | Knowledge-Based Systems, 2022-03, Vol. 239, p. 107900, Article 107900 |
Main authors: | , , , , |
Format: | Article |
Language: | English |
Subjects: | |
Online access: | Full text |
Abstract: | Precipitation nowcasting is an important task in the fields of transportation, traffic, agriculture, and tourism. One of its main challenges is radar echo map forecasting, which is regarded as a spatiotemporal sequence prediction problem. The prevailing approaches, including the state-of-the-art methods, are all based on ConvRNN, which combines the Convolutional Neural Network (CNN) and the Recurrent Neural Network (RNN). However, the feature flow delivered through multi-layer CNNs and RNNs is usually accompanied by information loss. Consequently, these algorithms fail to model long-term dependencies, heavy rainfalls tend to be underestimated, and the increasing intensity trend of heavy rainfalls cannot be predicted. In this paper, we propose the PredRANN model, which embeds a Temporal Attention Module (TAM) and a Layer Attention Module (LAM) into the prediction unit to preserve more representation along the temporal and spatial dimensions, respectively. Extensive experimental results on both synthetic and real-world data sets demonstrate the effectiveness and superiority of the proposed method over state-of-the-art methods. Ablation studies also validate the developed TAM and LAM components. To reproduce the results, we release the source code at: https://github.com/luochuyao/PredRANN. |
ISSN: | 0950-7051; 1872-7409 |
DOI: | 10.1016/j.knosys.2021.107900 |
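
The abstract describes embedding temporal and layer attention into a ConvRNN prediction unit to counter information loss across time steps and layers. As a rough illustration of the general idea only (not the paper's implementation; for that, see the released code at https://github.com/luochuyao/PredRANN), the sketch below shows a hypothetical temporal-attention step in which the current hidden state attends over cached hidden states from earlier time steps. All module and parameter names here are assumptions.

```python
# Minimal sketch of temporal attention over cached ConvRNN hidden states.
# NOT the paper's PredRANN implementation; names and internals are hypothetical.
import torch
import torch.nn as nn
import torch.nn.functional as F


class TemporalAttentionSketch(nn.Module):
    """Let the current hidden state attend over hidden states from past time steps."""

    def __init__(self, channels: int):
        super().__init__()
        # 1x1 convolutions produce query/key/value maps per spatial location.
        self.query = nn.Conv2d(channels, channels, kernel_size=1)
        self.key = nn.Conv2d(channels, channels, kernel_size=1)
        self.value = nn.Conv2d(channels, channels, kernel_size=1)

    def forward(self, h_t: torch.Tensor, history: list) -> torch.Tensor:
        # h_t: (B, C, H, W); history: list of past hidden states with the same shape.
        b, c, h, w = h_t.shape
        q = self.query(h_t).flatten(2)                         # (B, C, H*W)
        keys = torch.stack([self.key(x) for x in history])     # (T, B, C, H, W)
        vals = torch.stack([self.value(x) for x in history])   # (T, B, C, H, W)
        keys = keys.flatten(3).permute(1, 0, 2, 3)             # (B, T, C, H*W)
        vals = vals.flatten(3).permute(1, 0, 2, 3)             # (B, T, C, H*W)
        # Per-location attention weights over the T past time steps.
        scores = torch.einsum('bcn,btcn->btn', q, keys) / c ** 0.5  # (B, T, H*W)
        weights = F.softmax(scores, dim=1)
        # Aggregate past information and fuse it with the current hidden state.
        context = torch.einsum('btn,btcn->bcn', weights, vals)      # (B, C, H*W)
        return h_t + context.view(b, c, h, w)


if __name__ == '__main__':
    att = TemporalAttentionSketch(channels=8)
    past = [torch.randn(2, 8, 16, 16) for _ in range(5)]
    out = att(torch.randn(2, 8, 16, 16), past)
    print(out.shape)  # torch.Size([2, 8, 16, 16])
```

A layer attention module could follow the same pattern, attending over hidden states from other layers at the current time step rather than over earlier time steps; the actual PredRANN design should be taken from the paper and its released source code.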