Spatio-Temporal Network for Sea Fog Forecasting


Bibliographic Details
Published in: Sustainability 2022-12, Vol. 14 (23), p. 16163
Authors: Park, Jinhyeok; Lee, Young Jae; Jo, Yongwon; Kim, Jaehoon; Han, Jin Hyun; Kim, Kuk Jin; Kim, Young Taeg; Kim, Seoung Bum
Format: Article
Language: English
Online access: Full text
Description
Abstract: Sea fog can seriously affect schedules and safety by reducing visibility during marine transportation. Therefore, forecasting sea fog is an important issue in preventing accidents. Recently, several deep learning methods have been applied to forecast sea fog from time series data consisting of meteorological and oceanographic observations, or from image data. However, these methods use only a single image without considering meteorological and temporal characteristics. In this study, we propose a multi-modal learning method to improve the forecasting accuracy of sea fog using convolutional neural network (CNN) and gated recurrent unit (GRU) models. The CNN and GRU extract useful features from closed-circuit television (CCTV) images and multivariate time series data, respectively. CCTV images and time series data collected at Daesan Port in South Korea from 1 March 2018 to 14 February 2021 by the Korea Hydrographic and Oceanographic Agency (KHOA) were used to evaluate the proposed method. We compare the proposed method with deep learning methods that consider only temporal information or only spatial information. The results indicate that the proposed method, which uses temporal and spatial information simultaneously, achieves superior accuracy.
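The two-branch design described in the abstract can be sketched roughly as follows. This is a minimal, hypothetical PyTorch sketch, not the authors' released code: the class name FogNet, the layer sizes, the number of time-series features, the sequence length, and the binary fog/no-fog output are all assumptions made for illustration. Only the overall idea, a CNN branch for CCTV images and a GRU branch for multivariate observations whose features are concatenated before classification, follows the abstract.

```python
# Hypothetical sketch (not the paper's code): CNN branch for a CCTV frame,
# GRU branch for meteorological/oceanographic time series, late fusion by
# concatenation, then a fog / no-fog classifier. All sizes are assumptions.
import torch
import torch.nn as nn

class FogNet(nn.Module):
    def __init__(self, n_series_features=8, hidden=64, n_classes=2):
        super().__init__()
        # CNN branch: extracts spatial features from a single CCTV image.
        self.cnn = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),          # -> (batch, 32)
        )
        # GRU branch: extracts temporal features from the observation sequence.
        self.gru = nn.GRU(n_series_features, hidden, batch_first=True)
        # Fusion head: concatenates both feature vectors and classifies.
        self.head = nn.Linear(32 + hidden, n_classes)

    def forward(self, image, series):
        img_feat = self.cnn(image)                 # (batch, 32)
        _, h_n = self.gru(series)                  # h_n: (1, batch, hidden)
        fused = torch.cat([img_feat, h_n[-1]], dim=1)
        return self.head(fused)

# Toy usage: a batch of 4 RGB frames and 24-step sequences of 8 observed variables.
model = FogNet()
logits = model(torch.randn(4, 3, 224, 224), torch.randn(4, 24, 8))
print(logits.shape)  # torch.Size([4, 2])
```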
ISSN: 2071-1050
DOI: 10.3390/su142316163