Weather Visibility Prediction Based on Multimodal Fusion
Published in: | IEEE Access, 2019, Vol. 7, pp. 74776-74786 |
Main Authors: | , , , , , , , |
Format: | Article |
Language: | eng |
Online Access: | Full Text |
Abstract: | Visibility affects all forms of traffic: roads, sailing, and aviation, so visibility prediction is valuable for guiding production and daily life. Unlike weather prediction, which relies solely on atmospheric factors, the factors affecting meteorological visibility are more complicated and include, for example, air pollution caused by factory exhaust emissions. However, current visibility prediction mostly uses numerical prediction methods similar to those of weather prediction. In this paper, we propose a weather visibility prediction system built on multimodal fusion, combining an advanced numerical prediction model with a method for emission detection. We trained the fusion model for numerical prediction with state-of-the-art gradient-boosting regression algorithms, XGBoost and LightGBM. By estimating factory emissions with a traditional detector applied to Landsat-8 satellite images, we add the estimation results to assist the prediction. Tested on atmospheric data from various meteorological observation stations in the Beijing-Tianjin-Hebei region from 2002 to 2018, our numerical prediction model proved more accurate than other existing methods, and after fusion with the emission detection method, the accuracy of our visibility prediction system improved further. |
ISSN: | 2169-3536 |
DOI: | 10.1109/ACCESS.2019.2920865 |
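
The abstract describes training gradient-boosting regressors (XGBoost, LightGBM) on atmospheric features, with a satellite-derived emission estimate fused in to assist the prediction. As a minimal, dependency-free sketch of that kind of boosted-tree regression, the following hand-rolled stump-based gradient booster treats the emission estimate simply as one extra input feature; the feature names, synthetic data, and fusion-as-feature wiring are illustrative assumptions, not the authors' actual pipeline or data.

```python
import random

# Minimal gradient-boosting regressor built from decision stumps,
# illustrating the style of boosted-tree regression (cf. XGBoost/LightGBM)
# named in the abstract. All data below is synthetic.

def fit_stump(X, residuals):
    """Find the (feature, threshold, left_mean, right_mean) split that
    minimizes squared error on the current residuals."""
    best, best_err = None, float("inf")
    for j in range(len(X[0])):
        for t in sorted({x[j] for x in X}):
            left = [r for x, r in zip(X, residuals) if x[j] <= t]
            right = [r for x, r in zip(X, residuals) if x[j] > t]
            if not left or not right:
                continue
            lm, rm = sum(left) / len(left), sum(right) / len(right)
            err = (sum((r - lm) ** 2 for r in left)
                   + sum((r - rm) ** 2 for r in right))
            if err < best_err:
                best_err, best = err, (j, t, lm, rm)
    return best

def stump_predict(stump, x):
    j, t, lm, rm = stump
    return lm if x[j] <= t else rm

def gbm_fit(X, y, n_rounds=60, lr=0.1):
    """Fit stumps to residuals, shrinking each contribution by lr."""
    base = sum(y) / len(y)
    preds, stumps = [base] * len(y), []
    for _ in range(n_rounds):
        stump = fit_stump(X, [yi - p for yi, p in zip(y, preds)])
        if stump is None:
            break
        stumps.append(stump)
        preds = [p + lr * stump_predict(stump, x) for p, x in zip(preds, X)]
    return base, lr, stumps

def gbm_predict(model, x):
    base, lr, stumps = model
    return base + lr * sum(stump_predict(s, x) for s in stumps)

# Synthetic "multimodal" rows: [humidity, emission_index, wind]. The
# emission_index stands in for a satellite-derived emission estimate
# fused in as an additional feature (an assumption for illustration).
random.seed(0)
X = [[random.random(), random.random(), random.random()] for _ in range(120)]
y = [10.0 - 6.0 * h - 3.0 * e + 0.5 * w for h, e, w in X]  # visibility (km)

model = gbm_fit(X, y)
mean_y = sum(y) / len(y)
baseline_mse = sum((yi - mean_y) ** 2 for yi in y) / len(y)
trained_mse = sum((gbm_predict(model, x) - yi) ** 2
                  for x, yi in zip(X, y)) / len(y)
```

In practice one would use the XGBoost or LightGBM libraries directly rather than this toy booster; the sketch only shows how an emission estimate can enter a boosted regression as one more input column.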