Dark-channel based attention and classifier retraining for smoke detection in foggy environments
Published in: Digital signal processing, 2022-04, Vol. 123, p. 103454, Article 103454
Main authors: , , , , ,
Format: Article
Language: English
Online access: Full text
Abstract: Vision-based smoke detection is an important task in practical applications. In recent years, deep learning methods with reasonable computational complexity and accuracy have been widely applied to smoke detection. However, most deep learning methods and related datasets only consider smoke detection in normal weather, which leads to a drastic performance decrease when detecting smoke in foggy environments. Existing algorithms build datasets by synthesizing fog on images collected in normal environments. To improve generalization, however, a smoke dataset containing real smoke images collected in foggy environments, together with corresponding algorithms, is needed. Since smoke is uncommon on foggy days, the class distribution of such a dataset can be imbalanced, which limits the achievable smoke detection performance. In this paper, we establish a general smoke detection dataset for foggy environments with diverse real-world samples. Furthermore, we develop a novel smoke detection method based on dark-channel assisted mixed attention. We also introduce a two-stage training strategy to handle the imbalanced data. Experimental results demonstrate the importance of the dataset and the effectiveness of the proposed method.
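The dark-channel cue referenced in the title comes from the dark channel prior: in a haze-free outdoor image most local patches contain some pixel that is nearly zero in at least one RGB channel, so fog raises the dark channel. A minimal sketch of how such a map could be computed and used as a spatial attention gate is given below; the patch size, the sigmoid gating, and the `DarkChannelAttention` module are illustrative assumptions, not the paper's exact "dark-channel assisted mixed attention" design.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def dark_channel(img: torch.Tensor, patch: int = 15) -> torch.Tensor:
    """Dark channel prior: per-pixel min over RGB, then a local min filter.

    img: (B, 3, H, W) tensor with values in [0, 1].
    Returns a (B, 1, H, W) map; fog/haze pushes the dark channel up.
    """
    min_rgb = img.min(dim=1, keepdim=True).values          # (B, 1, H, W)
    pad = patch // 2
    # A min filter can be written as a negated max-pool.
    return -F.max_pool2d(-min_rgb, kernel_size=patch, stride=1, padding=pad)

class DarkChannelAttention(nn.Module):
    """Hypothetical gating module: modulate features with the dark-channel map."""
    def __init__(self, channels: int):
        super().__init__()
        self.proj = nn.Conv2d(1, channels, kernel_size=1)

    def forward(self, feat: torch.Tensor, img: torch.Tensor) -> torch.Tensor:
        dark = dark_channel(img)
        # Resize the cue to the feature resolution and gate the features.
        dark = F.interpolate(dark, size=feat.shape[-2:],
                             mode="bilinear", align_corners=False)
        return feat * torch.sigmoid(self.proj(dark))
```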
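The two-stage strategy for imbalanced data suggested by the title ("classifier retraining") is commonly realized by learning representations on the natural, imbalanced distribution first and then freezing the backbone and retraining only the classifier head on class-balanced batches. The sketch below assumes that interpretation; the loaders, model objects, and hyperparameters are placeholders rather than the paper's actual configuration.

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, WeightedRandomSampler

def class_balanced_loader(dataset, labels, batch_size=64):
    """Resample so every class is drawn with roughly equal probability."""
    labels = torch.as_tensor(labels)
    counts = torch.bincount(labels)
    weights = 1.0 / counts[labels].float()
    sampler = WeightedRandomSampler(weights, num_samples=len(labels))
    return DataLoader(dataset, batch_size=batch_size, sampler=sampler)

def train_two_stage(backbone: nn.Module, classifier: nn.Module,
                    stage1_loader, stage2_loader, epochs=(30, 10)):
    criterion = nn.CrossEntropyLoss()

    # Stage 1: joint training on the imbalanced data (instance-balanced sampling).
    params = list(backbone.parameters()) + list(classifier.parameters())
    opt = torch.optim.SGD(params, lr=0.01, momentum=0.9)
    for _ in range(epochs[0]):
        for x, y in stage1_loader:
            loss = criterion(classifier(backbone(x)), y)
            opt.zero_grad(); loss.backward(); opt.step()

    # Stage 2: freeze the backbone, retrain only the classifier head
    # on class-balanced batches.
    for p in backbone.parameters():
        p.requires_grad_(False)
    opt = torch.optim.SGD(classifier.parameters(), lr=0.01, momentum=0.9)
    for _ in range(epochs[1]):
        for x, y in stage2_loader:
            with torch.no_grad():
                feat = backbone(x)
            loss = criterion(classifier(feat), y)
            opt.zero_grad(); loss.backward(); opt.step()
```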
ISSN: 1051-2004, 1095-4333
DOI: 10.1016/j.dsp.2022.103454