Fire smoke detection based on target-awareness and depthwise convolutions



Bibliographic Details
Published in: Multimedia tools and applications 2021-07, Vol.80 (18), p.27407-27421
Main Authors: Zhao, Yunji; Zhang, Haibo; Zhang, Xinliang; Chen, Xiangjun
Format: Article
Language: English
Subjects:
Online Access: Full text
Description
Summary: Because smoke usually appears before flames arise, fire smoke detection is important for early warning systems. This paper proposes TADS (Target-Awareness and Depthwise Separability), an algorithm based on target-awareness and depthwise separable convolutions. Current deep learning methods rely on convolutional neural networks pre-trained on large, diverse datasets to perform generic object recognition. For smoke detection, however, collecting large quantities of smoke data is difficult, so only small sample sets are available. The underlying premise is that the objects of interest can belong to arbitrary classes and take arbitrary forms; deep feature maps produced by a target-aware pre-trained network are therefore used to model these arbitrarily shaped objects and distinguish them from unpredictable, complex backgrounds. The authors apply this scheme to smoke detection. Replacing iterative training with a depthwise separable convolution using a fixed kernel improves the algorithm's speed, meeting the real-time detection requirements imposed by fire spread. Experimental results demonstrate that the proposed algorithm detects early smoke in real time and outperforms state-of-the-art methods in both accuracy and speed.
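The depthwise separable operation mentioned in the abstract factors a standard convolution into a per-channel spatial filter followed by a 1x1 channel-mixing step, which is what makes it cheap enough for real-time use. The sketch below is a minimal NumPy illustration of that factorization only, not the paper's TADS implementation; all function and variable names are hypothetical.

```python
import numpy as np

def depthwise_separable_conv(x, depthwise_k, pointwise_k):
    """Depthwise convolution (one spatial kernel per input channel)
    followed by a 1x1 pointwise convolution that mixes channels.

    x           : (H, W, C_in) input feature map
    depthwise_k : (kh, kw, C_in) one spatial kernel per channel
    pointwise_k : (C_in, C_out) 1x1 convolution weights
    """
    H, W, C_in = x.shape
    kh, kw, _ = depthwise_k.shape
    out_h, out_w = H - kh + 1, W - kw + 1  # 'valid' padding, stride 1

    # Depthwise stage: each channel is filtered independently,
    # so the cost scales with C_in rather than C_in * C_out.
    dw = np.zeros((out_h, out_w, C_in))
    for c in range(C_in):
        for i in range(out_h):
            for j in range(out_w):
                dw[i, j, c] = np.sum(x[i:i + kh, j:j + kw, c] * depthwise_k[:, :, c])

    # Pointwise stage: a 1x1 convolution is just a matrix product
    # over the channel dimension.
    return dw @ pointwise_k

# Example: a 5x5 two-channel input, 3x3 depthwise kernels, four output channels.
x = np.ones((5, 5, 2))
out = depthwise_separable_conv(x, np.ones((3, 3, 2)), np.ones((2, 4)))
# out has shape (3, 3, 4)
```

Compared with a standard convolution, whose multiply count is roughly kh*kw*C_in*C_out per output position, the factored form costs kh*kw*C_in + C_in*C_out, which is the source of the speedup the abstract refers to.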
ISSN:1380-7501
1573-7721
DOI:10.1007/s11042-021-11037-1