FireClassNet: a deep convolutional neural network approach for PJF fire images classification
Published in: Neural Computing & Applications, 2023-09, Vol. 35 (26), p. 19069-19085
Main authors: , ,
Format: Article
Language: English
Subjects:
Online access: Full text
Abstract: In recent years, the world has faced outbreaks of fire, a serious problem that causes heavy casualties and considerable destruction. It is therefore essential to detect fire in video surveillance scenes early, accurately, and reliably in order to overcome the common weaknesses of available flame detection methods. Exploiting recent deep learning (DL) methods within modern surveillance systems has become a major challenge. This paper therefore introduces a novel DL-based approach to fire image detection, built on a convolutional neural network (CNN) architecture designed from scratch and named the Fire Classification Network, "FireClassNet." First, the input frames are preprocessed to highlight fire regions. They are then fed into the proposed FireClassNet to train the classification model. The presented network contains a small number of layers compared with existing CNNs, and therefore fewer parameters. Experiments show the effectiveness of the produced model on the constructed dataset, raising accuracy to 99.73%. The developed model is also shown to clearly outperform related methods and baseline CNN architectures for fire frame classification.
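The abstract only outlines the pipeline: frames are preprocessed to highlight fire regions, then classified by a deliberately shallow CNN. The PyTorch sketch below illustrates what such a compact fire/non-fire classifier could look like; the layer counts, kernel sizes, input resolution, and two-class output are illustrative assumptions, not the paper's actual FireClassNet architecture.

```python
# Minimal sketch of a compact fire/non-fire CNN in the spirit of
# "FireClassNet". All layer sizes are assumptions for illustration;
# the abstract does not specify the network's configuration.
import torch
import torch.nn as nn

class FireClassNetSketch(nn.Module):
    def __init__(self, num_classes: int = 2):
        super().__init__()
        # A shallow feature extractor: few layers, hence few parameters.
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(inplace=True),
            nn.MaxPool2d(2),                 # 224 -> 112
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(inplace=True),
            nn.MaxPool2d(2),                 # 112 -> 56
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(inplace=True),
            nn.AdaptiveAvgPool2d(1),         # global average pooling
        )
        self.classifier = nn.Linear(64, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)
        return self.classifier(x.flatten(1))

if __name__ == "__main__":
    model = FireClassNetSketch()
    # A batch of frames, assumed already preprocessed to highlight fire
    # regions (the paper's preprocessing step is not detailed in the abstract).
    frames = torch.randn(4, 3, 224, 224)
    logits = model(frames)
    print(logits.shape)  # torch.Size([4, 2]): fire vs. non-fire scores
```

Global average pooling before the final linear layer keeps the parameter count low regardless of input resolution, which matches the abstract's emphasis on a small network with few parameters.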
ISSN: 0941-0643, 1433-3058
DOI: 10.1007/s00521-023-08750-3