Effective Dual-Feature Fusion Network for Transmission Line Detection


Bibliographic Details
Published in: IEEE Sensors Journal, 2024-01, Vol. 24 (1), pp. 101-109
Authors: Zhou, Wujie; Ji, Chuanming; Fang, Meixin
Format: Article
Language: English
Abstract: In recent years, fused infrared and visible-light computer vision methods based on fully convolutional networks (FCNs) have achieved remarkable results owing to the complementary nature of the two modalities. However, the growing demands of computer vision tasks continually push parameter counts upward, leading to computational costs too high for deployment on mobile devices. Hence, building a small yet capable lightweight student model, supervised and trained by a large teacher model with superior performance, remains a key requirement. To help move the field in this direction, this study proposes a dual-feature fusion student network with knowledge distillation (KD), EDFNet-S*, for transmission line detection (TLD) scene parsing. Infrared data complement visible-image features and provide accurate edge and contour information for cross fusion, and a cascaded U-shaped shunted architecture integrates the features of each hierarchy. Extensive experiments showed that the proposed EDFNet-S* achieved excellent performance on TLD tasks while operating with only 18.3M parameters, compared with the teacher network's (EDFNet-T) 86.93M; total floating-point operations (FLOPs) were likewise reduced from 25.23G to 8.43G.
ISSN: 1530-437X, 1558-1748
DOI: 10.1109/JSEN.2023.3333322
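
The record itself ships no code, but the abstract names two concrete techniques: dual-stream fusion of infrared and visible features, and training a compact student under a large teacher via knowledge distillation. The Python (PyTorch) sketch below illustrates both in miniature. It is a hypothetical illustration, not the paper's EDFNet: `DualStreamFusion`, `kd_loss`, and every layer size and hyperparameter are assumed stand-ins.

```python
# Hypothetical sketch only: a toy dual-stream fusion segmenter plus a
# standard Hinton-style knowledge-distillation loss. Nothing below is
# the paper's actual EDFNet; names and sizes are illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F

class DualStreamFusion(nn.Module):
    """Encode RGB and infrared inputs separately, then fuse by
    concatenation + 1x1 convolution (a generic stand-in for the
    paper's cross-fusion module)."""
    def __init__(self, ch=32, num_classes=2):
        super().__init__()
        self.rgb_enc = nn.Conv2d(3, ch, 3, padding=1)   # visible stream
        self.ir_enc = nn.Conv2d(1, ch, 3, padding=1)    # infrared stream
        self.fuse = nn.Conv2d(2 * ch, ch, 1)            # cross-modal fusion
        self.head = nn.Conv2d(ch, num_classes, 1)       # per-pixel logits

    def forward(self, rgb, ir):
        f = torch.cat([F.relu(self.rgb_enc(rgb)),
                       F.relu(self.ir_enc(ir))], dim=1)
        return self.head(F.relu(self.fuse(f)))

def kd_loss(student_logits, teacher_logits, targets, T=4.0, alpha=0.5):
    """Distillation loss: KL divergence between temperature-softened
    teacher and student distributions, blended with hard-label
    cross-entropy. The T*T factor restores the gradient scale."""
    soft = F.kl_div(F.log_softmax(student_logits / T, dim=1),
                    F.softmax(teacher_logits / T, dim=1),
                    reduction="batchmean") * (T * T)
    hard = F.cross_entropy(student_logits, targets)
    return alpha * soft + (1.0 - alpha) * hard

if __name__ == "__main__":
    rgb, ir = torch.randn(2, 3, 64, 64), torch.randn(2, 1, 64, 64)
    targets = torch.randint(0, 2, (2, 64, 64))
    student = DualStreamFusion(ch=16)    # small "student"
    teacher = DualStreamFusion(ch=64)    # larger "teacher"
    with torch.no_grad():                # teacher stays frozen during KD
        t_logits = teacher(rgb, ir)
    loss = kd_loss(student(rgb, ir), t_logits, targets)
    loss.backward()
    print(f"KD loss: {loss.item():.4f}")
```

The alpha/T weighting follows the standard distillation recipe (Hinton et al.); the paper's actual cross-fusion and cascaded U-shaped shunted modules would replace the naive concat-plus-1x1-conv fusion used here.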