Live line strain clamp's DR image anomaly detection based on unsupervised learning


Bibliographic details
Published in: IET Generation, Transmission & Distribution, 2023-11, Vol. 17 (21), pp. 4717-4734
Authors: Haoliang Zheng, Zhiwei Jia, Yuting Li, Rongjie Wang, Wenguang Zhou
Format: Article
Language: English
Description
Abstract: Because of the high-risk working environment of high-voltage transmission lines, defect samples of strain clamps cannot be collected fully and completely. As a result, deep learning methods based on defect-sample labels cannot effectively identify all abnormalities. To solve this problem, an unsupervised anomaly detection method based on knowledge distillation is proposed, which requires only a small number of normal samples to drive the model for anomaly detection. ResNet serves as the framework of the teacher-student model, and the feature activation layers after the ResBlocks are used for knowledge transfer. Residual-assisted attention and pyramid-splitting attention are used to enhance the model's spatial perception and its use of multi-scale information. Because the model transmits only the information of normal samples, it is sensitive to abnormal samples. The proposed model outperformed the baseline by 23% overall and by 78% on individual categories on the MVTec AD (Anomaly Detection) dataset, outperformed the baseline by 45% overall and by 10% on individual categories on CIFAR-10, and is also reliable on MNIST and Fashion-MNIST. The method performs best (82.71%) among existing methods on the self-built dataset.
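The scoring idea behind such teacher-student knowledge distillation can be illustrated in a minimal pure-Python sketch: the student is trained to reproduce the teacher's features only on normal samples, so at test time a large teacher-student feature discrepancy flags an anomaly. The function names and the cosine-distance choice below are illustrative assumptions, not taken from the paper.

```python
import math

def cosine_distance(a, b):
    """1 minus cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return 1.0 - dot / (norm_a * norm_b)

def anomaly_map(teacher_feats, student_feats):
    """Per-position discrepancy between teacher and student features.

    Each element of the input lists is the feature vector at one
    spatial position of the activation map after a ResBlock.
    """
    return [cosine_distance(t, s) for t, s in zip(teacher_feats, student_feats)]

def image_score(teacher_feats, student_feats):
    """Image-level anomaly score: the largest local discrepancy."""
    return max(anomaly_map(teacher_feats, student_feats))

# Toy example: the student matches the teacher on a normal sample,
# but diverges at one position on an anomalous sample.
teacher = [[1.0, 0.0], [0.0, 1.0]]
student_normal = [[1.0, 0.0], [0.0, 1.0]]
student_anomalous = [[1.0, 0.0], [1.0, 0.0]]

print(image_score(teacher, student_normal))     # near 0: normal
print(image_score(teacher, student_anomalous))  # large: anomaly
```

In practice the distance would be computed over real ResNet activation maps at several scales; the per-position map also localizes the defect, which matters for DR images of strain clamps.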
ISSN: 1751-8687, 1751-8695
DOI: 10.1049/gtd2.12756