Unsupervised fabric defect detection with high-frequency feature mapping


Detailed description

Bibliographic details
Published in: Multimedia tools and applications, 2024-02, Vol. 83 (7), p. 21615-21632
Main authors: Wan, Da, Gao, Can, Zhou, Jie, Shen, Xinrui, Shen, Linlin
Format: Article
Language: English
Online access: Full text
Description
Summary: Fabric defect detection is an important and necessary step in textile mills, and many deep learning-based methods have been proposed for defect detection and segmentation in fabric images. However, these methods still suffer from the scarcity of labeled fabric images, which are labor-intensive and costly to obtain, and from the difficulty of finding discriminative feature representations of fabric defects. To address these problems, a novel unsupervised High-Frequency Feature Mapping Model (HFFMM) is proposed for fabric defect detection. First, to capture the rich high-frequency information in defective images, a multi-scale high-frequency information extraction module is designed to generate high-frequency fabric images. Then, to capture the differences between the original and high-frequency features, a query-key attention module is proposed that produces a fused mapping matrix, improving the mapping capability. Finally, the two features transformed by the mapping matrix are compared to detect defects. Extensive experiments on four public fabric datasets show that our method outperforms other state-of-the-art methods, especially on fabric images with regular textures, achieving average AUC improvements of 5.3% in detection and 2.9% in segmentation.
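The abstract does not specify how the multi-scale high-frequency extraction module works internally. As an illustrative sketch only (not the authors' implementation), high-frequency content can be obtained by subtracting Gaussian-blurred copies of an image at several scales and averaging the resulting high-pass residuals; all function names and the choice of scales below are hypothetical:

```python
import numpy as np

def gaussian_kernel(sigma, radius=None):
    # 1-D Gaussian kernel, normalized to sum to 1
    if radius is None:
        radius = int(3 * sigma)
    x = np.arange(-radius, radius + 1)
    k = np.exp(-x**2 / (2 * sigma**2))
    return k / k.sum()

def blur(img, sigma):
    # Separable Gaussian blur with edge padding, so the output
    # has the same shape as the input.
    k = gaussian_kernel(sigma)
    r = len(k) // 2
    padded = np.pad(img, r, mode="edge")
    rows = np.apply_along_axis(lambda m: np.convolve(m, k, mode="valid"), 1, padded)
    return np.apply_along_axis(lambda m: np.convolve(m, k, mode="valid"), 0, rows)

def multi_scale_high_frequency(img, sigmas=(1.0, 2.0, 4.0)):
    # High-frequency map at each scale: original minus its blurred copy
    # (a difference-of-Gaussians style high-pass filter); averaging the
    # per-scale maps yields a single high-frequency image in which
    # texture-breaking anomalies stand out against a smooth background.
    maps = [img - blur(img, s) for s in sigmas]
    return np.mean(maps, axis=0)
```

On a uniform fabric-like background, a point anomaly survives this high-pass filtering while the flat background is suppressed to zero, which is the property the abstract's extraction module relies on.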
ISSN: 1380-7501, 1573-7721
DOI:10.1007/s11042-023-16340-7