Resformer-Unet: A U-shaped Framework Combining ResNet and Transformer for Segmentation of Strip Steel Surface Defects

Bibliographic Details
Published in: ISIJ International, 2024/01/15, Vol. 64(1), pp. 67-75
Authors: Lu, Kun; Wang, Wenyan; Pan, Xuejuan; Zhou, Yuming; Chen, Zhaoquan; Zhao, Yuan; Wang, Bing
Format: Article
Language: English
Online access: Full text
Description
Abstract: Identifying surface defects is an essential task in the hot-rolling process. Various computer vision-based classification and detection methods have achieved strong results in recognizing surface defects. However, defects typically exhibit irregular shapes caused by intra-class differences, so classification and detection methods cannot accurately localize them. To address this issue, this work proposes a U-shaped encoder-decoder framework called Resformer-Unet, which effectively detects surface defects of hot-rolled strip at the pixel level. In this framework, a Convolutional Neural Network (CNN) and a Transformer work in parallel to extract multi-scale features from the image, enhancing the network's ability to capture both global and local information. Additionally, feature coupling modules fuse the multi-scale features, compensating for the information loss that occurs during down-sampling. On the SD-saliency-900 dataset for strip steel surface defect segmentation, Resformer-Unet achieves a mean Dice Similarity Coefficient (DSC) of 89.96% and an average Hausdorff Distance of 12.03%, outperforming several advanced methods.
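
The parallel CNN/Transformer encoder described in the abstract can be illustrated with a minimal PyTorch sketch. This is a toy model under stated assumptions, not the authors' Resformer-Unet: the module names (ParallelStage, FeatureCoupling), channel widths, patch-embedding step, and fusion-by-concatenation scheme are hypothetical stand-ins for the paper's actual design.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class FeatureCoupling(nn.Module):
        # Hypothetical fusion: concatenate CNN and Transformer features, then 1x1 conv.
        def __init__(self, channels):
            super().__init__()
            self.proj = nn.Conv2d(2 * channels, channels, kernel_size=1)

        def forward(self, f_cnn, f_trans):
            return self.proj(torch.cat([f_cnn, f_trans], dim=1))

    class ParallelStage(nn.Module):
        # One encoder stage: a conv branch (local detail) and a Transformer branch
        # (global context) run in parallel on the same input, both halving resolution.
        def __init__(self, c_in, c_out, heads=4):
            super().__init__()
            self.conv = nn.Sequential(
                nn.Conv2d(c_in, c_out, kernel_size=3, stride=2, padding=1),
                nn.BatchNorm2d(c_out),
                nn.ReLU(inplace=True),
            )
            self.embed = nn.Conv2d(c_in, c_out, kernel_size=2, stride=2)  # patch embedding
            self.attn = nn.TransformerEncoderLayer(d_model=c_out, nhead=heads, batch_first=True)
            self.fuse = FeatureCoupling(c_out)

        def forward(self, x):
            f_cnn = self.conv(x)                               # (B, C, H/2, W/2), local features
            t = self.embed(x)                                  # (B, C, H/2, W/2)
            b, c, h, w = t.shape
            f_trans = self.attn(t.flatten(2).transpose(1, 2))  # self-attention over patch tokens
            f_trans = f_trans.transpose(1, 2).reshape(b, c, h, w)
            return self.fuse(f_cnn, f_trans)

    class ResformerUnetSketch(nn.Module):
        # Two-stage U-shaped toy: encoder stages feed the decoder via a skip connection.
        def __init__(self, num_classes=1):
            super().__init__()
            self.stage1 = ParallelStage(3, 32)
            self.stage2 = ParallelStage(32, 64)
            self.up = nn.ConvTranspose2d(64, 32, kernel_size=2, stride=2)
            self.dec = nn.Sequential(nn.Conv2d(64, 32, kernel_size=3, padding=1), nn.ReLU(inplace=True))
            self.head = nn.Conv2d(32, num_classes, kernel_size=1)

        def forward(self, x):
            f1 = self.stage1(x)                                # 1/2 resolution skip feature
            f2 = self.stage2(f1)                               # 1/4 resolution bottleneck
            d = self.dec(torch.cat([self.up(f2), f1], dim=1))  # fuse skip feature on the way up
            return F.interpolate(self.head(d), scale_factor=2, mode="bilinear", align_corners=False)

For a 128x128 RGB input, ResformerUnetSketch()(torch.randn(1, 3, 128, 128)) returns per-pixel logits of shape (1, 1, 128, 128); the real network would use deeper stages and the paper's own coupling design.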
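The two evaluation metrics quoted above are standard for segmentation and straightforward to reproduce on binary masks. The sketch below uses the plain, unnormalized textbook definitions with NumPy and SciPy; it is not tied to this paper's exact evaluation protocol (the paper's percentage-valued Hausdorff figure suggests some normalization not specified in the abstract).

    import numpy as np
    from scipy.spatial.distance import directed_hausdorff

    def dice_coefficient(pred, gt):
        # Dice Similarity Coefficient: 2*|A & B| / (|A| + |B|).
        pred, gt = pred.astype(bool), gt.astype(bool)
        total = pred.sum() + gt.sum()
        if total == 0:          # both masks empty: perfect agreement by convention
            return 1.0
        return 2.0 * np.logical_and(pred, gt).sum() / total

    def hausdorff_distance(pred, gt):
        # Symmetric Hausdorff Distance between the foreground pixel sets of two masks.
        p, g = np.argwhere(pred), np.argwhere(gt)
        return max(directed_hausdorff(p, g)[0], directed_hausdorff(g, p)[0])

    # Example on two toy 4x4 masks
    pred = np.array([[0, 1, 1, 0], [0, 1, 1, 0], [0, 0, 0, 0], [0, 0, 0, 0]])
    gt   = np.array([[0, 1, 1, 0], [0, 1, 0, 0], [0, 0, 0, 0], [0, 0, 0, 0]])
    print(dice_coefficient(pred, gt))    # 2*3 / (4+3) = 0.857...
    print(hausdorff_distance(pred, gt))  # 1.0 (pixel (1,2) is one step from gt)
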
ISSN: 0915-1559 (print)
ISSN: 1347-5460 (online)
DOI: 10.2355/isijinternational.ISIJINT-2023-222