Deep Learning-Based Multi-Species Appearance Defect Detection Model for MLCC
Published in: IEEE Transactions on Instrumentation and Measurement, 2024-01, Vol. 73, p. 1-1
Main Authors: , , ,
Format: Article
Language: English
Subjects:
Online Access: Order full text
Abstract: Appearance defects in Multilayer Ceramic Capacitors (MLCCs) adversely affect their performance and reliability, so detecting these defects during MLCC production is imperative. This task, however, faces numerous challenges, such as significant variation in defect shape and size, indistinct defect boundaries, and the inefficiency of manual inspection. To address these issues, this paper proposes the RSE-YOLO model for identifying, localizing, and classifying defects in MLCC images. We design a novel backbone, the Residual Coordinate Weighted Convolutional Network, with enhanced feature-extraction capability for accurately locating defect areas. Additionally, we introduce the Space Attention Pyramid Pooling Module to achieve a weighted fusion of local and global feature information. Furthermore, ECA-PAN is employed as the model's neck to fuse feature information at different scales, improving the model's generalization in multi-scale defect detection. Experimental results demonstrate that RSE-YOLO performs strongly on the MLCC dataset, achieving an mAP50 of 93.9%, an mAP50-95 of 63.2%, an F1 score of 90.9%, and a frame rate of 57 FPS, meeting the requirements of MLCC appearance defect detection.
ISSN: 0018-9456, 1557-9662
DOI: 10.1109/TIM.2024.3375957
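The abstract mentions an ECA-PAN neck, i.e. a path-aggregation neck augmented with Efficient Channel Attention. As an illustration only, the sketch below shows a minimal ECA-style channel-attention block of the kind such a neck presumably builds on; it follows the published ECA-Net formulation, not the authors' released code, and the class name, kernel-size heuristic, and tensor shapes are assumptions for demonstration.

```python
# Minimal PyTorch-style sketch of an Efficient Channel Attention (ECA) block,
# illustrating the channel re-weighting an ECA-PAN neck would apply to fused
# feature maps. Hypothetical example; not the paper's implementation.
import math
import torch
import torch.nn as nn


class ECABlock(nn.Module):
    def __init__(self, channels: int, gamma: int = 2, b: int = 1):
        super().__init__()
        # Adaptive 1D-conv kernel size derived from the channel count
        # (heuristic from the ECA-Net paper), forced to be odd.
        t = int(abs((math.log2(channels) + b) / gamma))
        k = t if t % 2 else t + 1
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.conv = nn.Conv1d(1, 1, kernel_size=k, padding=k // 2, bias=False)
        self.sigmoid = nn.Sigmoid()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (N, C, H, W) feature map from a neck fusion stage.
        y = self.pool(x)                                      # (N, C, 1, 1) global descriptor
        y = y.squeeze(-1).transpose(-1, -2)                   # (N, 1, C) for the 1D conv
        y = self.conv(y)                                      # local cross-channel interaction
        y = self.sigmoid(y).transpose(-1, -2).unsqueeze(-1)   # (N, C, 1, 1) channel weights
        return x * y                                          # re-weight input channels


if __name__ == "__main__":
    feats = torch.randn(2, 256, 40, 40)        # e.g. a 40x40 neck feature map
    print(ECABlock(256)(feats).shape)          # torch.Size([2, 256, 40, 40])
```

The design choice ECA makes is to avoid the dimensionality reduction of SE-style attention: a single 1D convolution over the channel descriptor captures local cross-channel interaction at negligible cost, which is why it suits a real-time detector such as the one reported here at 57 FPS.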