An Improved Feature Pyramid Network and Metric Learning Approach for Rail Surface Defect Detection
Published in: Applied Sciences, 2023-05, Vol. 13 (10), p. 6047
Main authors:
Format: Article
Language: English
Keywords:
Online access: Full text
Abstract: When deep learning methods are used to detect rail surface defects, training accuracy declines because the defects are small and the number of samples is insufficient. This paper investigates rail surface defect detection using an improved feature pyramid network (FPN) and a metric learning approach. Firstly, the FPN is improved by adding deformable convolution and convolutional block attention modules to improve the accuracy of detecting defects at different scales, and it is pretrained on the MS COCO dataset. Secondly, a new model is established to extract network features based on the transfer learning model and the learned network parameters. Thirdly, a multimodal network structure is constructed, and the distance between each modal representative and the embedded feature vector is calculated to classify the defects. Finally, experiments are carried out on the miniImageNet dataset and the rail surface defect dataset. The results show that the mAP (five-way five-shot) of our method is 73.42% on the miniImageNet dataset and 63.29% on the rail defect dataset. The experiments demonstrate the effectiveness of the proposed method, and the rail surface defect detection results are satisfactory. As few-sample classification studies of rail surface defects are scarce, this work provides a different approach and lays a foundation for further research.
ISSN: 2076-3417
DOI: 10.3390/app13106047
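The abstract only sketches the method in prose, so the following is a minimal, hypothetical PyTorch illustration of two of the ideas it names: a CBAM-style convolutional block attention module of the kind added to the FPN, and metric-learning classification that compares an embedded query to per-class representatives by distance. The names (`ChannelSpatialAttention`, `prototype_classify`) and all hyperparameters are assumptions for illustration, not the paper's code; the deformable-convolution part of the improved FPN is omitted here (torchvision's `DeformConv2d` could fill that role).

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class ChannelSpatialAttention(nn.Module):
    """CBAM-style attention: channel attention followed by spatial attention (illustrative sketch)."""

    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.channel_mlp = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
        )
        self.spatial_conv = nn.Conv2d(2, 1, kernel_size=7, padding=3)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        # Channel attention: average- and max-pool spatially, pass both through a shared MLP.
        avg = self.channel_mlp(F.adaptive_avg_pool2d(x, 1).view(b, c))
        mx = self.channel_mlp(F.adaptive_max_pool2d(x, 1).view(b, c))
        x = x * torch.sigmoid(avg + mx).view(b, c, 1, 1)
        # Spatial attention: pool over channels, convolve the resulting 2-channel map.
        s = torch.cat([x.mean(dim=1, keepdim=True), x.max(dim=1, keepdim=True).values], dim=1)
        return x * torch.sigmoid(self.spatial_conv(s))


def prototype_classify(support: torch.Tensor, support_labels: torch.Tensor,
                       query: torch.Tensor, n_classes: int) -> torch.Tensor:
    """Metric-learning classification: compare query embeddings to per-class mean
    embeddings (the 'modal representatives') by Euclidean distance."""
    prototypes = torch.stack(
        [support[support_labels == k].mean(dim=0) for k in range(n_classes)]
    )                                       # (n_classes, dim)
    dists = torch.cdist(query, prototypes)  # (n_query, n_classes)
    return (-dists).softmax(dim=-1)         # closer prototype -> higher score


if __name__ == "__main__":
    # Toy five-way five-shot episode with 64-dim embeddings (random data, shapes only).
    support = torch.randn(25, 64)
    labels = torch.arange(5).repeat_interleave(5)
    query = torch.randn(10, 64)
    print(prototype_classify(support, labels, query, n_classes=5).argmax(dim=-1))

    attn = ChannelSpatialAttention(256)
    fmap = torch.randn(2, 256, 32, 32)  # one pyramid-level feature map
    print(attn(fmap).shape)             # torch.Size([2, 256, 32, 32])
```

In a five-way five-shot episode such as those reported in the abstract, the support set holds 25 labeled embeddings (five per class), each class is summarized by its mean embedding, and a query is assigned to the class whose representative lies nearest in embedding space.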