An Online Rail Track Fastener Classification System Based on YOLO Models


Bibliographic Details
Published in: Sensors (Basel, Switzerland), 2022-12, Vol. 22 (24), p. 9970
Main authors: Hsieh, Chen-Chiung; Hsu, Ti-Yun; Huang, Wei-Hsin
Format: Article
Language: English
Subjects:
Online access: Full text
Description
Abstract: To save manpower in rail track inspection, computer vision-based methodologies have been developed. We propose using the YOLOv4-Tiny neural network to identify track defects in real time. Ten defect types covering fasteners, rail surfaces, and sleepers are detected from the top view, and six defect types on the rail waist are detected from the side view. The proposed real-time inspection system consists of a high-performance notebook, two sports cameras, and three parallel processes. The hardware is mounted on a flat cart running at 30 km/h. The inspection results for abnormal track components can be queried by defect type, time, and rail hectometer stake. In the experiments, data augmentation by a Cycle Generative Adversarial Network (CycleGAN) is used to enlarge the dataset, yielding 3800 images for the top view and 967 for the side view. Five object detection neural network models (YOLOv4, YOLOv4-Tiny, YOLOX-Tiny, SSD512, and SSD300) were tested. The YOLOv4-Tiny model, running at 150 FPS, is selected as the recognition kernel; it achieved 91.7% mAP, 92% precision, and 91% recall on defective track components from the top view, and 99.16% mAP, 96% precision, and 94% recall from the side view.
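
The record does not include source code. As a rough illustration of the kind of real-time recognition kernel the abstract describes, the sketch below loads a YOLOv4-Tiny Darknet model through OpenCV's DNN module and runs detection on live camera frames. The file names, class labels, input size, and thresholds are assumptions for illustration only, not the authors' implementation.

```python
# Minimal sketch: real-time detection loop with a YOLOv4-Tiny model via OpenCV DNN.
# File names, class labels, input size, and thresholds are illustrative assumptions,
# not the configuration used in the paper.
import cv2

# Hypothetical Darknet config/weights for a model trained on track-defect classes.
net = cv2.dnn.readNetFromDarknet("yolov4-tiny-track.cfg", "yolov4-tiny-track.weights")
# Requires an OpenCV build with CUDA; omit these two lines to run on the CPU.
net.setPreferableBackend(cv2.dnn.DNN_BACKEND_CUDA)
net.setPreferableTarget(cv2.dnn.DNN_TARGET_CUDA)

model = cv2.dnn_DetectionModel(net)
model.setInputParams(size=(416, 416), scale=1 / 255.0, swapRB=True)

# Placeholder names; the paper defines ten top-view and six side-view defect types.
class_names = ["fastener_defect", "rail_surface_defect", "sleeper_defect"]

cap = cv2.VideoCapture(0)  # one of the two sports cameras
while True:
    ok, frame = cap.read()
    if not ok:
        break
    class_ids, scores, boxes = model.detect(frame, confThreshold=0.5, nmsThreshold=0.4)
    for cid, score, (x, y, w, h) in zip(class_ids, scores, boxes):
        name = class_names[int(cid)] if int(cid) < len(class_names) else f"class_{int(cid)}"
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 0, 255), 2)
        cv2.putText(frame, f"{name}: {float(score):.2f}", (x, y - 5),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.5, (0, 0, 255), 1)
    cv2.imshow("track inspection", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```

A loop like this could run in a separate process per camera, in the spirit of the three parallel processes the abstract mentions, with detections logged by defect type, timestamp, and hectometer stake for later querying.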
ISSN: 1424-8220
DOI: 10.3390/s22249970