Advancing Industrial Object Detection Through Domain Adaptation: A Solution for Industry 5.0


Bibliographic Details
Published in: Actuators, 2024-12, Vol. 13 (12), p. 513
Main authors: Fatima, Zainab; Zardari, Shehnila; Tanveer, Muhammad Hassan
Format: Article
Language: English
Online Access: Full text
Description

Summary: Domain adaptation (DA) is essential for developing robust machine learning models capable of operating across different domains with minimal retraining. This study explores the application of domain adaptation techniques to 3D datasets for industrial object detection, with a focus on short-range and long-range scenarios. While 3D data provide superior spatial information for detecting industrial parts, challenges arise due to domain shifts between training data (often clean or synthetic) and real-world conditions (noisy and occluded environments). Using the MVTec ITODD dataset, we propose a multi-level adaptation approach that leverages local and global feature alignment through PointNet-based architectures. We address sensor variability by aligning data from high-precision, long-range sensors with noisier short-range alternatives. Our results demonstrate 85% accuracy with a minimal 0.02% performance drop, highlighting the resilience of the proposed methods. This work contributes to the emerging needs of Industry 5.0 by ensuring adaptable and scalable automation in manufacturing processes, empowering robotic systems to perform precise, reliable object detection and manipulation under challenging, real-world conditions, and supporting seamless human–robot collaboration.
ISSN: 2076-0825
DOI: 10.3390/act13120513
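
The abstract describes multi-level local and global feature alignment with PointNet-based encoders for bridging clean long-range and noisy short-range sensor domains. The sketch below is illustrative only, not the authors' implementation: it shows the generic ingredients such an approach builds on — a shared per-point MLP producing local features, an order-invariant max-pooled global descriptor, and a simple first-moment alignment loss between the two domains (the paper's actual alignment objective is not specified here; all names and parameters are hypothetical).

```python
import numpy as np

rng = np.random.default_rng(0)

def shared_mlp(points, W, b):
    # PointNet-style shared per-point MLP: the same weights are applied to
    # every point independently. points: (N, 3) -> local features (N, F).
    return np.maximum(points @ W + b, 0.0)  # ReLU

def global_feature(local_feats):
    # Symmetric max-pool over the point dimension gives an order-invariant
    # global descriptor of shape (F,).
    return local_feats.max(axis=0)

def moment_alignment_loss(feats_src, feats_tgt):
    # Toy stand-in for a domain-alignment objective: match the first moment
    # (mean) of the per-point feature distributions of the two domains.
    return float(np.sum((feats_src.mean(axis=0) - feats_tgt.mean(axis=0)) ** 2))

# Hypothetical data: a clean long-range scan vs. the same scene from a
# noisier short-range sensor (simulated by additive Gaussian noise).
W, b = rng.normal(size=(3, 8)), np.zeros(8)
clean = rng.normal(size=(256, 3))
noisy = clean + rng.normal(scale=0.05, size=clean.shape)

g_clean = global_feature(shared_mlp(clean, W, b))          # global descriptor
loss = moment_alignment_loss(shared_mlp(clean, W, b),
                             shared_mlp(noisy, W, b))      # local-level alignment
```

Minimizing such an alignment term alongside the detection loss is one common way to make features from the noisy short-range domain resemble those learned on the clean long-range domain; the max-pool step is what makes the global descriptor independent of point ordering.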