Optimization of 3-D Pose Measurement Method Based on Binocular Vision


Bibliographic Details
Published in: IEEE Transactions on Instrumentation and Measurement, 2022, Vol. 71, pp. 1-12
Main authors: Wei, Yangjie; Xi, Yao
Format: Article
Language: English
Description
Abstract: To improve the accuracy and speed of existing binocular vision methods in practical environments, this article presents an optimization method for accurate and stable 3-D pose measurement based on binocular vision. First, in the stereo-image-matching stage, we introduce a guide-point definition and propose an optimal path-searching method for dynamic programming (DP). Second, a distance-based adaptive-filtering method is added to remove noise points and outliers around the target, so that environmental noise interference is reduced before registration. Third, an adaptive-sampling method based on distance is proposed to extract the key points of the target point cloud. Finally, a target surface model based on a hash table is established as an offline matching data structure for point-cloud registration, improving the matching speed and accuracy of the 3-D pose calculation. A series of experiments showed that the translation error of the proposed method was less than 1 cm over a 100 cm measurement range and the rotation error was less than 2° over a 180° rotation range. Compared with traditional point-cloud registration algorithms, our method exhibited higher accuracy and stability, indicating great potential for use in unstructured environments.
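The abstract does not spell out the adaptive-filtering rule it uses, so the following is only a minimal sketch of one plausible distance-based outlier filter, assuming a NumPy point cloud expressed in camera coordinates; the function name `adaptive_distance_filter` and the parameters `k`, `base_thresh`, and `slope` are illustrative choices, not values from the paper.

```python
import numpy as np
from scipy.spatial import cKDTree


def adaptive_distance_filter(points, k=8, base_thresh=0.02, slope=0.01):
    """Drop points whose mean k-NN distance exceeds a range-dependent threshold.

    points : (N, 3) array of 3-D points in camera coordinates (metres).
    The threshold grows with a point's distance from the camera, since
    stereo depth noise also grows with range.
    """
    tree = cKDTree(points)
    # Query k+1 neighbours because the closest "neighbour" is the point itself.
    dists, _ = tree.query(points, k=k + 1)
    mean_nn = dists[:, 1:].mean(axis=1)          # mean distance to the k true neighbours
    ranges = np.linalg.norm(points, axis=1)      # distance of each point from the camera origin
    thresh = base_thresh + slope * ranges        # looser cutoff for farther (noisier) points
    return points[mean_nn < thresh]
```

The constants `base_thresh` and `slope` would have to be tuned to the sensor's depth noise and the cloud's density; the only point carried over from the abstract is the idea that the rejection criterion adapts to distance rather than being a single global radius.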
ISSN: 0018-9456, 1557-9662
DOI: 10.1109/TIM.2022.3149334
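As a rough illustration of the hash-table surface model mentioned in the abstract, the sketch below indexes a model point cloud by integer voxel keys in a Python dict and answers nearest-point queries by scanning the query's voxel and its 26 neighbours. The voxel size, function names, and neighbourhood scheme are assumptions for illustration, not details taken from the paper.

```python
import numpy as np
from collections import defaultdict
from itertools import product


def build_voxel_hash(model_points, voxel_size=0.01):
    """Hash every model point under the integer coordinates of its voxel (built once, offline)."""
    table = defaultdict(list)
    keys = np.floor(model_points / voxel_size).astype(int)
    for key, p in zip(map(tuple, keys), model_points):
        table[key].append(p)
    return dict(table)


def nearest_model_point(query, table, voxel_size=0.01):
    """Return the closest stored point found in the query's voxel and its 26 neighbours."""
    cx, cy, cz = np.floor(query / voxel_size).astype(int)
    best, best_d = None, np.inf
    for dx, dy, dz in product((-1, 0, 1), repeat=3):
        for p in table.get((cx + dx, cy + dy, cz + dz), ()):
            d = np.linalg.norm(p - query)
            if d < best_d:
                best, best_d = p, d
    return best, best_d
```

In a registration loop, the table would be built once from the model surface and `nearest_model_point` called for each scene key point during correspondence search, trading some accuracy at voxel boundaries for constant-time hashed lookups; when no point falls inside the scanned voxels the function returns `None`, so a caller would fall back to a wider search.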