Realization of Convergent Binocular Vision Algorithm for Guiding Robot Localization

Bibliographic Details
Published in: Ji xie gong cheng xue bao 2022, Vol. 58 (14), p. 161
Main authors: Liu, Hongdi; Lü, Rui; Tian, Linli; Zhu, Dahu
Format: Article
Language: English
Online access: Full text
Description
Abstract: The machining quality of large, complex components depends heavily on the accuracy of visually guided robot localization. To address the limited field of view of existing parallel binocular cameras and the complexity of scanning-reconstruction algorithms, a convergent binocular stereo vision algorithm is proposed to guide a robot to precise localization. First, the pinhole imaging principle and triangle similarity are used to establish the imaging model of the convergent binocular camera, enabling autonomous calibration, feature matching, and extraction of the spatial positions of mark points. Second, these data are transformed into the robot base coordinate system through a hand-eye calibration algorithm to correct the three-dimensional model of the workpiece. Third, the motion path is planned automatically to guide the robot to accurate localization. Two workpieces, a regular ceramic block and a complex rail, are used for experimental verification. The results show that the average localization error for workpieces placed in any posture within the field of view can be held within 4%, and that the method is more precise and faster in execution than typical binocular vision algorithms, giving it good application prospects.
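The abstract does not spell out the reconstruction step, but recovering a mark point's 3D position from a matched pixel pair in two calibrated convergent cameras is commonly done by linear triangulation under the pinhole model the paper invokes. The sketch below is a minimal DLT triangulation, not the paper's actual implementation: the projection matrices P_left and P_right are assumed to come from the paper's autonomous calibration step, and the function name and use of numpy are illustrative.

import numpy as np

def triangulate_mark_point(P_left, P_right, uv_left, uv_right):
    """Recover a mark point's 3D position from one matched pixel pair.

    P_left, P_right : 3x4 pinhole projection matrices K[R|t] of the two
                      convergent cameras (assumed known from calibration).
    uv_left, uv_right : (u, v) pixel coordinates of the matched feature.
    Returns the 3D point in the calibration (world) frame.
    """
    u1, v1 = uv_left
    u2, v2 = uv_right
    # Each view contributes two linear constraints on the homogeneous point X.
    A = np.vstack([
        u1 * P_left[2] - P_left[0],
        v1 * P_left[2] - P_left[1],
        u2 * P_right[2] - P_right[0],
        v2 * P_right[2] - P_right[1],
    ])
    # Least-squares solution: right singular vector of the smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]  # de-homogenize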
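Likewise, the second step, mapping the measured points into the robot base coordinate system, amounts to applying the homogeneous transform produced by hand-eye calibration. A minimal sketch follows, assuming an eye-to-hand setup in which a fixed 4x4 base-to-camera transform T_base_cam has already been solved for; the names are hypothetical and the paper's own hand-eye algorithm is not reproduced here.

import numpy as np

def camera_to_base(p_cam, T_base_cam):
    """Map a 3D point from the camera frame into the robot base frame.

    T_base_cam : 4x4 homogeneous transform from hand-eye calibration
                 (assumed eye-to-hand: camera fixed relative to the base).
    """
    p_h = np.append(np.asarray(p_cam, dtype=float), 1.0)  # homogeneous coords
    return (T_base_cam @ p_h)[:3]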
ISSN:0577-6686
DOI:10.3901/JME.2022.14.161