A pose estimation method of space non-cooperative target based on ORBFPFH SLAM
Saved in:
Published in: | Optik (Stuttgart) 2023-09, Vol.286, p.171025, Article 171025 |
---|---|
Main authors: | , , , , |
Format: | Article |
Language: | eng |
Subjects: | |
Online access: | Full text |
Abstract: | In this paper, to improve the pose measurement accuracy of the Time-of-Flight (ToF) camera for space non-cooperative targets, a pose estimation method based on ORBFPFH Simultaneous Localization and Mapping (SLAM) with effective integration of intensity and depth measurement information is proposed. The primary process of the method is as follows: First, a unique ORBFPFH Bag of Words (BoW) model for space targets is trained on a ToF image dataset of spacecraft; second, the pose of the non-cooperative space target is tracked using the ORBFPFH feature and optimized with a pose graph; then, based on the ORBFPFH BoW model, loop closure is detected to reduce the cumulative error. Finally, the proposed method is tested on the ToF image dataset: compared with the state-of-the-art ORB-SLAM2 algorithm, the proposed method yields smaller translation and rotation errors on the test data, with a mean translation error below 0.144 m and a mean rotation error below 0.642°. The test results show that the proposed method can improve the pose estimation accuracy of space non-cooperative targets, achieve better 3D point cloud reconstruction, and provide technical support for space applications such as rendezvous and docking with non-cooperative spacecraft. © 2001 Elsevier Science. All rights reserved |
---|---|
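The mean translation and rotation errors quoted in the abstract (0.144 m, 0.642°) are standard trajectory-evaluation metrics. A minimal sketch of how such metrics are typically computed from estimated versus ground-truth poses is shown below; this is a generic illustration using 4x4 homogeneous transforms, not the authors' evaluation code, and the function names are illustrative.

```python
import numpy as np

def rotation_angle_deg(R):
    """Geodesic angle (degrees) of a 3x3 rotation matrix."""
    c = (np.trace(R) - 1.0) / 2.0
    return np.degrees(np.arccos(np.clip(c, -1.0, 1.0)))

def mean_pose_errors(T_est, T_gt):
    """Mean translation (m) and rotation (deg) error between pose lists.

    T_est, T_gt: sequences of 4x4 homogeneous pose matrices.
    """
    t_errs, r_errs = [], []
    for Te, Tg in zip(T_est, T_gt):
        # Relative pose between ground truth and estimate
        dT = np.linalg.inv(Tg) @ Te
        t_errs.append(np.linalg.norm(dT[:3, 3]))
        r_errs.append(rotation_angle_deg(dT[:3, :3]))
    return float(np.mean(t_errs)), float(np.mean(r_errs))
```

For example, an estimate offset from the ground truth by a pure 0.1 m translation gives a mean translation error of 0.1 m and zero rotation error.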
ISSN: | 0030-4026 1618-1336 |
DOI: | 10.1016/j.ijleo.2023.171025 |