Stereo Vision-Based Relative Position and Attitude Estimation of Non-Cooperative Spacecraft



Bibliographic Details
Published in: Aerospace 2021-08, Vol. 8 (8), p. 230, Article 230
Main Authors: Chang, Liang; Liu, Jixiu; Chen, Zui; Bai, Jie; Shu, Leizheng
Format: Article
Language: English
Online Access: Full text
Abstract: In on-orbit servicing, estimating the relative position and attitude of non-cooperative spacecraft has become a key problem in many space missions. Because non-cooperative space targets carry no known artificial markers and cannot communicate with the servicing spacecraft, the relative position and attitude estimation system faces great challenges in terms of accuracy, intelligence, and power consumption. To address these issues, this study uses a stereo camera to extract feature points of a non-cooperative spacecraft. The 3D positions of the feature points are then computed from the camera model to estimate the relative pose. The optical flow method is also used to obtain geometric constraint information between frames. In addition, an extended Kalman filter is used to update the measurements and obtain more accurate pose estimates. Moreover, we present a closed-loop simulation system in which the Unity simulation engine simulates the relative motion of the spacecraft and the binocular vision images, and a Jetson TX2 embedded supercomputer runs the proposed autonomous relative navigation algorithm. The simulation results show that our approach can estimate the non-cooperative target's relative pose accurately.
ISSN: 2226-4310
DOI: 10.3390/aerospace8080230
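
The triangulation step described in the abstract, recovering the 3D positions of matched feature points from the camera model, can be sketched as follows. This is a minimal illustration assuming rectified pinhole cameras; the focal length f, baseline B, and principal point (cx, cy) are hypothetical calibration values, not parameters from the paper.

```python
# Minimal sketch of stereo triangulation under a rectified pinhole model.
# f (focal length, px), B (baseline, m), and (cx, cy) (principal point)
# are hypothetical calibration values, not taken from the paper.
import numpy as np

def triangulate(pts_left, pts_right, f=800.0, B=0.12, cx=640.0, cy=360.0):
    """Recover 3D points in the left-camera frame from matched pixels.

    pts_left, pts_right: (N, 2) arrays of corresponding pixel coordinates
    in the rectified left and right images.
    """
    disparity = pts_left[:, 0] - pts_right[:, 0]   # u_L - u_R
    Z = f * B / np.maximum(disparity, 1e-6)        # depth from disparity
    X = (pts_left[:, 0] - cx) * Z / f
    Y = (pts_left[:, 1] - cy) * Z / f
    return np.stack([X, Y, Z], axis=1)             # (N, 3) points
```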
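The inter-frame geometric constraints mentioned in the abstract are typically obtained by tracking features from one frame into the next; a sketch using OpenCV's pyramidal Lucas-Kanade optical flow, with illustrative parameter values, might look like this.

```python
# Sketch of inter-frame feature tracking with pyramidal Lucas-Kanade
# optical flow (OpenCV); window size and pyramid depth are illustrative.
import cv2

def track_features(prev_gray, curr_gray):
    # Detect corners in the previous frame (parameters are illustrative).
    prev_pts = cv2.goodFeaturesToTrack(
        prev_gray, maxCorners=200, qualityLevel=0.01, minDistance=10)
    # Track them into the current frame.
    curr_pts, status, _err = cv2.calcOpticalFlowPyrLK(
        prev_gray, curr_gray, prev_pts, None,
        winSize=(21, 21), maxLevel=3)
    good = status.ravel() == 1                     # successfully tracked
    return prev_pts[good], curr_pts[good]          # matched point pairs
```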
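The extended-Kalman-filter refinement step can be shown in generic form. The paper's actual state vector, process model, and Jacobians are not given in the abstract, so the function below is a textbook EKF measurement update with placeholder inputs, not the authors' implementation.

```python
# Generic EKF measurement update; the state x, measurement model h, and
# Jacobian H are placeholders for whatever the filter actually tracks.
import numpy as np

def ekf_update(x, P, z, h_x, H, R):
    """One extended-Kalman-filter measurement update.

    x, P : prior state estimate and covariance
    z    : measurement (e.g. a triangulated feature position)
    h_x  : h(x), the measurement predicted from the prior state
    H    : Jacobian of h evaluated at x
    R    : measurement noise covariance
    """
    y = z - h_x                                  # innovation
    S = H @ P @ H.T + R                          # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)               # Kalman gain
    x_post = x + K @ y                           # updated state
    P_post = (np.eye(len(x)) - K @ H) @ P        # updated covariance
    return x_post, P_post
```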