End-to-end Monocular Pose Estimation for Uncooperative Spacecraft based on Direct Regression Network

Bibliographic Details
Published in: IEEE Transactions on Aerospace and Electronic Systems, 2023-10, Vol. 59 (5), p. 1-13
Main authors: Huang, Haoran; Song, Bin; Zhao, Gaopeng; Bo, Yuming
Format: Article
Language: English
Description
Abstract: Deep learning shows good performance in monocular pose estimation and has been used by space researchers to solve the monocular pose estimation problem of uncooperative spacecraft. However, existing deep learning-based methods are mostly trained with keypoint regression errors, which do not necessarily reflect actual pose errors, limiting their learning performance. In this paper, an end-to-end pose estimation network based on a convolutional neural network (CNN) is proposed for uncooperative spacecraft. First, we design a keypoint regression sub-network based on a multi-branch structure to regress the 2D keypoint locations. Then, we propose a pose estimation sub-network that estimates the pose of the target spacecraft from the predicted 2D keypoints and the corresponding 3D keypoints of the target model, which allows end-to-end training of the overall pose estimation network with the actual pose error. Experimental results on two public datasets demonstrate that the proposed method accurately estimates the target spacecraft pose in the presence of scale variation and a dynamic Earth background, and achieves better pose estimation accuracy than current state-of-the-art methods. In addition, the proposed method shows good generalization performance and near real-time efficiency.
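
The abstract outlines a two-stage design: a multi-branch CNN regresses 2D keypoint locations, and a pose sub-network maps those predictions, together with the known 3D model keypoints, to a pose, so a pose-level loss can supervise both stages end to end. The PyTorch sketch below illustrates that structure under stated assumptions: the layer sizes, the keypoint count (N_KP), the quaternion-plus-translation pose parameterization, and the equal loss weighting are all illustrative choices, not the authors' actual architecture.

import torch
import torch.nn as nn
import torch.nn.functional as F

N_KP = 11  # assumed number of model keypoints; the paper's value may differ

class KeypointRegressionNet(nn.Module):
    """Multi-branch sub-network: a shared CNN backbone feeds one small
    regression branch per keypoint, each predicting an (x, y) location."""
    def __init__(self, n_kp: int = N_KP):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # One independent branch per keypoint (the "multi-branch structure").
        self.branches = nn.ModuleList([nn.Linear(64, 2) for _ in range(n_kp)])

    def forward(self, img: torch.Tensor) -> torch.Tensor:
        feat = self.backbone(img)  # (B, 64)
        return torch.stack([b(feat) for b in self.branches], dim=1)  # (B, N_KP, 2)

class PoseEstimationNet(nn.Module):
    """Pose sub-network: maps predicted 2D keypoints plus the known 3D model
    keypoints to a unit quaternion and a translation vector. Because this
    mapping is differentiable, the pose loss back-propagates into the
    keypoint regression stage as well."""
    def __init__(self, n_kp: int = N_KP):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(n_kp * 5, 256), nn.ReLU(),
            nn.Linear(256, 7),  # 4 quaternion components + 3 translation
        )

    def forward(self, kps_2d: torch.Tensor, kps_3d: torch.Tensor):
        b = kps_2d.shape[0]
        x = torch.cat([kps_2d.reshape(b, -1),
                       kps_3d.expand(b, -1, -1).reshape(b, -1)], dim=1)
        out = self.mlp(x)
        return F.normalize(out[:, :4], dim=1), out[:, 4:]  # quaternion, translation

def pose_loss(quat_p, trans_p, quat_gt, trans_gt):
    """Illustrative pose error: quaternion alignment plus translation L2."""
    rot_err = 1.0 - torch.abs((quat_p * quat_gt).sum(dim=1))
    trans_err = torch.norm(trans_p - trans_gt, dim=1)
    return (rot_err + trans_err).mean()

# End-to-end training step: the pose error supervises both sub-networks at once.
kp_net, pose_net = KeypointRegressionNet(), PoseEstimationNet()
img = torch.randn(4, 3, 128, 128)   # dummy image batch
kps_3d = torch.randn(1, N_KP, 3)    # known 3D model keypoints
quat, trans = pose_net(kp_net(img), kps_3d)
loss = pose_loss(quat, trans,
                 F.normalize(torch.randn(4, 4), dim=1), torch.randn(4, 3))
loss.backward()

The key design point the sketch reproduces is that the pose sub-network replaces a non-differentiable PnP solver, so gradients of the pose error reach the keypoint branches directly rather than via surrogate keypoint regression losses.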
ISSN: 0018-9251, 1557-9603
DOI: 10.1109/TAES.2023.3256971