Bridging Domain Gap for Flight-Ready Spaceborne Vision
Saved in:
Main authors: | , |
Format: | Article |
Language: | English |
Subjects: | |
Online access: | Order full text |
Abstract: | This work presents Spacecraft Pose Network v3 (SPNv3), a Neural Network (NN) for monocular pose estimation of a known, non-cooperative target spacecraft. Unlike the existing literature, SPNv3 is designed and trained to be computationally efficient while remaining robust to spaceborne images that have not been observed during offline training and validation on the ground. These characteristics are essential to deploying NNs on space-grade edge devices. They are achieved through careful NN design choices, and an extensive trade-off analysis identifies data augmentation, transfer learning, and a vision transformer architecture among the features that contribute to simultaneously maximizing robustness and minimizing computational overhead. Experiments demonstrate that the final SPNv3 achieves state-of-the-art pose accuracy on hardware-in-the-loop images from a robotic testbed despite being trained exclusively on computer-generated synthetic images, effectively bridging the domain gap between synthetic and real imagery. At the same time, SPNv3 runs well above the update frequency of modern satellite navigation filters when tested on a representative graphics processing unit system with flight heritage. Overall, SPNv3 is an efficient, flight-ready NN model readily applicable to a wide range of close-range rendezvous and proximity operations with target resident space objects. The code implementation of SPNv3 will be made publicly available. |
DOI: | 10.48550/arxiv.2409.11661 |
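The abstract credits data augmentation, transfer learning from pretrained weights, and a vision transformer backbone as key ingredients of the recipe. The sketch below illustrates that general recipe in PyTorch; it is not the authors' SPNv3 implementation. The `MonocularPoseNet` and `PoseHead` classes, the ViT-B/16 backbone choice, the photometric augmentation values, and the translation-plus-quaternion output parameterization are all assumptions made for illustration.

```python
import time

import torch
import torch.nn as nn
import torch.nn.functional as F
import torchvision.transforms as T
from torchvision.models import vit_b_16, ViT_B_16_Weights

# Illustrative training-time augmentations targeting the synthetic-to-real
# domain gap (photometric jitter and blur); the paper's actual pipeline
# may differ.
augment = T.Compose([
    T.ColorJitter(brightness=0.4, contrast=0.4),
    T.GaussianBlur(kernel_size=5, sigma=(0.1, 2.0)),
])


class PoseHead(nn.Module):
    """Hypothetical head regressing a 7-DoF pose: translation (3 values)
    plus a unit quaternion for attitude (4 values)."""

    def __init__(self, in_dim: int):
        super().__init__()
        self.fc = nn.Linear(in_dim, 7)

    def forward(self, feats):
        out = self.fc(feats)
        t = out[:, :3]                      # camera-frame translation
        q = F.normalize(out[:, 3:], dim=1)  # unit quaternion attitude
        return t, q


class MonocularPoseNet(nn.Module):
    """Minimal sketch: pretrained ViT feature extractor + pose head."""

    def __init__(self):
        super().__init__()
        # Transfer learning: initialize from ImageNet-pretrained weights,
        # then fine-tune on synthetic imagery only.
        self.backbone = vit_b_16(weights=ViT_B_16_Weights.IMAGENET1K_V1)
        self.backbone.heads = nn.Identity()  # expose the 768-d CLS embedding
        self.head = PoseHead(768)

    def forward(self, x):
        return self.head(self.backbone(x))


if __name__ == "__main__":
    model = MonocularPoseNet().eval()
    frame = torch.rand(1, 3, 224, 224)  # placeholder camera frame
    with torch.no_grad():
        # Augmentation would normally apply during training; shown here
        # only to demonstrate the pipeline end to end.
        t, q = model(augment(frame))
    print("translation:", t.squeeze().tolist())

    # Rough single-image latency check, analogous to comparing inference
    # rate against a navigation filter's update frequency.
    with torch.no_grad():
        start = time.perf_counter()
        for _ in range(10):
            model(frame)
        hz = 10 / (time.perf_counter() - start)
    print(f"approx. inference rate: {hz:.1f} Hz")
```

On flight hardware, the equivalent of the final timing loop would run on the target GPU after any deployment-time optimization, so the figure printed here is only a methodological illustration, not a reproduction of the paper's result.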