Spiking monocular event based 6D pose estimation for space application
SPAICE2024: The First Joint European Space Agency / IAA Conference on AI in and for Space, 394-399, 2024
Main authors: , ,
Format: Article
Language: eng
Subjects:
Online access: Order full text
Abstract: With the growing interest in On-orbit servicing (OOS) and Active Debris Removal (ADR) missions, spacecraft pose estimation algorithms are being developed using deep learning to improve the precision of this complex task and to find the most efficient solution. With the advances of bio-inspired low-power solutions, such as spiking neural networks and event-based processing and cameras, and their recent use for space applications, we propose to investigate the feasibility of a fully event-based solution for spacecraft pose estimation. In this paper, we use the first event-based dataset, SEENIC, with real event frames captured by an event-based camera on a testbed. We present the methods and results of the first event-based solution for this use case, where our small spiking end-to-end network (S2E2) achieves promising results of just over 21 cm position error and 14 degrees rotation error, a first step towards fully event-based processing for embedded spacecraft pose estimation.
DOI: 10.48550/arxiv.2501.02916
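
The abstract quotes the S2E2 results as a position error just over 21 cm and a rotation error of about 14 degrees. The paper's exact evaluation code is not reproduced here; the sketch below only illustrates the standard way such 6D pose errors are commonly computed, as a Euclidean distance on the translation and an angular distance between orientations. The helper names, the quaternion (w, x, y, z) convention, and the toy pose values are assumptions for illustration.

```python
import numpy as np

# Hypothetical helpers: a common (not the paper's confirmed) way to score 6D pose estimates.

def position_error(t_pred, t_true):
    """Euclidean distance between predicted and ground-truth translations (same units, e.g. metres)."""
    return float(np.linalg.norm(np.asarray(t_pred, dtype=float) - np.asarray(t_true, dtype=float)))

def rotation_error_deg(q_pred, q_true):
    """Angular distance in degrees between two unit quaternions given as (w, x, y, z)."""
    q_pred = np.asarray(q_pred, dtype=float); q_pred /= np.linalg.norm(q_pred)
    q_true = np.asarray(q_true, dtype=float); q_true /= np.linalg.norm(q_true)
    # abs() handles the double cover: q and -q encode the same rotation.
    dot = np.clip(abs(np.dot(q_pred, q_true)), 0.0, 1.0)
    return float(np.degrees(2.0 * np.arccos(dot)))

# Toy example with errors of roughly the magnitude quoted in the abstract
# (~21 cm position error, ~14 degree rotation error); the poses are made up.
t_err = position_error([1.00, 0.00, 5.21], [1.00, 0.00, 5.00])
r_err = rotation_error_deg(
    [np.cos(np.radians(7.0)), 0.0, 0.0, np.sin(np.radians(7.0))],  # 14 deg rotation about z
    [1.0, 0.0, 0.0, 0.0],                                          # identity rotation
)
print(f"position error: {t_err:.2f} m, rotation error: {r_err:.1f} deg")
```

Taking the absolute value of the quaternion dot product keeps the reported angle in [0, 180] degrees regardless of which sign convention the predicted quaternion uses.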