FACT: Fast and Active Coordinate Initialization for Vision-Based Drone Swarms

Bibliographic Details
Published in: IEEE Robotics and Automation Letters, Feb. 2025, Vol. 10, No. 2, pp. 931-938
Authors: Li, Yuan; Zhao, Anke; Wang, Yingjian; Xu, Ziyi; Zhou, Xin; Xu, Chao; Zhou, Jinni; Gao, Fei
Format: Article
Language: English
Abstract: Coordinate initialization is the first step in accomplishing collaborative tasks within robot swarms, and it determines the quality of the tasks that follow. However, fast and robust coordinate initialization in vision-based drone swarms remains elusive. To this end, our letter proposes a complete system for initial relative pose estimation, including both relative state estimation and active planning. In particular, our work fuses onboard visual-inertial odometry with vision-based observations that provide bearing and range measurements, which are anonymous, partially mutual, and noisy. It is the first method based on convex optimization to initialize coordinates from vision-based observations. Additionally, we design a lightweight module that actively controls the movement of robots for observation acquisition and collision avoidance. With only stereo cameras and inertial measurement units as sensors, we validate the practicability of our system in simulation and in real-world environments with obstacles and no Global Navigation Satellite System (GNSS) signals. Compared to methods based on local optimization and filters, our system achieves the global optimum for coordinate initialization more stably and quickly, which makes it suitable for robots with size, weight, and power constraints. The source code is released for reference.
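To give a feel for the coordinate-initialization problem the abstract describes, below is a minimal, hypothetical sketch: recovering a planar relative pose between two drones' odometry frames from already matched position observations. This is not the paper's convex formulation (which additionally handles anonymous, partially mutual, noisy bearing/range measurements in 3D); it only illustrates the relax-then-project idea, with the function name and data layout being illustrative assumptions.

```python
import numpy as np

def init_relative_pose_2d(p_obs, q_odom):
    """Recover a planar relative pose (R, t) with p_obs[k] ~ R @ q_odom[k] + t.

    p_obs:  (N, 2) peer positions in our odometry frame, e.g. fused from
            bearing + range detections (assumed already de-anonymized).
    q_odom: (N, 2) the same positions reported in the peer's own frame.

    The rotation is relaxed to a free scaled rotation [[c, -s], [s, c]],
    which makes the fit a linear least-squares problem; (c, s) is then
    projected back onto the unit circle, i.e. onto SO(2).
    """
    p = np.asarray(p_obs, dtype=float)
    q = np.asarray(q_odom, dtype=float)
    n = len(p)
    # Row pair k encodes: p_x = c*q_x - s*q_y + tx,  p_y = s*q_x + c*q_y + ty
    A = np.zeros((2 * n, 4))
    A[0::2] = np.column_stack([q[:, 0], -q[:, 1], np.ones(n), np.zeros(n)])
    A[1::2] = np.column_stack([q[:, 1],  q[:, 0], np.zeros(n), np.ones(n)])
    b = p.reshape(-1)
    c, s, _, _ = np.linalg.lstsq(A, b, rcond=None)[0]
    theta = np.arctan2(s, c)                      # projection onto SO(2)
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
    t = (p - q @ R.T).mean(axis=0)                # re-fit t with projected R
    return R, t
```

Because the relaxed problem is linear, the fit avoids the local minima that iterative pose solvers can fall into, which is the appeal of convex approaches for initialization in the first place.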
ISSN:2377-3766
DOI:10.1109/LRA.2024.3518101