Sensor Fusion for Autonomous Indoor UAV Navigation in Confined Spaces

Bibliographic Details

Published in: arXiv.org, 2024-10
Authors: James, Alice; Seth, Avishkar; Kuantama, Endrowednes; Mukhopadhyay, Subhas; Han, Richard
Format: Article
Language: English
Abstract

In this paper, we address the challenge of navigating unknown indoor environments using autonomous aerial robots within confined spaces. The core of our system is the integration of key sensor technologies, including depth sensing from the ZED 2i camera, IMU data, and LiDAR measurements, facilitated by the Robot Operating System (ROS) and RTAB-Map. Through custom-designed experiments, we demonstrate the robustness and effectiveness of this approach. Our results show promising navigation accuracy, with errors as low as 0.4 m, and mapping quality characterized by a Root Mean Square Error (RMSE) of just 0.13 m. Notably, this performance is achieved while maintaining energy efficiency and balanced resource allocation, addressing a crucial concern in UAV applications. Flight tests further underscore the precision of our system in maintaining desired flight orientations, with an error rate of only 0.1%. This work represents a significant stride in the development of autonomous indoor UAV navigation systems, with potential applications in search and rescue, facility inspection, and environmental monitoring within GPS-denied indoor environments.
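
To make the described integration concrete, the sketch below shows one plausible way to combine the three sensor streams named in the abstract under ROS: a small Python node that approximately time-synchronizes ZED 2i depth images, IMU readings, and LiDAR scans before they would be handed to RTAB-Map. This is a minimal illustration, not the authors' implementation; the topic names, queue size, and slop window are assumptions.

```python
#!/usr/bin/env python
# Minimal sketch (not the paper's code): approximate time-synchronization of
# ZED 2i depth, IMU, and LiDAR streams under ROS, as a front end to RTAB-Map.
# All topic names below are assumptions typical of a ZED 2i + 2D LiDAR setup.
import rospy
import message_filters
from sensor_msgs.msg import Image, Imu, LaserScan

def fused_callback(depth, imu, scan):
    # The three messages arrive with closely aligned timestamps; a full
    # pipeline would forward them to RTAB-Map's odometry and mapping nodes.
    rospy.loginfo("fused frame: depth=%.3f imu=%.3f scan=%.3f",
                  depth.header.stamp.to_sec(),
                  imu.header.stamp.to_sec(),
                  scan.header.stamp.to_sec())

if __name__ == "__main__":
    rospy.init_node("sensor_fusion_sync")
    depth_sub = message_filters.Subscriber(
        "/zed2i/zed_node/depth/depth_registered", Image)  # assumed topic
    imu_sub = message_filters.Subscriber(
        "/zed2i/zed_node/imu/data", Imu)                  # assumed topic
    scan_sub = message_filters.Subscriber("/scan", LaserScan)
    # Pair up messages whose timestamps fall within a 50 ms window.
    sync = message_filters.ApproximateTimeSynchronizer(
        [depth_sub, imu_sub, scan_sub], queue_size=10, slop=0.05)
    sync.registerCallback(fused_callback)
    rospy.spin()
```

In practice, RTAB-Map's own rtabmap_ros nodes perform this kind of topic synchronization internally; the sketch only illustrates the data flow the abstract describes.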
ISSN: 2331-8422
DOI: 10.48550/arxiv.2410.20599