MUN-FRL: A Visual-Inertial-LiDAR Dataset for Aerial Autonomous Navigation and Mapping

Bibliographic Details
Published in: The International Journal of Robotics Research, October 2024, Vol. 43 (12), pp. 1853-1866
Authors: Thalagala, Ravindu G; De Silva, Oscar; Jayasiri, Awantha; Gubbels, Arthur; Mann, George KI; Gosine, Raymond G
Format: Article
Language: English
Description
Abstract: This paper presents a unique outdoor aerial visual-inertial-LiDAR dataset captured using a multi-sensor payload to promote global navigation satellite system (GNSS)-denied navigation research. The dataset features flight distances ranging from 300 m to 5 km, collected using a DJI-M600 hexacopter drone and the National Research Council (NRC) Bell412 Advanced Systems Research Aircraft (ASRA). The dataset consists of hardware-synchronized monocular images, inertial measurement unit (IMU) measurements, 3D light detection and ranging (LiDAR) point clouds, and high-precision real-time kinematic (RTK) GNSS-based ground truth. Nine data sequences were collected as robot operating system (ROS) bags, comprising over 100 minutes of footage of outdoor environments including urban areas, highways, airports, hillsides, prairies, and waterfronts. The dataset was collected to facilitate the development of visual-inertial-LiDAR odometry and mapping algorithms, visual-inertial navigation algorithms, and object detection, segmentation, and landing zone detection algorithms based on real-world drone and full-scale helicopter data. All data sequences contain raw sensor measurements, hardware timestamps, and spatio-temporally aligned ground truth. The intrinsic and extrinsic calibrations of the sensors are also provided, along with raw calibration datasets. A performance summary of state-of-the-art methods applied to the data sequences is also provided.
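
Since the sequences are distributed as ROS bags, one way to get a first look at a downloaded sequence is the ROS 1 rosbag Python API. The sketch below is illustrative only: the file name sequence_01.bag and the IMU topic /imu/data are placeholder assumptions, not names taken from the dataset documentation.

    # Minimal sketch for inspecting a MUN-FRL ROS bag sequence (ROS 1).
    # File name and topic names below are assumed placeholders.
    import rosbag

    BAG_PATH = "sequence_01.bag"  # hypothetical file name

    with rosbag.Bag(BAG_PATH) as bag:
        # List what the bag contains: message count and type per topic.
        info = bag.get_type_and_topic_info()
        for topic, meta in info.topics.items():
            print(f"{topic}: {meta.message_count} msgs of type {meta.msg_type}")

        # Read the first hardware-timestamped IMU message (assumed topic name).
        for topic, msg, t in bag.read_messages(topics=["/imu/data"]):
            print("IMU stamp:", msg.header.stamp.to_sec(),
                  "angular velocity z:", msg.angular_velocity.z)
            break

The same read_messages loop can be pointed at the image and point-cloud topics listed by get_type_and_topic_info to check synchronization against the RTK-GNSS ground truth.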
ISSN: 0278-3649, 1741-3176
DOI: 10.1177/02783649241238358