FIReStereo: Forest InfraRed Stereo Dataset for UAS Depth Perception in Visually Degraded Environments
Saved in:
Main authors: , , , , , , , ,
Format: Article
Language: English
Subjects:
Online access: Order full text
Abstract: Robust depth perception in visually degraded environments is crucial for autonomous aerial systems. Thermal imaging cameras, which capture infrared radiation, are robust to visual degradation. However, due to the lack of a large-scale dataset, the use of thermal cameras for unmanned aerial system (UAS) depth perception has remained largely unexplored. This paper presents a stereo thermal depth perception dataset for autonomous aerial perception applications. The dataset consists of stereo thermal images, LiDAR, IMU, and ground-truth depth maps captured in urban and forest settings under diverse conditions such as day, night, rain, and smoke. We benchmark representative stereo depth estimation algorithms, offering insights into their performance in degraded conditions. Models trained on our dataset generalize well to unseen smoky conditions, highlighting the robustness of stereo thermal imaging for depth perception. We aim for this work to enhance robotic perception in disaster scenarios, allowing for exploration and operations in previously unreachable areas. The dataset and source code are available at https://firestereo.github.io.
DOI: 10.48550/arxiv.2409.07715
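The abstract describes benchmarking stereo depth estimation on rectified thermal image pairs. As a rough illustration of that pipeline, the sketch below runs a classical matcher (OpenCV's StereoSGBM) on a stereo thermal pair and converts disparity to metric depth. The file names, 16-bit-to-8-bit normalization, matcher parameters, focal length, and baseline are all illustrative assumptions, not values or code from the FIReStereo dataset.

```python
# Minimal sketch: classical stereo depth estimation on a rectified thermal pair.
# Paths and calibration constants below are placeholders, not dataset values.
import cv2
import numpy as np

# Load a rectified stereo thermal pair (16-bit radiometric frames assumed).
left_raw = cv2.imread("left_thermal.png", cv2.IMREAD_UNCHANGED)
right_raw = cv2.imread("right_thermal.png", cv2.IMREAD_UNCHANGED)

def to_8bit(img):
    """Normalize a 16-bit thermal frame to 8-bit for the block matcher."""
    img = img.astype(np.float32)
    img -= img.min()
    img /= max(float(img.max()), 1e-6)
    return (img * 255).astype(np.uint8)

left, right = to_8bit(left_raw), to_8bit(right_raw)

# Semi-global block matching; parameters are generic starting points.
matcher = cv2.StereoSGBM_create(
    minDisparity=0,
    numDisparities=128,   # must be divisible by 16
    blockSize=5,
    P1=8 * 5 * 5,
    P2=32 * 5 * 5,
    uniquenessRatio=10,
    speckleWindowSize=100,
    speckleRange=2,
)
# compute() returns fixed-point disparity scaled by 16.
disparity = matcher.compute(left, right).astype(np.float32) / 16.0

# Convert disparity (pixels) to metric depth: depth = f * B / d.
focal_px = 400.0     # focal length in pixels (placeholder)
baseline_m = 0.24    # stereo baseline in meters (placeholder)
valid = disparity > 0
depth = np.zeros_like(disparity)
depth[valid] = focal_px * baseline_m / disparity[valid]
```

The same depth maps could then be compared against the dataset's ground-truth depth to score a matcher; learned stereo networks would replace only the matching step, with the disparity-to-depth conversion unchanged.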