RIDERS: Radar-Infrared Depth Estimation for Robust Sensing
Format: Article
Language: English
Online access: Order full text
Abstract: Dense depth recovery is crucial in autonomous driving, serving as a
foundational element for obstacle avoidance, 3D object detection, and local
path planning. Adverse weather conditions, including haze, dust, rain, snow,
and darkness, introduce significant challenges to accurate dense depth
estimation, thereby posing substantial safety risks in autonomous driving.
These challenges are particularly pronounced for traditional depth estimation
methods that rely on sensors operating at short electromagnetic wavelengths,
such as visible-spectrum cameras and near-infrared LiDAR, owing to their
susceptibility to diffraction noise and occlusion in such environments. To
fundamentally overcome this issue, we present a novel approach to robust metric
depth estimation that fuses a millimeter-wave Radar with a monocular infrared
thermal camera, both of which can penetrate atmospheric particles and are
unaffected by lighting conditions. Our proposed Radar-Infrared fusion method
achieves highly accurate
and finely detailed dense depth estimation in three stages: monocular depth
prediction with global scale alignment, quasi-dense Radar augmentation by
learning Radar-pixel correspondences, and local scale refinement of the dense
depth using a scale map learner. Our method achieves
exceptional visual quality and accurate metric estimation by addressing the
challenges of ambiguity and misalignment that arise from directly fusing
multi-modal long-wave features. We evaluate the performance of our approach on
the NTU4DRadLM dataset and our self-collected challenging ZJU-Multispectrum
dataset. Notably, our method demonstrates unprecedented robustness in smoky
scenarios. Our code will be released at https://github.com/MMOCKING/RIDERS.
DOI: 10.48550/arxiv.2402.02067
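To make the three-stage pipeline described in the abstract concrete, the Python sketch below illustrates only the first stage: aligning a scale-ambiguous monocular depth map to metric scale using sparse Radar returns. It assumes a closed-form least-squares fit of a single global scale factor, which is one common formulation of scale alignment; the function and variable names are illustrative, not the authors' actual API, and the learned stages (quasi-dense Radar augmentation and the scale map learner) are not reproduced here.

import numpy as np

def align_global_scale(mono_depth, radar_uv, radar_depth):
    """Stage-1 sketch: fit one global scale s minimizing ||s * d_mono - d_radar||^2
    over the pixels hit by Radar returns, then apply it to the whole map.

    mono_depth  : (H, W) relative depth predicted from the infrared image
    radar_uv    : (N, 2) integer pixel coordinates (u, v) of projected Radar points
    radar_depth : (N,)   metric ranges measured by the Radar
    """
    # Sample the monocular prediction at the Radar pixels (v indexes rows, u columns).
    d_mono = mono_depth[radar_uv[:, 1], radar_uv[:, 0]]
    valid = (d_mono > 0) & (radar_depth > 0)
    d_m, d_r = d_mono[valid], radar_depth[valid]
    # Closed-form least-squares solution for a scalar scale factor.
    s = float(d_m @ d_r) / float(d_m @ d_m)
    return s * mono_depth  # coarsely metric dense depth

# Toy usage: a uniform 4x4 relative map and two Radar hits at 10 m
# yield s = 20, so every pixel becomes 10 m.
mono = np.full((4, 4), 0.5)
uv = np.array([[1, 2], [3, 0]])
ranges = np.array([10.0, 10.0])
metric_depth = align_global_scale(mono, uv, ranges)

Per the abstract, this coarse global alignment is subsequently refined per pixel: the scale map learner predicts a dense map of local scale corrections, capturing scene-dependent errors that a single scalar factor cannot.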