Complementary Fusion of Camera and LiDAR for Cooperative Object Detection and Localization in Low Contrast Environments at Night Outdoors

Bibliographic Details
Published in: IEEE Transactions on Consumer Electronics, July 2024, pp. 1-1
Main Authors: Liang, Siyuan; Chen, Pengbo; Wu, Shengchen; Cao, Haotong
Format: Article
Language: English
Description
Summary: This study addresses the critical need for accurate outdoor object detection with multi-sensor devices in low-contrast nighttime environments. We focus on enhancing detection reliability by fusing camera and LiDAR data. Although cameras struggle in low-light conditions and LiDAR struggles in low-contrast scenes, our proposed MutualFusion algorithm, built within the TransFusion framework, effectively tackles both issues. Using a bimodal parallel loose-coupling approach, the algorithm transforms and exchanges data between the two sensors, improving the sharing of semantic and spatial information while avoiding negative transfer. Additionally, we refine object detection by selecting sparse camera frames and integrating their sparse instance-level features with LiDAR features in 3D space. Experimental results on the nuScenes dataset, obtained on an NVIDIA RTX 3090, show that MutualFusion outperforms the TransFusion baseline, achieving a 2% mAP increase and a 4% NDS improvement in nighttime scenes. This study demonstrates the potential of camera-LiDAR fusion for challenging outdoor conditions and offers valuable insights for collaborative multi-sensor tracking and localization research.
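The abstract describes a bimodal, loosely coupled interaction in which each modality informs the other without hard dependence. The PyTorch sketch below illustrates one plausible form such mutual, gated fusion of per-query camera and LiDAR features could take; the class name MutualFusionSketch, the gating scheme, and all dimensions are illustrative assumptions, not the paper's actual implementation.

```python
# Minimal sketch of a bimodal loose-coupling fusion, as suggested by the
# abstract. All names, shapes, and the gating design are assumptions.
import torch
import torch.nn as nn


class MutualFusionSketch(nn.Module):
    """Toy camera-LiDAR fusion: two parallel branches modulate each other
    through learned gates, then sparse camera instance features are fused
    with LiDAR features per object query (hypothetical design)."""

    def __init__(self, cam_dim=256, lidar_dim=256, fused_dim=256):
        super().__init__()
        # Per-branch projections into a shared feature space.
        self.cam_proj = nn.Linear(cam_dim, fused_dim)
        self.lidar_proj = nn.Linear(lidar_dim, fused_dim)
        # Gates let each modality scale the other without hard coupling,
        # one conceivable way to limit negative transfer (assumption).
        self.cam_gate = nn.Sequential(nn.Linear(fused_dim, fused_dim), nn.Sigmoid())
        self.lidar_gate = nn.Sequential(nn.Linear(fused_dim, fused_dim), nn.Sigmoid())
        self.head = nn.Linear(2 * fused_dim, fused_dim)

    def forward(self, cam_feats, lidar_feats):
        # cam_feats:   (B, N, cam_dim)   sparse instance-level camera features
        # lidar_feats: (B, N, lidar_dim) LiDAR features gathered at the same
        #              N object queries in 3D space
        c = self.cam_proj(cam_feats)
        l = self.lidar_proj(lidar_feats)
        # Mutual, loosely coupled interaction: each branch is gated by the
        # other branch's features.
        c_out = c * self.lidar_gate(l)
        l_out = l * self.cam_gate(c)
        # Concatenate and project to the fused per-query representation.
        return self.head(torch.cat([c_out, l_out], dim=-1))


if __name__ == "__main__":
    fuser = MutualFusionSketch()
    cam = torch.randn(2, 200, 256)    # e.g., 200 object queries per sample
    lidar = torch.randn(2, 200, 256)
    fused = fuser(cam, lidar)
    print(fused.shape)  # torch.Size([2, 200, 256])
```

The multiplicative gates keep the two branches parallel and separable, so a weak modality (e.g., the camera at night) can be down-weighted per query rather than contaminating the fused representation; this is one reading of how loose coupling could avoid negative transfer, not a claim about the paper's design.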
ISSN: 0098-3063, 1558-4127
DOI: 10.1109/TCE.2024.3436852