Occlusion Is Underrated: An Occlusion-Attention Strategy Assembled in 3-D Object Detectors
Published in: IEEE Sensors Journal, 2024-05, Vol. 24 (10), p. 16502-16509
Main Authors: , , , , ,
Format: Article
Language: English
Subjects:
Online Access: Order full text
Abstract: LiDAR sensors provide rich geometric information for 3-D scene understanding and are widely used as the sole input for 3-D object detection. However, due to the sensor's intrinsic properties, point clouds scanned by LiDAR are always sparse and incomplete, and objects are occluded to varying extents, which deteriorates detection accuracy. Existing methods overlook occlusion or handle it only implicitly. In this article, we emphasize the universality of occlusion in point clouds and propose a novel occlusion-attention strategy that increases a model's sensitivity to occlusion and maintains strong performance in occluded scenes. The proposed method simulates different types and levels of occlusion and explores the relationship between the uncertainty caused by occlusion and the prediction distribution. The major changes are: 1) data augmentation specifically for occluded scenes, forcing the feature extractor to learn efficient features despite the damaged structure, and 2) an uncertainty estimation module that models the prediction as a distribution instead of a deterministic label. We incorporate the proposed methods into several classical 3-D base detectors and demonstrate performance gains on the KITTI dataset, which confirms the distinctive structure of occluded objects and the necessity of uncertainty estimation.
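Since this record only summarizes the method, the following is a minimal sketch of the two ingredients the abstract names, assuming a PyTorch-style detection pipeline; the names `simulate_occlusion`, `UncertaintyHead`, and `gaussian_nll` are hypothetical illustrations and not the authors' released code.

```python
# Minimal sketch of the abstract's two ideas: occlusion-simulating data
# augmentation and an uncertainty head that predicts a distribution.
# All names here are illustrative, not the paper's actual interfaces.
import numpy as np
import torch
import torch.nn as nn

def simulate_occlusion(points: np.ndarray, drop_ratio: float = 0.3) -> np.ndarray:
    """Crudely mimic occlusion by removing all points inside a random
    angular sector of the scan, as seen from the sensor origin.
    points: (N, 3+) array of LiDAR points (x, y, z, ...)."""
    azimuth = np.arctan2(points[:, 1], points[:, 0])   # per-point azimuth angle
    start = np.random.uniform(-np.pi, np.pi)           # random sector start
    width = drop_ratio * 2.0 * np.pi                   # sector width to occlude
    # Wrap-around-safe mask: keep points whose azimuth falls outside the sector.
    rel = (azimuth - start) % (2.0 * np.pi)
    return points[rel > width]

class UncertaintyHead(nn.Module):
    """Predicts each box regression target as a Gaussian (mean, log-variance)
    instead of a deterministic value, so the network can inflate variance
    for heavily occluded objects."""
    def __init__(self, in_channels: int, num_targets: int = 7):
        super().__init__()
        self.mean = nn.Linear(in_channels, num_targets)
        self.log_var = nn.Linear(in_channels, num_targets)  # log sigma^2 for stability

    def forward(self, feats):
        return self.mean(feats), self.log_var(feats)

def gaussian_nll(mean, log_var, target):
    # Standard heteroscedastic regression loss:
    # 0.5 * exp(-log_var) * (target - mean)^2 + 0.5 * log_var
    return (0.5 * torch.exp(-log_var) * (target - mean) ** 2
            + 0.5 * log_var).mean()
```

Predicting a variance alongside each regression target lets training down-weight residuals on heavily occluded boxes, which matches the abstract's point about modeling the prediction as a distribution rather than a deterministic label; the paper's actual augmentation types and loss may differ.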
ISSN: 1530-437X, 1558-1748
DOI: 10.1109/JSEN.2024.3384401