An Occlusion and Noise-Aware Stereo Framework Based on Light Field Imaging for Robust Disparity Estimation

Bibliographic Details
Published in: IEEE Transactions on Computers, 2024-03, Vol. 73 (3), pp. 764-777
Authors: Yang, Da; Cui, Zhenglong; Sheng, Hao; Chen, Rongshan; Cong, Ruixuan; Wang, Shuai; Xiong, Zhang
Format: Article
Language: English
Description
Abstract: Stereo vision is widely studied for depth information extraction. However, occlusion and noise pose significant challenges to traditional methods because photo consistency fails in these regions. In this paper, an occlusion and noise-aware stereo framework named ONAF is proposed to obtain robust depth estimation by integrating the advantages of correspondence cues and refocusing cues from the light field (LF). ONAF consists of two specialized depth cue extractors: a correspondence depth cue extractor (CCE) and a refocusing depth cue extractor (RCE). CCE extracts accurate correspondence depth cues in occluded areas based on multi-direction Ray-Epipolar Plane Images (Ray-EPIs) from the LF, which are more robust than traditional multi-direction EPIs. RCE generates accurate refocusing depth cues in noisy areas, benefiting from a many-to-one integration strategy and the directional perception of texture and occlusion based on multi-direction focal stacks from the LF. An attention mechanism is introduced to complementarily fuse the cues from CCE and RCE and generate optimal depth maps. Experimental results demonstrate the effectiveness of ONAF, which outperforms state-of-the-art disparity estimation methods, especially in occluded and noisy areas.
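
The abstract describes the fusion step only at a high level: attention weights combine the disparity cues produced by CCE and RCE. The sketch below (Python, PyTorch) illustrates one way such an attention-based fusion of two single-channel disparity maps could look; the module name CueFusion, the layer sizes, and the softmax weighting are illustrative assumptions, not the actual layers of ONAF.

import torch
import torch.nn as nn

class CueFusion(nn.Module):
    """Illustrative attention-based fusion of two disparity cue maps (assumed design, not ONAF's layers)."""

    def __init__(self, feat_ch: int = 16):
        super().__init__()
        # Lightweight per-pixel attention: stacked cues in, two normalized weights out.
        self.attention = nn.Sequential(
            nn.Conv2d(2, feat_ch, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(feat_ch, 2, kernel_size=3, padding=1),
            nn.Softmax(dim=1),  # weights over the two cues sum to 1 at every pixel
        )

    def forward(self, corr_cue: torch.Tensor, refocus_cue: torch.Tensor) -> torch.Tensor:
        # corr_cue, refocus_cue: (B, 1, H, W) disparity estimates from the two branches.
        stacked = torch.cat([corr_cue, refocus_cue], dim=1)   # (B, 2, H, W)
        weights = self.attention(stacked)                     # (B, 2, H, W)
        return (weights * stacked).sum(dim=1, keepdim=True)   # (B, 1, H, W) fused disparity

if __name__ == "__main__":
    fusion = CueFusion()
    corr = torch.rand(1, 1, 64, 64)      # stand-in correspondence cue map
    refocus = torch.rand(1, 1, 64, 64)   # stand-in refocusing cue map
    print(fusion(corr, refocus).shape)   # torch.Size([1, 1, 64, 64])

Because the softmax constrains the two per-pixel weights to sum to one, the fused map in this sketch is always a convex combination of the correspondence cue and the refocusing cue, which is one simple way to realize the complementary fusion the abstract mentions.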
ISSN: 0018-9340, 1557-9956
DOI: 10.1109/TC.2023.3343098