Exploring Chromatic Aberration and Defocus Blur for Relative Depth Estimation From Monocular Hyperspectral Image
Published in: IEEE Transactions on Image Processing, 2021, Vol. 30, pp. 4357-4370
Format: Article
Language: English
Online access: Order full text
Abstract: This article investigates spectral chromatic and spatial defocus aberration in a monocular hyperspectral image (HSI) and proposes methods for using these cues for relative depth estimation. The main aim of this work is to develop a framework that explores intrinsic and extrinsic reflectance properties in HSI which are useful for depth estimation. Depth estimation from a monocular image is a challenging task, and the low resolution and noise of hyperspectral data add a further level of difficulty. Our contribution to depth estimation in HSI is threefold. First, we propose that the change in focus across the band images of an HSI, caused by chromatic aberration and band-wise defocus blur, can be integrated for depth estimation; novel methods are developed to estimate sparse depth maps based on different integration models. Second, by adopting manifold learning, an effective objective function is developed that combines all sparse depth maps into a final optimized sparse depth map. Last, a new dense depth map generation approach is proposed, which extrapolates sparse depth cues using material-based properties on a graph Laplacian. Experimental results show that our methods successfully exploit HSI properties to generate depth cues. We also compare our method with state-of-the-art RGB image-based approaches and show that our methods produce better sparse and dense depth maps than the benchmark methods.
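As a rough illustration of the first idea described in the abstract (band-wise defocus blur as a relative depth cue), the sketch below assumes an HSI cube of shape (H, W, B) and uses a simple Laplacian-energy focus measure per band; the function names, the focus measure, and the reliability threshold are illustrative assumptions, not the integration models actually proposed in the article.

```python
import numpy as np
from scipy import ndimage

def bandwise_sharpness(hsi_cube):
    """Per-band local sharpness (Laplacian energy) for an HSI cube of shape (H, W, B)."""
    sharpness = np.empty(hsi_cube.shape, dtype=np.float64)
    for b in range(hsi_cube.shape[2]):
        lap = ndimage.laplace(hsi_cube[:, :, b].astype(np.float64))
        # Smooth the squared Laplacian so the focus measure is aggregated over a local window.
        sharpness[:, :, b] = ndimage.uniform_filter(lap ** 2, size=9)
    return sharpness

def sparse_depth_cue(hsi_cube, reliability_quantile=0.7):
    """Relative depth cue: the band index with maximum sharpness at each pixel,
    kept only where the peak focus measure is strong enough (yielding a sparse map)."""
    s = bandwise_sharpness(hsi_cube)
    best_band = np.argmax(s, axis=2).astype(np.float64)     # proxy for relative depth
    peak = np.max(s, axis=2)
    mask = peak >= np.quantile(peak, reliability_quantile)  # keep confident pixels only
    depth = np.full(hsi_cube.shape[:2], np.nan)
    depth[mask] = best_band[mask]
    return depth, mask

# Example with a synthetic cube (H=64, W=64, B=31 bands).
if __name__ == "__main__":
    cube = np.random.rand(64, 64, 31)
    depth, mask = sparse_depth_cue(cube)
    print("valid depth cues:", int(mask.sum()))
```

The paper's later stages (combining several sparse maps via a manifold-learning objective and densifying them with a material-aware graph Laplacian) are not reproduced here; the snippet only shows how a per-band focus measure can yield sparse relative depth cues.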
ISSN: 1057-7149, 1941-0042
DOI: 10.1109/TIP.2021.3071682