Multiscale and multidirection depth map super resolution with semantic inference
Published in: | IET Image Processing 2023-11, Vol.17 (13), p.3670-3687 |
Main authors: | , , , |
Format: | Article |
Language: | eng |
Subjects: | |
Online access: | Full text |
Abstract: | Depth map super resolution has received much attention in 3D applications due to the limitations of depth sensors. The most important characteristic of a depth map is that objects contain few textures but have clear contours along their boundaries. An efficient image representation should therefore be directional, multiscale, and anisotropic. Based on this observation, we propose a novel multiscale and multidirection depth map super resolution framework with semantic inference to improve the quality of depth maps. In this framework, a multiscale and multidirection depth map contour fusion scheme captures and assembles intrinsic geometrical structures through a multiview non‐subsampled contourlet transform. This scheme not only isolates the discontinuities across contours but also retains the smoothness along them. Semantic inference is also utilized to segment and label the depth map into object‐ and background‐level regions that are coplanar. Furthermore, a semantic‐aware label refinement strategy is introduced to correct the occasional inaccurate labels in the label map, so that each target pixel is upscaled using pixels from the same object or background. Experimental results on benchmark depth map datasets demonstrate that the proposed multiscale and multidirection depth map super resolution framework with semantic inference achieves a significant improvement over state‐of‐the‐art algorithms, both visually and quantitatively. |
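The record contains no code; as a rough illustration of the label‐guided upscaling idea described in the abstract (each target pixel is estimated only from samples belonging to the same object or background), the following minimal NumPy sketch may help. The function name `label_guided_upsample`, the 3×3 low‐resolution neighbourhood, and the plain averaging weights are assumptions made for illustration; the authors' actual method additionally relies on non‐subsampled contourlet fusion and semantic‐aware label refinement, which are not reproduced here.

```python
import numpy as np

def label_guided_upsample(depth_lr, labels_hr, scale):
    """Upscale a low-resolution depth map so that each high-resolution
    pixel is averaged only from low-resolution samples carrying the same
    semantic label (object/background). Illustrative sketch only."""
    h_lr, w_lr = depth_lr.shape
    h_hr, w_hr = h_lr * scale, w_lr * scale
    # Labels of the low-resolution grid, sampled from the HR label map.
    labels_lr = labels_hr[::scale, ::scale]
    depth_hr = np.zeros((h_hr, w_hr), dtype=np.float64)

    for y in range(h_hr):
        for x in range(w_hr):
            # 3x3 low-resolution neighbourhood around the target pixel.
            cy, cx = y // scale, x // scale
            y0, y1 = max(cy - 1, 0), min(cy + 2, h_lr)
            x0, x1 = max(cx - 1, 0), min(cx + 2, w_lr)
            patch = depth_lr[y0:y1, x0:x1]
            patch_labels = labels_lr[y0:y1, x0:x1]
            # Keep only samples from the same object/background region.
            mask = patch_labels == labels_hr[y, x]
            if mask.any():
                depth_hr[y, x] = patch[mask].mean()
            else:
                # Fall back to nearest-neighbour if no label matches.
                depth_hr[y, x] = depth_lr[cy, cx]
    return depth_hr

if __name__ == "__main__":
    depth_lr = np.random.rand(4, 4)
    labels_hr = np.zeros((16, 16), dtype=int)
    labels_hr[:, 8:] = 1  # two coplanar regions (hypothetical label map)
    depth_hr = label_guided_upsample(depth_lr, labels_hr, scale=4)
    print(depth_hr.shape)  # (16, 16)
```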
ISSN: | 1751-9659 1751-9667 |
DOI: | 10.1049/ipr2.12877 |