Is my Depth Ground-Truth Good Enough? HAMMER -- Highly Accurate Multi-Modal Dataset for DEnse 3D Scene Regression
Format: Article
Language: English
Abstract: Depth estimation is a core task in 3D computer vision. Recent methods investigate monocular depth estimation trained with various depth sensor modalities. Every sensor has its advantages and drawbacks caused by the nature of its estimates. In the literature, mostly the mean average error of the depth is investigated, and sensor capabilities are typically not discussed. Indoor environments in particular, however, pose challenges for some devices: textureless regions are problematic for structure from motion, reflective materials are problematic for active sensing, and distances to translucent materials are difficult to measure with existing sensors. This paper proposes HAMMER, a dataset comprising depth estimates from multiple sensors commonly used for indoor depth estimation, namely Time-of-Flight (ToF), stereo, and structured light, together with monocular RGB+P data. We construct highly reliable ground-truth depth maps with the help of 3D scanners and aligned renderings. A popular depth estimator is trained on this data and on the typical depth sensors, and the estimates are extensively analyzed on different scene structures. We notice generalization issues arising from various sensor technologies in household environments with challenging but everyday scene content. HAMMER, which we make publicly available, provides a reliable base to pave the way for targeted depth improvements and sensor fusion approaches.
DOI: 10.48550/arxiv.2205.04565
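The abstract notes that depth estimators are usually evaluated via the mean average error of the predicted depth. As a rough illustration (not code from the paper), the sketch below computes a masked mean absolute depth error; the validity convention (invalid ground-truth pixels encoded as 0), the metric units, and the array shapes are assumptions for the example only.

```python
import numpy as np

def masked_mean_abs_error(pred_depth: np.ndarray, gt_depth: np.ndarray) -> float:
    """Mean absolute depth error over pixels with valid ground truth.

    Assumes invalid ground-truth pixels are encoded as 0, a common
    convention for depth maps; the actual dataset convention may differ.
    """
    valid = gt_depth > 0
    if not np.any(valid):
        raise ValueError("no valid ground-truth pixels")
    return float(np.mean(np.abs(pred_depth[valid] - gt_depth[valid])))

# Example with synthetic data standing in for real sensor and ground-truth maps.
rng = np.random.default_rng(0)
gt = rng.uniform(0.3, 5.0, size=(480, 640)).astype(np.float32)            # metres
pred = gt + rng.normal(0.0, 0.05, size=gt.shape).astype(np.float32)        # noisy estimate
print(f"MAE: {masked_mean_abs_error(pred, gt):.4f} m")
```

In practice such a metric would be reported separately per sensor modality and per scene region (e.g. textureless, reflective, translucent surfaces) to expose the sensor-specific failure modes the paper discusses.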