Depth Augmented Omnidirectional Stereo for 6-DoF VR Photography
Main authors: , , , , ,
Format: Image
Language: English
Online access: Order full text
Abstract: We present an end-to-end pipeline that enables head-motion parallax for omnidirectional stereo (ODS) panoramas. Based on an ODS panorama containing a left and right eye view, our method estimates dense horizontal disparity fields between the stereo image pair. From this, we calculate a depth augmented stereo panorama (DASP) by explicitly reconstructing the scene geometry from the viewing circle corresponding to the ODS representation. The generated DASP representation supports motion parallax within the ODS viewing circle. Our approach operates directly on existing ODS panoramas. The experiments indicate the robustness and versatility of our approach on multiple real-world ODS panoramas. This project has received funding from the European Union's Horizon 2020 research and innovation programme under the Marie Skłodowska-Curie grant agreement No 665992.
DOI: 10.1109/VRW50115.2020.00181
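
The record gives no implementation details, but the depth reconstruction step described in the abstract (resolving horizontal disparity between the left and right ODS views against the viewing circle) can be sketched with the standard ODS geometry: rays are tangent to a circle of radius r (half the interpupillary distance), and a point at distance rho from the circle centre appears with an angular disparity d satisfying sin(d/2) = r/rho. The relation, the function name, and the 64 mm interpupillary distance below are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np

def ods_depth_from_disparity(disparity, ipd=0.064):
    """Convert horizontal ODS disparity (radians) to metric depth.

    Assumes the standard ODS viewing-circle geometry: left/right rays are
    tangent to a circle of radius r = ipd / 2, and a point at distance rho
    from the circle centre yields an angular disparity d with
    sin(d / 2) = r / rho, hence rho = r / sin(d / 2).

    disparity : (H, W) array of horizontal angular disparities in radians.
    ipd       : assumed interpupillary distance in metres (viewing-circle
                diameter); 0.064 m is a common default.
    """
    r = ipd / 2.0
    # Clamp tiny or negative disparities to avoid division by zero;
    # such pixels are effectively at infinity.
    d = np.clip(disparity, 1e-6, np.pi)
    return r / np.sin(d / 2.0)

# Example: a 1-degree disparity corresponds to roughly 3.7 m
# for a 64 mm viewing circle.
depth = ods_depth_from_disparity(np.full((1, 1), np.deg2rad(1.0)))
print(depth)  # ~3.67
```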