Unsupervised cycle optimization learning for single-view depth and camera pose with Kalman filter
Saved in:
Published in: Engineering Applications of Artificial Intelligence, 2021-11, Vol. 106, Article 104488
Main Authors: , , , ,
Format: Article
Language: English
Subjects:
Online Access: Full text
Abstract: This paper presents a general cycle optimization framework with a Kalman filter (KF) module for single-view depth prediction and camera pose estimation. The framework designs a KF module, driven by measurement noise estimated by networks without supervision, to reduce noise in the pose parameters, and optimizes the DepthNet architecture by adding a new upconvolutional module and a decoder structure to overcome gradient locality and adjust the mode of multi-task coupling. All modules are integrated into a cycle optimization strategy, the core of this paper, for overall performance improvement. Experimental results on the KITTI dataset show that the cycle optimization framework greatly improves the performance of the original framework and outperforms other improvements built on the same original framework; single-view depth prediction and camera pose estimation achieve state-of-the-art performance compared with existing methods of the same or comparable structure.
• A cycle optimization framework is used to estimate single-view depth and camera pose.
• A Kalman filter, as an activation module, enhances the performance of all tasks.
• Noise is estimated by using neural networks without supervision in unknown scenes.
• The weights in feature maps are optimized to overcome the gradient locality.
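To make the KF idea in the abstract concrete, the following is a minimal sketch, assuming a scalar Kalman update applied independently to each pose parameter, with a fixed measurement noise variance standing in for what the paper's networks would estimate without supervision. All names here (`kalman_update`, the toy pose measurement) are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def kalman_update(x_prior, p_prior, z, r):
    """One scalar Kalman filter update per pose parameter.

    x_prior, p_prior: prior state estimate and its variance
    z: noisy pose measurement (e.g., a pose network's output)
    r: measurement noise variance (in the paper, estimated by a
       network without supervision; a constant stand-in here)
    """
    k = p_prior / (p_prior + r)           # Kalman gain
    x_post = x_prior + k * (z - x_prior)  # corrected pose parameter
    p_post = (1.0 - k) * p_prior          # reduced uncertainty
    return x_post, p_post

# Toy usage: smooth a 6-DoF pose vector over consecutive frames.
rng = np.random.default_rng(0)
x = np.zeros(6)   # initial pose estimate (tx, ty, tz, rx, ry, rz)
p = np.ones(6)    # initial variance per parameter
for _ in range(10):
    z = rng.normal(0.0, 0.1, size=6)  # stand-in for a noisy network pose
    r = np.full(6, 0.01)              # stand-in for network-predicted noise
    x, p = kalman_update(x, p, z, r)
print(x, p)
```

Because the update is elementwise, the same code filters any number of pose parameters; swapping the constant `r` for a per-frame, network-predicted variance is what would let the filter trust clean measurements more and noisy ones less, which is the effect the abstract attributes to the KF module.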
ISSN: 0952-1976, 1873-6769
DOI: 10.1016/j.engappai.2021.104488