GAMA: Geometric analysis based motion-aware architecture for moving object segmentation
Saved in:
Published in: | Computer vision and image understanding 2023-09, Vol.234, p.103751, Article 103751 |
Main authors: | , , , , |
Format: | Article |
Language: | eng |
Subjects: | |
Online access: | Full text |
Abstract: | Moving object segmentation in real-world scenes is of critical significance for many computer vision applications. However, moving object segmentation faces many challenges: it is difficult to distinguish objects with motion degeneracy, and complex scenes and noisy 2D optical flows also affect the segmentation result. In this paper, to address the difficulties caused by motion degeneracy, we analyze the classic motion degeneracy from a new geometric perspective. To identify objects with motion degeneracy, we propose a reprojection cost and an optical flow contrast cost, which are fed into the network to enrich motion features. Furthermore, a novel geometric constraint called the bidirectional motion constraint is proposed to detect moving objects with weak motion features. To tackle more complex scenes, we also introduce a motion-aware architecture to predict instance masks of moving objects. Extensive experiments are conducted on the KITTI dataset, the JNU-UISEE dataset and the KittiMoSeg dataset, and our proposed method achieves excellent performance.
• We design different motion costs to deal with problems caused by motion degeneracy.
• We propose a bidirectional motion constraint to identify objects with weak motion.
• We introduce a geometric analysis based motion-aware architecture. |
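The abstract does not give the exact formulation of the proposed reprojection cost. As a rough illustration only (not the paper's method), a standard Sampson-distance residual against the fundamental matrix of the camera's ego-motion can flag optical-flow correspondences that violate the epipolar constraint; pixels with large residuals are candidate moving objects, while degenerate motions (e.g. motion along the epipolar line) yield small residuals and need extra cues such as the contrast cost above.

```python
import numpy as np

def sampson_error(F, pts1, pts2):
    """Per-point Sampson approximation of the epipolar reprojection cost.

    F:    (3, 3) fundamental matrix of the estimated camera ego-motion.
    pts1: (N, 2) pixel positions in frame t.
    pts2: (N, 2) matched positions in frame t+1 (e.g. pts1 + optical flow).
    Returns an (N,) array; large values indicate correspondences that are
    inconsistent with the camera motion, i.e. candidate moving points.
    """
    ones = np.ones((pts1.shape[0], 1))
    x1 = np.hstack([pts1, ones])           # (N, 3) homogeneous points
    x2 = np.hstack([pts2, ones])
    Fx1 = x1 @ F.T                         # F @ x1, one row per point
    Ftx2 = x2 @ F                          # F^T @ x2, one row per point
    num = np.sum(x2 * Fx1, axis=1) ** 2    # (x2^T F x1)^2
    den = Fx1[:, 0]**2 + Fx1[:, 1]**2 + Ftx2[:, 0]**2 + Ftx2[:, 1]**2
    return num / den

# Toy example: camera translating along x (F = [t]_x with t = (1, 0, 0)).
F = np.array([[0., 0., 0.],
              [0., 0., -1.],
              [0., 1., 0.]])
pts1 = np.array([[10., 5.], [3., 7.]])
static_flow = pts1 + np.array([2., 0.])    # flow along epipolar lines
moving_flow = pts1 + np.array([0., 2.])    # flow violating epipolarity
print(sampson_error(F, pts1, static_flow))  # ~0 for static scene points
print(sampson_error(F, pts1, moving_flow))  # clearly positive residuals
```

Note that this toy setup also shows the degeneracy the abstract refers to: an object that happens to move parallel to the epipolar lines would produce a near-zero residual too, which is why the paper adds further costs and constraints.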
ISSN: | 1077-3142 1090-235X |
DOI: | 10.1016/j.cviu.2023.103751 |