Geometric and probabilistic image dissimilarity measures for common field of view detection
Main authors: , ,
Format: Conference paper (Tagungsbericht)
Language: English
Subjects:
Online access: Order full text
Abstract: Detecting image pairs with a common field of view is an important prerequisite for many computer vision tasks. Typically, shared local features are used as a criterion for identifying such image pairs. This approach, however, requires a reliable method for matching features, which is a generally difficult problem, especially in situations with a wide baseline or ambiguities in the scene. We propose two new approaches to the common field of view problem. The first is still based on feature matching; instead of requiring a very low false positive rate from the matcher, however, geometric constraints are used to assess candidate matches that may contain many false positives. The second approach avoids hard matching of features entirely by evaluating the entropy of correspondence probabilities. We perform quantitative experiments on three hand-labeled scenes of varying difficulty. In moderately difficult situations with a medium baseline and few ambiguities in the scene, the proposed methods perform on par with the classical matching-based method. On the most challenging scene, with a wide baseline and many ambiguities, the performance of the classical method deteriorates, while our methods are far less affected and still produce good results. Hence, our methods show the best overall performance in a combined evaluation.
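The entropy-based dissimilarity mentioned in the abstract can be sketched as follows. This is a hypothetical illustration, not the paper's actual formulation: it turns pairwise descriptor distances into soft correspondence probabilities via a row-wise softmax (the `beta` sharpness parameter is an assumption) and uses the mean Shannon entropy of those distributions as a dissimilarity score, so that ambiguous correspondences yield high entropy and suggest no common field of view.

```python
import numpy as np

def correspondence_entropy(desc_a, desc_b, beta=10.0):
    """Hypothetical sketch of an entropy-of-correspondence-probabilities
    dissimilarity (details assumed, not taken from the paper).

    desc_a : (n, d) array of feature descriptors from image A
    desc_b : (m, d) array of feature descriptors from image B
    beta   : softmax sharpness (assumed parameter)
    """
    # Pairwise squared descriptor distances, shape (n, m).
    d2 = ((desc_a[:, None, :] - desc_b[None, :, :]) ** 2).sum(-1)
    # Soft correspondence probabilities per feature of A: a row-wise
    # softmax of negative distances. A peaked row means that feature
    # has one clear counterpart in B.
    logits = -beta * d2
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    p = np.exp(logits)
    p /= p.sum(axis=1, keepdims=True)
    # Mean Shannon entropy over A's features: high entropy means
    # ambiguous correspondences, i.e. high image dissimilarity.
    h = -(p * np.log(p + 1e-12)).sum(axis=1)
    return h.mean()
```

No hard match decision is ever made: the score degrades gracefully when many candidate matches are ambiguous, which is where hard matching with a low false positive rate becomes unreliable.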
ISSN: 1063-6919
DOI: 10.1109/CVPR.2009.5206810