Image stitching based on angle-consistent warping
Published in: Pattern Recognition, 2021-09, Vol. 117, p. 107993, Article 107993
Main authors: , , ,
Format: Article
Language: English
Subjects:
Online access: Full text
Summary:

• Angle features are discovered as geometric constraints for image warping.
• A stitching framework incorporating the angles is proposed to improve the alignment of images.
• We reveal that angles can provide guidance for homography estimation just as coordinates do.
• A novel constraint on mesh angles makes stitching results in non-overlapping areas more natural.
• The proposed framework exhibits superior performance, especially in areas with small objects or few feature points.
Many warping methods for image stitching have been proposed to construct panoramic image mosaics free of artifacts. Existing methods rely heavily on coordinate correspondences between keypoints, which may not provide adequate constraints for alignment. In this paper, we discover and employ a new constraint, angle correspondences, to address this problem. The angle of a feature point represents the local directional structure around the point; it extends the point's positional information and is customarily ignored in image stitching. We propose to jointly consider coordinates and angles in keypoint correspondences. This strategy helps generate a correct warp in the overlapping regions of the stitched image. In addition, we propose a novel constraint, mesh angle preservation, to prevent undesired distortion in non-overlapping areas. Experiments on several challenging cases demonstrate that our method yields more accurate results with significantly fewer artifacts than state-of-the-art methods.
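The record contains no code, so the following is a minimal, hypothetical sketch (not the authors' implementation) of the abstract's core idea: fitting a warp to keypoint matches using angle residuals alongside the usual coordinate residuals. It assumes a keypoint's "angle" is its detector orientation (e.g., a SIFT orientation) and substitutes a simple affine warp for the paper's mesh-based model; the function names and the weight `w_ang` are illustrative.

```python
# Hypothetical sketch: joint coordinate-and-angle alignment for a 2D
# affine warp, in the spirit of the paper's angle correspondences.
import numpy as np
from scipy.optimize import least_squares

def warp(params, pts):
    """Apply x' = A x + t, with params = [a11, a12, a21, a22, t1, t2]."""
    A = params[:4].reshape(2, 2)
    t = params[4:]
    return pts @ A.T + t

def residuals(params, src, dst, src_ang, dst_ang, w_ang=1.0):
    A = params[:4].reshape(2, 2)
    # Coordinate residuals: warped source keypoints vs. matched targets.
    r_coord = (warp(params, src) - dst).ravel()
    # Angle residuals: a keypoint's local direction is transported by the
    # warp's Jacobian (here simply A) and compared with the matched
    # keypoint's direction in the target image.
    d = np.stack([np.cos(src_ang), np.sin(src_ang)], axis=1) @ A.T
    warped_ang = np.arctan2(d[:, 1], d[:, 0])
    diff = warped_ang - dst_ang
    r_ang = np.arctan2(np.sin(diff), np.cos(diff))  # wrap to (-pi, pi]
    return np.concatenate([r_coord, w_ang * r_ang])

# Toy data: a known rotation + translation, recovered from 4 matches.
rng = np.random.default_rng(0)
theta = 0.3
A_true = np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])
t_true = np.array([5.0, -2.0])
src = rng.uniform(0, 100, size=(4, 2))
dst = src @ A_true.T + t_true
src_ang = rng.uniform(-np.pi, np.pi, size=4)
dst_ang = src_ang + theta  # orientations rotate with the warp

x0 = np.array([1, 0, 0, 1, 0, 0], dtype=float)
fit = least_squares(residuals, x0, args=(src, dst, src_ang, dst_ang))
print(fit.x.round(3))  # ~ [cos(0.3), -sin(0.3), sin(0.3), cos(0.3), 5, -2]
```

In the paper's full framework, an analogous mesh angle preservation term penalizes changes to the angles of mesh cells so that non-overlapping regions deform naturally; that term is omitted from this sketch for brevity.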
ISSN: 0031-3203, 1873-5142
DOI: 10.1016/j.patcog.2021.107993