Image copy-move forgery passive detection based on improved PCNN and self-selected sub-images
Published in: Frontiers of Computer Science, 2022-08, Vol. 16 (4), p. 164705, Article 164705
Main authors: , ,
Format: Article
Language: English
Online access: Full text
Abstract: Image forgery detection remains a challenging problem. For copy-move forgery, the most common type, the robustness and accuracy of existing detection methods can still be improved. To the best of our knowledge, we are the first to propose an image copy-move forgery passive detection method that combines an improved pulse-coupled neural network (PCNN) with self-selected sub-images. Our method has the following steps. First, contour detection is performed on the input color image, and bounding boxes are drawn around the contours to form suspected forgery sub-images. Second, an improved PCNN extracts features from the sub-images that are invariant to rotation, scaling, noise addition, and similar manipulations. Finally, dual feature matching is used to match the features and locate the forgery regions. Moreover, the self-selected sub-images quickly yield the suspected forgery sub-images and lessen the feature-extraction workload, and the improved PCNN extracts image features with high robustness. Experiments on the standard image forgery datasets CoMoFoD and CASIA verify that the robustness score and accuracy of the proposed method are substantially higher than those of the current best method, making it a more efficient image copy-move forgery passive detection method.
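The abstract outlines a three-step pipeline. To make the idea concrete, below is a minimal Python/OpenCV sketch of such a pipeline. It is not the authors' implementation: the Canny edge detector and its thresholds, the minimum contour area, the simplified PCNN variant (whose per-iteration firing rate serves as a quasi rotation- and scale-invariant "time signature"), all numeric constants, and the plain Euclidean comparison that stands in for the paper's dual feature matching are assumptions for illustration only.

```python
# Illustrative sketch of a contour + PCNN copy-move detection pipeline.
# All parameters and the simplified PCNN model are assumptions, not the
# paper's exact "improved PCNN" or "dual feature matching".
import cv2
import numpy as np

def suspected_subimages(img_bgr, min_area=400):
    """Step 1 (sketch): detect contours and crop their bounding boxes
    as suspected forgery sub-images. min_area is an assumed threshold."""
    gray = cv2.cvtColor(img_bgr, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 100, 200)                 # assumed thresholds
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    boxes = [cv2.boundingRect(c) for c in contours
             if cv2.contourArea(c) >= min_area]
    return [(x, y, w, h, gray[y:y + h, x:x + w]) for x, y, w, h in boxes]

def pcnn_signature(patch, iters=40, beta=0.2, v_f=0.5, v_l=1.0,
                   v_theta=20.0, a_f=0.1, a_l=1.0, a_theta=0.2):
    """Step 2 (sketch): a classic simplified PCNN. The per-iteration
    firing rate forms a 1-D signature that is largely insensitive to
    rotation and scaling of the patch. Constants are assumed."""
    s = patch.astype(np.float64) / 255.0              # external stimulus
    f = np.zeros_like(s)                              # feeding input
    l = np.zeros_like(s)                              # linking input
    y = np.zeros_like(s)                              # pulse output
    theta = np.ones_like(s)                           # dynamic threshold
    w = np.array([[0.5, 1.0, 0.5],
                  [1.0, 0.0, 1.0],
                  [0.5, 1.0, 0.5]])                   # linking weights
    sig = np.empty(iters)
    for n in range(iters):
        fire = cv2.filter2D(y, -1, w)                 # neighbour coupling
        f = np.exp(-a_f) * f + v_f * fire + s
        l = np.exp(-a_l) * l + v_l * fire
        u = f * (1.0 + beta * l)                      # internal activity
        y = (u > theta).astype(np.float64)            # neurons that fire
        theta = np.exp(-a_theta) * theta + v_theta * y
        sig[n] = y.mean()                             # firing rate
    return sig

def match_pairs(subimages, tol=0.05):
    """Step 3 (sketch): flag sub-image pairs with near-identical PCNN
    signatures as candidate copy-move regions. Euclidean distance is a
    stand-in for the paper's dual feature matching."""
    sigs = [pcnn_signature(p) for *_, p in subimages]
    pairs = []
    for i in range(len(sigs)):
        for j in range(i + 1, len(sigs)):
            if np.linalg.norm(sigs[i] - sigs[j]) < tol * len(sigs[i]):
                pairs.append((subimages[i][:4], subimages[j][:4]))
    return pairs
```

Running `suspected_subimages` on an image and passing the result to `match_pairs` returns the bounding boxes of sub-image pairs whose signatures nearly coincide, i.e., candidate copy-move regions. Restricting feature extraction to these self-selected sub-images, rather than to dense blocks over the whole image, is what reduces the workload the abstract mentions.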
ISSN: 2095-2228, 2095-2236
DOI: 10.1007/s11704-021-0450-5