CSC-Net: Cross-Color Spatial Co-Occurrence Matrix Network for Detecting Synthesized Fake Images

Saved in:
Bibliographic details
Published in: IEEE Transactions on Cognitive and Developmental Systems, 2024-02, Vol. 16 (1), pp. 369-379
Main authors: Qiao, Tong; Chen, Yuxing; Zhou, Xiaofei; Shi, Ran; Shao, Hang; Shen, Kunye; Luo, Xiangyang
Format: Article
Language: English
Online access: Order full text
Description
Abstract: Recently, images generated by generative adversarial networks (GANs) have spread across social networks, posing a new challenge to the media-forensics community. Although reliable forensic tools have advanced the detection of GAN-generated images, detection accuracy cannot be guaranteed under malicious post-processing attacks, especially in practical social-network scenarios. In this context, we propose a novel, well-designed deep neural network equipped with handcrafted features to address this problem. In particular, relying on the cross-color spatial co-occurrence matrix (CSCM), discriminative features are extracted after carefully analyzing and selecting the most effective color channels. The fused features are then fed into the deep neural network to train a highly efficient forensic detector. Extensive experimental results empirically verify that in most detection scenarios, our proposed detector outperforms prior art, especially under post-processing attacks. Moreover, we also highlight the relevance of the proposed detector on realistic social-network platforms, and its generalization capability in three different scenarios.
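The abstract does not give the paper's exact CSCM formulation, but the general idea of a cross-color spatial co-occurrence matrix can be illustrated with a minimal sketch: count how often intensity value i in one color channel co-occurs with value j in a second channel at a fixed spatial offset. The function name, offset convention, and default parameters below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def cross_color_cooccurrence(channel_a, channel_b, offset=(0, 1), levels=256):
    """Illustrative cross-color co-occurrence matrix (NOT the paper's exact
    CSCM): counts pairs (channel_a[y, x], channel_b[y + dy, x + dx])."""
    dy, dx = offset
    h, w = channel_a.shape
    # Overlapping views so that pixel (y, x) in a pairs with (y+dy, x+dx) in b.
    a = channel_a[max(0, -dy):h - max(0, dy), max(0, -dx):w - max(0, dx)]
    b = channel_b[max(0, dy):h + min(0, dy), max(0, dx):w + min(0, dx)]
    # Accumulate counts; np.add.at handles repeated index pairs correctly.
    m = np.zeros((levels, levels), dtype=np.int64)
    np.add.at(m, (a.ravel(), b.ravel()), 1)
    return m
```

In practice such matrices are typically computed for several channel pairs and offsets (and often on quantized residuals), then flattened or stacked as the handcrafted feature input to the network.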
ISSN: 2379-8920, 2379-8939
DOI: 10.1109/TCDS.2023.3274450