Fake Face Images Detection and Identification of Celebrities Based on Semantic Segmentation

Bibliographic Details
Published in: IEEE Signal Processing Letters, 2022, Vol. 29, pp. 2018-2022
Authors: Wang, Renying; Yang, Zhen; You, Weike; Zhou, Linna; Chu, Beilin
Format: Article
Language: English
Description
Abstract: Convolutional Neural Network (CNN) based detectors perform well at face manipulation detection but are still limited by redundant information. Some methods focus on the blending boundary to localize manipulated regions, discarding useless information such as the image background. However, these methods still retain deceptive information, such as textureless facial regions, which occupies resources and reduces detection accuracy; they also discard features that are useful for identification. This paper therefore proposes a module that uses semantic segmentation masks to guide detectors to focus on the face. The segmentation masks cover facial features such as hair, eyes, and other important areas, and thus provide high-level semantic features that are effective for face identification. Our method uses the masks as an attention-based data augmentation module that many DeepFake detection models can integrate without difficulty. Experiments on multiple detectors with and without our module demonstrate its effectiveness: it improves CNN-based detectors without modifying their structural design. In particular, the method is well suited to protecting a person of interest against face forgery.
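The mask-guided augmentation idea described in the abstract can be illustrated with a minimal sketch. This is not the paper's implementation: the function name, the use of a binary per-pixel mask, and the simple multiply-out-the-background strategy are all illustrative assumptions about how a semantic mask could steer a detector's attention during training.

```python
import numpy as np

def mask_augment(image, semantic_mask, keep_prob=0.5, rng=None):
    """Hypothetical mask-guided augmentation (illustrative, not the paper's code).

    With probability `keep_prob`, zero out pixels outside the facial
    semantic mask (hair, eyes, etc.) so the detector is trained to
    attend to face regions rather than background or textureless areas.
    """
    rng = rng or np.random.default_rng()
    if rng.random() < keep_prob:
        # Broadcast the HxW binary mask over the channel axis of an HxWxC image.
        return image * semantic_mask[..., None]
    return image

# Toy example: a 4x4 RGB image whose "face" occupies the top-left 2x2 region.
img = np.ones((4, 4, 3), dtype=np.float32)
mask = np.zeros((4, 4), dtype=np.float32)
mask[:2, :2] = 1.0
out = mask_augment(img, mask, keep_prob=1.0)  # keep_prob=1.0 -> always masked
```

Applied stochastically inside a training loop, such an augmentation leaves the detector architecture untouched, which matches the abstract's claim that the module integrates with existing CNN-based detectors without structural changes.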
ISSN: 1070-9908, 1558-2361
DOI: 10.1109/LSP.2022.3205481