Facial deepfake performance evaluation based on three detection tools: MTCNN, Dlib, and MediaPipe
Saved in:
Main authors:
Format: Conference proceedings
Language: English
Online access: Full text
Abstract: DeepFake, DeepFake detection, and face detection technologies are closely related. DeepFake techniques have become a serious threat to celebrities, the general public, and the judiciary, which relies on visual media as evidence in criminal cases. DeepFake detection methods detect DeepFakes through a series of procedures, the first and most significant of which is face detection. Face detection approaches are used to locate faces in photos in several methods, including both DeepFake generation and DeepFake detection. Hence, the first part of this paper describes the concept of face detection and three of its tools: two that are frequently utilized in DeepFake detection (Multi-Task Cascaded Convolutional Networks (MTCNN) and Dlib) and MediaPipe, which has not yet been employed in this field. The performance of these tools is then studied and evaluated through a practical comparison. In addition, a new taxonomy of modern DeepFake detection methods is proposed, based on the features each method extracts. Lastly, photo datasets from DeepFake detection experiments were used to determine which one poses a genuine challenge to the three tools. The results showed that MediaPipe achieves the best accuracy, 99.3%, and that the Open Forensic (OF) dataset is the most challenging, given the many errors the tools generate when working on it.
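The abstract describes a practical accuracy comparison of the three face detectors. As a rough illustration of how such a comparison might be wired up, the Python sketch below runs all three tools on a single image and counts the detections; on a labeled dataset, accuracy would then follow from comparing these counts (or boxes) against ground truth. The package choices (mtcnn, dlib, mediapipe, opencv-python), the file name, and all parameters are assumptions about a typical open-source setup, not the authors' experimental code.

```python
"""Minimal face-detection comparison sketch (illustrative only)."""
import cv2
import dlib
import mediapipe as mp
from mtcnn import MTCNN

IMAGE_PATH = "sample.jpg"  # hypothetical input image

# Load the image once; OpenCV reads BGR, all three detectors expect RGB.
bgr = cv2.imread(IMAGE_PATH)
if bgr is None:
    raise FileNotFoundError(IMAGE_PATH)
rgb = cv2.cvtColor(bgr, cv2.COLOR_BGR2RGB)

# MTCNN: returns a list of dicts with 'box', 'confidence', 'keypoints'.
mtcnn_faces = MTCNN().detect_faces(rgb)
print(f"MTCNN:     {len(mtcnn_faces)} face(s)")

# Dlib: HOG-based frontal face detector; returns a list of rectangles.
dlib_faces = dlib.get_frontal_face_detector()(rgb, 1)  # 1 = upsample once
print(f"Dlib:      {len(dlib_faces)} face(s)")

# MediaPipe: legacy Face Detection solution; model_selection=1 targets
# the full-range model (faces up to ~5 m from the camera).
with mp.solutions.face_detection.FaceDetection(
        model_selection=1, min_detection_confidence=0.5) as fd:
    result = fd.process(rgb)
n_mp = len(result.detections) if result.detections else 0
print(f"MediaPipe: {n_mp} face(s)")
```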
ISSN: 0094-243X, 1551-7616
DOI: 10.1063/5.0213294