Deepfake Labels Restore Reality, Especially for Those Who Dislike the Speaker
Main Authors:
Format: Article
Language: English
Subjects:
Online Access: Order full text
Summary: Deepfake videos create dangerous possibilities for public misinformation. In this experiment (N=204), we investigated whether labeling videos as containing actual or deepfake statements from US President Biden helps participants later differentiate between true and fake information. People accurately recalled 93.8% of deepfake videos and 84.2% of actual videos, suggesting that labeling videos can help combat misinformation. Individuals who identified as Republican and had lower favorability ratings of Biden performed better at distinguishing between actual and deepfake videos, a result explained by the elaboration likelihood model (ELM), which predicts that people who distrust a message source will evaluate the message more critically.
DOI: 10.48550/arxiv.2404.17581