The Longtail Impact of Generative AI on Disinformation: Harmonizing Dichotomous Perspectives



Bibliographic Details
Published in: IEEE Intelligent Systems, 2024-09, Vol. 39 (5), p. 12-19
Authors: Lucas, Jason S., Maung, Barani Maung, Tabar, Maryam, McBride, Keegan, Lee, Dongwon, Murugesan, San
Format: Article
Language: English
Description
Abstract: Generative AI (GenAI) poses significant risks in creating convincing yet factually ungrounded content, particularly in “longtail” contexts of high-impact events and resource-limited settings. While some argue that current disinformation ecosystems naturally limit GenAI’s impact, we contend that this perspective neglects longtail contexts where disinformation consequences are most profound. This article analyzes the potential impact of GenAI-driven disinformation in longtail events and settings, focusing on 1) quantity: its ability to flood information ecosystems during critical events; 2) quality: the challenge of distinguishing authentic content from high-quality GenAI content; 3) personalization: its capacity for precise microtargeting that exploits individual vulnerabilities; and 4) hallucination: the danger of unintentional false information generation, especially in high-stakes situations. We then propose strategies to combat disinformation in these contexts. Our analysis underscores the need for proactive measures to mitigate risks, safeguard social unity, and combat the erosion of trust in the GenAI era, particularly in vulnerable communities and during critical events.
ISSN: 1541-1672, 1941-1294
DOI: 10.1109/MIS.2024.3439109