ChatGPT’s inconsistent moral advice influences users’ judgment


Bibliographic Details
Published in: Scientific Reports, 2023-04, Vol. 13 (1), p. 4569, Article 4569
Main authors: Krügel, Sebastian; Ostermaier, Andreas; Uhl, Matthias
Format: Article
Language: English
Online access: Full text
Abstract: ChatGPT is not only fun to chat with; it also searches for information, answers questions, and gives advice. With consistent moral advice, it could improve users’ moral judgment and decisions. Unfortunately, ChatGPT’s advice is not consistent. Nonetheless, we find in an experiment that it does influence users’ moral judgment, even when they know they are advised by a chatbot, and they underestimate how much they are influenced. Thus, ChatGPT corrupts rather than improves its users’ moral judgment. While these findings call for better design of ChatGPT and similar bots, we also propose training to improve users’ digital literacy as a remedy. Transparency alone, however, is not sufficient to enable the responsible use of AI.
ISSN: 2045-2322
DOI: 10.1038/s41598-023-31341-0