Editorial Commentary: At Present, ChatGPT Cannot Be Relied Upon to Answer Patient Questions and Requires Physician Expertise to Interpret Answers for Patients
Published in: | Arthroscopy 2024-07, Vol.40 (7), p.2080-2082 |
Main authors: | , |
Format: | Article |
Language: | English |
Online access: | Full text |
Abstract: | ChatGPT is designed to provide accurate and reliable information to the best of its abilities based on the data input and knowledge available. Thus, ChatGPT is being studied as a patient information tool. This artificial intelligence (AI) tool has been shown to frequently provide technically correct information, but with limitations. ChatGPT gives different answers to similar questions depending on the prompts, and patients may lack the expertise in prompting ChatGPT to elicit the best answer. (Prompting large language models has been shown to be a skill that can be improved.) Of greater concern, ChatGPT fails to provide sources or references for its answers. At present, ChatGPT cannot be relied upon to address patient questions; in the future, ChatGPT will improve. Today, AI requires physician expertise to interpret AI answers for patients. |
ISSN: | 0749-8063 1526-3231 |
DOI: | 10.1016/j.arthro.2024.02.039 |