Bias Perpetuates Bias: ChatGPT Learns Gender Inequities in Academic Surgery Promotions


Detailed Description

Saved in:
Bibliographic Details
Published in: Journal of surgical education 2024-11, Vol.81 (11), p.1553-1557
Main Authors: Desai, Pooja, Wang, Hao, Davis, Lindy, Ullmann, Timothy M., DiBrito, Sandra R.
Format: Article
Language: English
Subjects:
Online Access: Full text
Description
Summary: •Implicit bias is well documented in medicine, particularly in surgery. •Implicit bias, specifically gender inequity, has previously been observed in letters of recommendation. •Artificial intelligence large language models, such as ChatGPT, echo existing gender inequities when asked to write letters of recommendation for academic promotion in surgery. Gender inequities persist in academic surgery, with implicit bias affecting hiring and promotion at all levels. We hypothesized that generating letters of recommendation for both female and male candidates for academic promotion in surgery using an AI platform, ChatGPT, would elucidate the entrained gender biases already present in the promotion process. Using ChatGPT, we generated 6 letters of recommendation for "a phenomenal surgeon applying for job promotion to associate professor position", specifying "female" or "male" before "surgeon" in the prompt. We compared the 3 "female" letters to the 3 "male" letters for differences in length, language, and tone. Letters written for females averaged 298 words, compared to 314 for males. Female letters more frequently referred to "compassion", "empathy", and "inclusivity", whereas male letters referred to "respect", "reputation", and "skill". These findings highlight the gender bias present in promotion letters generated by ChatGPT, echoing existing literature on real letters of recommendation in academic surgery. Our study suggests that surgeons should use AI tools such as ChatGPT with caution when writing letters of recommendation (LORs) for academic surgery faculty promotion.
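The comparison described in the abstract (average word count plus frequency of gendered keywords across two sets of letters) can be sketched as a small text-analysis script. This is a minimal illustrative sketch, not the authors' actual analysis pipeline: the letter texts below are hypothetical placeholders, and the keyword lists are taken directly from the terms the abstract reports.

```python
# Sketch of the letter-comparison analysis: average word count and
# keyword frequency per letter set. Letter texts here are hypothetical
# placeholders; the study's letters were generated by ChatGPT.
import re
from collections import Counter

# Keyword lists taken from the abstract's reported findings.
FEMALE_TERMS = {"compassion", "empathy", "inclusivity"}
MALE_TERMS = {"respect", "reputation", "skill"}

def analyze(letters):
    """Return (average word count, Counter of tracked keyword hits)."""
    tracked = FEMALE_TERMS | MALE_TERMS
    word_counts = []
    keyword_freq = Counter()
    for text in letters:
        words = re.findall(r"[a-z']+", text.lower())
        word_counts.append(len(words))
        keyword_freq.update(w for w in words if w in tracked)
    return sum(word_counts) / len(word_counts), keyword_freq

# Hypothetical placeholder letters for demonstration only.
female_letters = [
    "Her compassion and empathy set her apart as a surgeon.",
    "She fosters inclusivity and leads with compassion.",
    "A surgeon of remarkable empathy and dedication.",
]
male_letters = [
    "His skill and reputation set him apart as a surgeon.",
    "He commands respect and demonstrates outstanding skill.",
    "A surgeon of remarkable reputation and dedication.",
]

f_avg, f_freq = analyze(female_letters)
m_avg, m_freq = analyze(male_letters)
print(f"female letters: avg {f_avg:.1f} words, keywords {dict(f_freq)}")
print(f"male letters:   avg {m_avg:.1f} words, keywords {dict(m_freq)}")
```

A real replication would feed in the full ChatGPT-generated letters and could extend the keyword sets with validated agentic/communal term lists from the letters-of-recommendation literature.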
ISSN: 1931-7204
1878-7452
DOI:10.1016/j.jsurg.2024.07.023