Informative Artifacts in AI-Assisted Care
| Published in: | The New England Journal of Medicine, 2023-11, Vol. 389 (22), p. 2113-2115 |
|---|---|
| Main authors: | , , |
| Format: | Article |
| Language: | English |
| Subjects: | |
| Online access: | Full text |
| Summary: | To the Editor: Ferryman et al. (Aug. 31 issue)1 acknowledge that the entire health care system suffers from the absence of data on race and ethnicity, particularly for underserved populations. Artificial intelligence (AI) applications that are trained on such health care data sets are inherently biased and likely to accentuate widening health inequities for underrepresented racial and ethnic groups.2,3 When algorithmic bias aligns with current manifestations of injustice, skewed AI tools will lead to greater inequity and discrimination.1-3 The proposal by Ferryman et al.1 that AI-generated patterns be considered as artifacts that provide insight into the societies and institutions that . . . |
| ISSN: | 0028-4793; 1533-4406 |
| DOI: | 10.1056/NEJMc2311525 |