Informative Artifacts in AI-Assisted Care

To the Editor: Ferryman et al. (Aug. 31 issue)1 acknowledge that the entire health care system suffers from the absence of data on race and ethnicity, particularly for underserved populations. Artificial intelligence (AI) applications that are trained on such health care data sets are inherently biased and likely to accentuate widening health inequities for underrepresented racial and ethnic groups.2,3 When algorithmic bias aligns with current manifestations of injustice, skewed AI tools will lead to greater inequity and discrimination.1-3 The proposal by Ferryman et al.1 that AI-generated patterns be considered as artifacts that provide insight into the societies and institutions that . . .

Bibliographic Details
Published in: The New England journal of medicine, 2023-11, Vol. 389 (22), p. 2113-2115
Main Authors: Azizi, Zahra; Vedelli, Jordan K H; Anand, Kanwaljeet J S
Format: Article
Language: English
Subjects: Artificial Intelligence; Bias; Conflicts of interest; Health care; Health disparities; Humans; Minority & ethnic groups; Patient safety
ISSN: 0028-4793
EISSN: 1533-4406
DOI: 10.1056/NEJMc2311525
PMID: 38048204
Publisher: Massachusetts Medical Society
Online Access: Full text