Underdiagnosis bias of artificial intelligence algorithms applied to chest radiographs in under-served patient populations


Bibliographic details

Published in: Nature Medicine, 2021-12, Vol. 27 (12), p. 2176-2182
Authors: Seyyed-Kalantari, Laleh; Zhang, Haoran; McDermott, Matthew B. A.; Chen, Irene Y.; Ghassemi, Marzyeh
Format: Article
Language: English
Online access: Full text
Abstract: Artificial intelligence (AI) systems have increasingly achieved expert-level performance in medical imaging applications. However, there is growing concern that such AI systems may reflect and amplify human bias, and reduce the quality of their performance in historically under-served populations such as female patients, Black patients, or patients of low socioeconomic status. Such biases are especially troubling in the context of underdiagnosis, whereby the AI algorithm would inaccurately label an individual with a disease as healthy, potentially delaying access to care. Here, we examine algorithmic underdiagnosis in chest X-ray pathology classification across three large chest X-ray datasets, as well as one multi-source dataset. We find that classifiers produced using state-of-the-art computer vision techniques consistently and selectively underdiagnosed under-served patient populations and that the underdiagnosis rate was higher for intersectional under-served subpopulations, for example, Hispanic female patients. Deployment of AI systems using medical imaging for disease diagnosis with such biases risks exacerbation of existing care biases and can potentially lead to unequal access to medical treatment, thereby raising ethical concerns for the use of these models in the clinic.

Artificial intelligence algorithms trained using chest X-rays consistently underdiagnose pulmonary abnormalities or diseases in historically under-served patient populations, raising ethical concerns about the clinical use of such algorithms.
DOI: 10.1038/s41591-021-01595-0
Publisher: Nature Publishing Group US (New York)
PMID: 34893776
Rights: The Author(s) 2021; published under the Creative Commons Attribution 4.0 license (http://creativecommons.org/licenses/by/4.0/); open access, free to read
ISSN: 1078-8956
EISSN: 1546-170X
Source: MEDLINE; Nature; Alma/SFX Local Collection
Subjects: 631/114/1305
692/700/1421
Abnormalities
Adolescent
Algorithms
Artificial Intelligence
Bias
Biomedical and Life Sciences
Biomedicine
Cancer Research
Chest
Child
Child, Preschool
Computer vision
Datasets
Datasets as Topic
Ethics
Female
Health care access
Health services
Human bias
Humans
Infant
Infant, Newborn
Infectious Diseases
Lung diseases
Male
Medical imaging
Medical imaging equipment
Medical treatment
Metabolic Diseases
Molecular Medicine
Neurosciences
Patients
Populations
Radiography, Thoracic
Socioeconomics
Subpopulations
Vulnerable Populations
X-rays
Young Adult