Designing medical artificial intelligence for in- and out-groups
Medical artificial intelligence (AI) is expected to deliver worldwide access to healthcare. Through three experimental studies with Chinese and American participants, we tested how the design of medical AI varies between in- and out-groups. Participants adopted the role of a medical AI designer and decided how to develop medical AI for in- or out-groups based on their experimental condition. Study 1 (pre-registered; N = 191) revealed that Chinese participants were less likely to incorporate human doctors' assistance into the medical AI system when targeting patients from the US (i.e., the out-group) than patients from China (i.e., the in-group). Study 2 (N = 190) revealed that US participants were less likely to incorporate human doctors' assistance into the medical AI system when targeting patients from China (i.e., the out-group) than patients from the US (i.e., the in-group). Study 3 revealed that Chinese medical students (N = 160) selected a smaller training database for AI diagnosing diabetic retinopathy among US patients (i.e., the out-group) than among Chinese patients (i.e., the in-group), and this effect was stronger among medical students from higher (vs. lower) socioeconomic backgrounds. This inequity in AI design was mediated by individuals' underestimation of out-group heterogeneity. Overall, our evidence suggests that out-group stereotypes shape the design of medical AI, unwittingly undermining healthcare quality. The current findings underline the need for more robust data on medical AI development and for intervention research addressing healthcare inequity.
Saved in:

Published in: | Computers in human behavior, 2021-11, Vol. 124, p. 106929, Article 106929 |
---|---|
Main authors: | Li, Wanyue; Zhou, Xinyue; Yang, Qian |
Format: | Article |
Language: | English |
Subjects: | Artificial intelligence; Design; Diabetic retinopathy; Experiment; Health care; Health inequity; Heterogeneity; Medical artificial intelligence design; Medical research; Medical students; Out-group homogeneity effect; Patients; Physicians; Students |
Online access: | Full text |
DOI: | 10.1016/j.chb.2021.106929 |
ISSN: | 0747-5632 |
EISSN: | 1873-7692 |
Publisher: | Elmsford: Elsevier Ltd |
Source: | Elsevier ScienceDirect Journals |
Highlights:

• Medical artificial intelligence (AI) can deliver worldwide access to healthcare.
• In three studies, we addressed how designing medical AI varies between in- and out-groups.
• We examined how non-medical and medical people vary in designing medical AI for in- and out-groups.
• Out-group stereotypes shape the design of medical AI.
• This health inequity has implications for AI stakeholders and health researchers.