Transformers and large language models in healthcare: A review


Bibliographic Details
Published in: Artificial intelligence in medicine, 2024-08, Vol. 154, p. 102900, Article 102900
Authors: Nerella, Subhash; Bandyopadhyay, Sabyasachi; Zhang, Jiaqing; Contreras, Miguel; Siegel, Scott; Bumin, Aysegul; Silva, Brandon; Sena, Jessica; Shickel, Benjamin; Bihorac, Azra; Khezeli, Kia; Rashidi, Parisa
Format: Article
Language: English
Online access: Full text
Description: With Artificial Intelligence (AI) increasingly permeating various aspects of society, including healthcare, the adoption of the Transformer neural network architecture is rapidly changing many applications. The Transformer is a type of deep learning architecture initially developed to solve general-purpose Natural Language Processing (NLP) tasks and subsequently adapted in many fields, including healthcare. In this survey paper, we provide an overview of how this architecture has been adopted to analyze various forms of healthcare data, including clinical NLP, medical imaging, structured Electronic Health Records (EHR), social media, bio-physiological signals, and biomolecular sequences. Furthermore, we also include articles that used the transformer architecture for generating surgical instructions and predicting adverse outcomes after surgery, under the umbrella of critical care. Under diverse settings, these models have been used for clinical diagnosis, report generation, data reconstruction, and drug/protein synthesis. Finally, we also discuss the benefits and limitations of using transformers in healthcare and examine issues such as computational cost, model interpretability, fairness, alignment with human values, ethical implications, and environmental impact.
Highlights:
• Transformers in clinical NLP, Electronic Health Records, and social media data
• Transformers in medical imaging (image segmentation, registration, captioning, synthesis)
• Transformers for analyzing bio-signals (human activity, EEG, ECG) and biomolecular sequences
• Detailed explanation of the basic Transformer architecture and some popular variants
• Discussion of computational costs and the necessity of AI alignment
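The description centers on the Transformer architecture. As a quick illustration (not drawn from the reviewed article itself), here is a minimal NumPy sketch of scaled dot-product attention, the operation at the core of every Transformer; all names in it are illustrative:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                 # (n_q, n_k) similarity matrix
    scores -= scores.max(axis=-1, keepdims=True)    # subtract row max for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax: each row sums to 1
    return weights @ V                              # each output row is a weighted mix of value rows

# Self-attention over 3 toy token embeddings of dimension 4 (Q = K = V = X)
rng = np.random.default_rng(0)
X = rng.normal(size=(3, 4))
out = scaled_dot_product_attention(X, X, X)
print(out.shape)  # (3, 4)
```

In a full Transformer, Q, K, and V are learned linear projections of the input, and several such attention "heads" run in parallel; the sketch omits those details to show only the core computation.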
DOI: 10.1016/j.artmed.2024.102900
Publisher: Elsevier B.V (Netherlands)
PMID: 38878555
ISSN: 0933-3657
EISSN: 1873-2860
Source: MEDLINE; Elsevier ScienceDirect Journals
Subjects: Artificial Intelligence
Deep Learning
Delivery of Health Care - organization & administration
Electronic Health Records
Healthcare
Humans
Large Language Models
Medical Imaging
Natural Language Processing
Neural Networks, Computer
Transformers