Patient contrastive learning: A performant, expressive, and practical approach to electrocardiogram modeling

Supervised machine learning applications in health care are often limited due to a scarcity of labeled training data. To mitigate the effect of small sample size, we introduce a pre-training approach, Patient Contrastive Learning of Representations (PCLR), which creates latent representations of electrocardiograms (ECGs) from a large number of unlabeled examples using contrastive learning.

Bibliographic Details
Published in: PLoS Computational Biology, 2022-02, Vol. 18 (2), p. e1009862
Main authors: Diamant, Nathaniel; Reinertsen, Erik; Song, Steven; Aguirre, Aaron D; Stultz, Collin M; Batra, Puneet
Format: Article
Language: English
Online access: Full text
Description: Supervised machine learning applications in health care are often limited due to a scarcity of labeled training data. To mitigate the effect of small sample size, we introduce a pre-training approach, Patient Contrastive Learning of Representations (PCLR), which creates latent representations of electrocardiograms (ECGs) from a large number of unlabeled examples using contrastive learning. The resulting representations are expressive, performant, and practical across a wide spectrum of clinical tasks. We develop PCLR using a large health care system with over 3.2 million 12-lead ECGs and demonstrate that training linear models on PCLR representations achieves a 51% performance increase, on average, over six training set sizes and four tasks (sex classification, age regression, and the detection of left ventricular hypertrophy and atrial fibrillation), relative to training neural network models from scratch. We also compared PCLR to three other ECG pre-training approaches (supervised pre-training, unsupervised pre-training with an autoencoder, and pre-training using a contrastive multi-ECG-segment approach), and show significant performance benefits in three out of four tasks. We found an average performance benefit of 47% over the other models and an average 9% performance benefit compared to the best model for each task. We release PCLR to enable others to extract ECG representations at https://github.com/broadinstitute/ml4h/tree/master/model_zoo/PCLR.
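The following is a minimal, illustrative sketch of the patient-contrastive idea described above: an NT-Xent-style objective in which the positive pair for each ECG embedding is another ECG recorded from the same patient, and all other ECGs in the batch serve as negatives. This is not the authors' released ml4h implementation; the encoder, batch construction, embedding size, and temperature below are hypothetical placeholders.

```python
# Sketch only: simplified patient-contrastive (NT-Xent-style) loss.
# z_a[i] and z_b[i] are assumed to be embeddings of two different ECGs
# from the same (i-th) patient, produced by some encoder not shown here.
import numpy as np

def patient_contrastive_loss(z_a: np.ndarray, z_b: np.ndarray,
                             temperature: float = 0.1) -> float:
    # L2-normalize so dot products are cosine similarities.
    z_a = z_a / np.linalg.norm(z_a, axis=1, keepdims=True)
    z_b = z_b / np.linalg.norm(z_b, axis=1, keepdims=True)

    n = z_a.shape[0]
    z = np.concatenate([z_a, z_b], axis=0)        # (2n, d)
    sim = z @ z.T / temperature                   # pairwise similarities
    np.fill_diagonal(sim, -np.inf)                # exclude self-similarity

    # The positive for row i is row i + n (and vice versa).
    pos = np.concatenate([np.arange(n) + n, np.arange(n)])

    # Cross-entropy of each positive against all other entries in its row.
    logits = sim - sim.max(axis=1, keepdims=True)  # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return float(-log_prob[np.arange(2 * n), pos].mean())

# Toy usage with random arrays standing in for encoder outputs.
rng = np.random.default_rng(0)
z_a, z_b = rng.normal(size=(32, 320)), rng.normal(size=(32, 320))
print(patient_contrastive_loss(z_a, z_b))
```

Downstream, the evaluation protocol described in the abstract corresponds to freezing the pre-trained encoder and fitting linear models (for example, logistic regression for atrial fibrillation detection or linear regression for age) on the extracted representations; the released model at the GitHub URL above can be used to extract such representations.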
DOI: 10.1371/journal.pcbi.1009862
ISSN: 1553-734X, 1553-7358
EISSN: 1553-7358
Source: MEDLINE; DOAJ Directory of Open Access Journals; Elektronische Zeitschriftenbibliothek; PubMed Central; Public Library of Science (PLoS)
Subjects:
Analysis
Atrial Fibrillation
Datasets
Deep learning
EKG
Electrocardiogram
Electrocardiography
Fibrillation
Health care
Heart
Humans
Hypertrophy
Machine learning
Neural networks
Neural Networks, Computer
Patients
Representations
Supervised Machine Learning
Training
Ventricle