Multi-region radiomics for artificially intelligent diagnosis of breast cancer using multimodal ultrasound

The ultrasound (US) diagnosis of breast cancer is usually based on a single region of a whole breast tumor from a single ultrasonic modality, which limits diagnostic performance. Multiple regions on multimodal US images of breast tumors may all carry useful diagnostic information. This study aimed to propose a multi-region radiomics approach with multimodal US for artificially intelligent diagnosis of malignant and benign breast tumors. First, radiomics features were extracted from five regions of interest (ROIs) on B-mode US and contrast-enhanced ultrasound (CEUS) images, including intensity statistics, gray-level co-occurrence matrix texture features and binary texture features. The ROIs included the whole tumor region, the strongest perfusion region, the marginal region and the surrounding region. Second, a deep neural network composed of the point-wise gated Boltzmann machine and the restricted Boltzmann machine was adopted to comprehensively learn and select features. Third, a support vector machine was used for classification between benign and malignant breast tumors. Finally, five single-region classification models were generated from the five ROIs and fused to form an integrated classification model. Experimental evaluation was conducted on multimodal US images from 187 patients with breast tumors (68 malignant and 119 benign). Under five-fold cross-validation, the classification accuracy, sensitivity, specificity, Youden's index and area under the receiver operating characteristic curve (AUC) of the model were 87.1% ± 3.3%, 77.4% ± 11.8%, 92.4% ± 7.2%, 69.8% ± 8.6% and 0.849 ± 0.043, respectively. The model was significantly better than single-region, single-modal methods in terms of AUC and accuracy (p < 0.05). In addition to the whole tumor region, the strongest perfusion region, the marginal region and the surrounding region on US images can assist breast cancer diagnosis. The multi-region multimodal radiomics model achieved the best classification results and would be potentially useful for clinical diagnosis of breast cancer.
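The pipeline described above extracts gray-level co-occurrence matrix (GLCM) texture features from each ROI. As an illustrative sketch only, not the authors' implementation: the function name `glcm_features`, the 8-level quantization and the single horizontal offset are assumptions, and real radiomics toolkits aggregate many more offsets and descriptors.

```python
import numpy as np

def glcm_features(roi, levels=8):
    """Toy gray-level co-occurrence matrix (GLCM) over horizontal
    neighbors at distance 1, with a few Haralick-style descriptors."""
    # Quantize the ROI intensities to a small number of gray levels.
    q = (roi.astype(float) / roi.max() * (levels - 1)).astype(int)
    glcm = np.zeros((levels, levels))
    # Count co-occurrences of each (left pixel, right pixel) level pair.
    for i, j in zip(q[:, :-1].ravel(), q[:, 1:].ravel()):
        glcm[i, j] += 1
    glcm /= glcm.sum()  # normalize to a joint probability table
    ii, jj = np.meshgrid(np.arange(levels), np.arange(levels), indexing="ij")
    return {
        "contrast": float(((ii - jj) ** 2 * glcm).sum()),
        "energy": float((glcm ** 2).sum()),
        "homogeneity": float((glcm / (1.0 + np.abs(ii - jj))).sum()),
    }

# Example on a random stand-in for a B-mode ultrasound ROI.
rng = np.random.default_rng(0)
roi = rng.integers(1, 256, size=(32, 32))
feats = glcm_features(roi)
```

A perfectly uniform region yields zero contrast and maximal energy and homogeneity, which is a quick sanity check for this kind of texture descriptor.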

Full description

Bibliographic details
Published in: Computers in biology and medicine, 2022-10, Vol. 149, p. 105920, Article 105920
Main authors: Xu, Zhou; Wang, Yuqun; Chen, Man; Zhang, Qi
Format: Article
Language: English
Online access: Full text
DOI: 10.1016/j.compbiomed.2022.105920
Publisher: Elsevier Ltd (Oxford)
Rights: 2022 Elsevier Ltd
ORCID iDs: 0000-0001-7041-643X; 0000-0003-1188-9362; 0000-0001-6834-3331

Highlights:
• We propose an AI-based diagnosis system with multi-region multimodal radiomics to diagnose breast cancer.
• Our model is significantly better than single-region single-modal methods in terms of AUC and accuracy.
• Combining multi-regional radiomics features from multimodal images of breast tumors improves diagnosis accuracy.
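The reported Youden's index follows directly from sensitivity and specificity (J = Se + Sp − 1). As a minimal, illustrative sketch (not the paper's evaluation code; the function name `diagnostic_metrics` and the example counts are assumptions), these diagnostic metrics can be computed from confusion-matrix counts:

```python
def diagnostic_metrics(tp, fn, tn, fp):
    """Confusion-matrix counts -> sensitivity, specificity,
    accuracy and Youden's index."""
    sensitivity = tp / (tp + fn)            # true positive rate
    specificity = tn / (tn + fp)            # true negative rate
    accuracy = (tp + tn) / (tp + fn + tn + fp)
    youden = sensitivity + specificity - 1.0
    return {"sensitivity": sensitivity, "specificity": specificity,
            "accuracy": accuracy, "youden": youden}
```

The reported figures are internally consistent: with mean sensitivity 77.4% and specificity 92.4%, J = 0.774 + 0.924 − 1 = 0.698, matching the reported Youden's index of 69.8%.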
ISSN: 0010-4825
EISSN: 1879-0534
Source: Elsevier ScienceDirect Journals Complete; ProQuest Central UK/Ireland
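The abstract describes fusing five single-region classification models into one integrated model, but this record does not specify the fusion rule; the probability-averaging ("soft fusion") scheme below, including the name `fuse_predictions` and the 0.5 decision threshold, is therefore a purely hypothetical illustration of one common choice.

```python
def fuse_predictions(region_probs, threshold=0.5):
    """Soft fusion: average the malignancy probabilities from the
    per-ROI models, then threshold the mean for a final label."""
    fused = sum(region_probs) / len(region_probs)
    label = "malignant" if fused >= threshold else "benign"
    return fused, label

# e.g. hypothetical outputs of five single-region models
prob, label = fuse_predictions([0.9, 0.8, 0.7, 0.6, 0.9])
```

Averaging lets regions with weak individual evidence (such as the surrounding region) still nudge the integrated decision, which is one rationale for fusing complementary ROIs.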
Subjects:
Accuracy
Artificial neural networks
Breast cancer
Classification
Computer-aided diagnosis (CAD)
Deep learning
Diagnosis
Feature extraction
Image classification
Image contrast
Image enhancement
Mammography
Medical diagnosis
Medical imaging
Multimodal
Neural networks
Perfusion
Point-wise gated deep network (PGDN)
Radiomics
Regions
Support vector machines
Texture
Tumors
Ultrasonic imaging
Ultrasound
Women's health