A medical percussion instrument using a wavelet-based method for archivable output and automatic classification


Bibliographic details
Published in: Computers in biology and medicine, 2020-12, Vol. 127, Article 104100, p. 104100
Authors: Ayodele, K.P., Ogunlade, O., Olugbon, O.J., Akinwale, O.B., Kehinde, L.O.
Format: Article
Language: English
Online access: Full text
Abstract: There is no standard instrument for carrying out medical percussion even though the procedure has been in continuous use since 1761. This study developed one such instrument. It generates medical percussion sounds in a reproducible manner and accurately classifies them into one of three classes. Percussion signals were generated using a push-pull solenoid plessor applying mechanical impulses through a polyvinyl chloride plessimeter. Signals were acquired using a National Instruments USB 6251 data acquisition card at a rate of 8.192 kHz through an air-coupled omnidirectional electret microphone located 60 mm from the impact site. Signal acquisition, processing, and classification were controlled by an NVIDIA Jetson TX2 computational device. A complex Morlet wavelet was selected as the base wavelet for the wavelet decomposition using the maximum wavelet energy method. It was also used to generate a scalogram suitable for manual or automatic classification. Automatic classification was achieved using a MobileNetV2 convolutional neural network with 17 inverted residual layers on the basis of 224 × 224 × 1 images generated by downsampling each scalogram. Testing was carried out using five human subjects, with impulses applied at three thoracic sites each to elicit dull, resonant, and tympanic signals respectively. Classifier training utilized the Adam algorithm with a learning rate of 0.001 and first and second moments of 0.9 and 0.999 respectively, for 100 epochs with early stopping. Mean subject-specific validation and test accuracies of 95.9±1.6% and 93.8±2.3% respectively were obtained, along with cross-subject validation and test accuracies of 94.9% and 94.0% respectively. These results compare very favorably with previously reported systems for automatic generation and classification of percussion sounds.

Highlights:
• A system has been developed for accurate, reproducible medical percussography.
• A complex Morlet base wavelet leads to an output that can be interpreted manually or automatically.
• A classifier based on the standard MobileNetV2 architecture was used for automatic classification.
• Mean subject-specific validation accuracy of 95.9±1.6% and subject-specific test accuracy of 93.8±2.3% were achieved.
• These are higher than classification accuracies reported by previous studies.
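The abstract describes the processing chain only at a high level. As an illustration (not the authors' code), the sketch below shows how a complex Morlet scalogram of a percussion recording sampled at 8.192 kHz could be computed with PyWavelets and downsampled to a 224 × 224 image; the wavelet bandwidth and centre-frequency parameters, the scale range, and the synthetic test signal are assumptions, since the paper's exact settings are not reproduced here.

```python
# Illustrative sketch (not the authors' code): complex Morlet magnitude
# scalogram of a percussion signal sampled at 8.192 kHz, downsampled to
# a 224 x 224 image. Wavelet parameters and scale range are assumptions.
import numpy as np
import pywt
from scipy.ndimage import zoom

FS = 8192.0  # sampling rate quoted in the abstract (Hz)

def percussion_scalogram(signal, n_scales=128, img_size=224):
    """Return the |CWT| scalogram of `signal`, resized to img_size x img_size."""
    scales = np.arange(1, n_scales + 1)
    # 'cmor1.5-1.0' = complex Morlet, bandwidth 1.5, centre frequency 1.0 (assumed)
    coefs, _freqs = pywt.cwt(signal, scales, 'cmor1.5-1.0', sampling_period=1.0 / FS)
    scalogram = np.abs(coefs)                 # magnitude scalogram
    scalogram /= scalogram.max() + 1e-12      # normalise to [0, 1]
    return zoom(scalogram, (img_size / scalogram.shape[0],
                            img_size / scalogram.shape[1]))

# Example: a 0.25 s synthetic damped sinusoid standing in for a recorded tap
t = np.arange(int(0.25 * FS)) / FS
x = np.exp(-30 * t) * np.sin(2 * np.pi * 180 * t)
img = percussion_scalogram(x)                 # shape (224, 224)
```

Likewise, a minimal Keras sketch of a three-class classifier built on the standard MobileNetV2 architecture, trained with the quoted Adam settings (learning rate 0.001, first and second moments 0.9 and 0.999, up to 100 epochs with early stopping), might look as follows. The dataset objects, label encoding, and early-stopping patience are assumptions, not details taken from the paper.

```python
# Illustrative sketch (not the authors' implementation): MobileNetV2 on
# single-channel 224 x 224 scalogram images, three output classes
# (dull, resonant, tympanic), trained from scratch.
import tensorflow as tf

model = tf.keras.applications.MobileNetV2(
    input_shape=(224, 224, 1),  # single-channel scalogram input
    weights=None,               # no ImageNet weights; trained from scratch
    include_top=True,
    classes=3)                  # standard architecture: 17 inverted residual blocks

model.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=1e-3, beta_1=0.9, beta_2=0.999),
    loss='sparse_categorical_crossentropy',   # assumes integer class labels
    metrics=['accuracy'])

early_stop = tf.keras.callbacks.EarlyStopping(
    monitor='val_loss', patience=10, restore_best_weights=True)  # patience assumed

# train_ds / val_ds are hypothetical tf.data datasets of (scalogram, label) pairs:
# model.fit(train_ds, validation_data=val_ds, epochs=100, callbacks=[early_stop])
```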
DOI: 10.1016/j.compbiomed.2020.104100
fullrecord <record><control><sourceid>proquest_cross</sourceid><recordid>TN_cdi_proquest_miscellaneous_2459625419</recordid><sourceformat>XML</sourceformat><sourcesystem>PC</sourcesystem><els_id>S0010482520304315</els_id><sourcerecordid>2459625419</sourcerecordid><originalsourceid>FETCH-LOGICAL-c402t-c648c9988a00725ab0fcae92eb8c120e8898bb646bec6a8f8cfaefef6e36069b3</originalsourceid><addsrcrecordid>eNqFkc1u1DAURi0EokPbV0CW2LDJcO04jrMsFRSkSmxgbdnODfUoiYN_BvH2eDStKrFhZck-33evfAihDPYMmPxw2LuwbNaHBcc9B366FgzgBdkx1Q8NdK14SXYADBqheHdB3qR0AAABLbwmF23LesYH2JFwQ2uHd2amG0ZXUvJhpX5NOZYF10xL8utPauhvc8QZc2NNwrFm8kMY6RQiNdE9-KOxM9JQ8lYyNetITclhMdk76mZTS6c6ItfqK_JqMnPC68fzkvz4_On77Zfm_tvd19ub-8YJ4LlxUig3DEoZgJ53xsLkDA4crXKMAyo1KGulkBadNGpSbjI44SSxlSAH216S9-feLYZfBVPWi08O59msGErSXHSD5J1gQ0Xf_YMeQolr3a5SPROC91JWSp0pF0NKESe9Rb-Y-Ecz0Ccp-qCfpeiTFH2WUqNvHwcUe3p7Cj5ZqMDHM4D1R44eo07O4-qqmIgu6zH4_0_5CzQ6pLU</addsrcrecordid><sourcetype>Aggregation Database</sourcetype><iscdi>true</iscdi><recordtype>article</recordtype><pqid>2471442766</pqid></control><display><type>article</type><title>A medical percussion instrument using a wavelet-based method for archivable output and automatic classification</title><source>Elsevier ScienceDirect Journals</source><creator>Ayodele, K.P. ; Ogunlade, O. ; Olugbon, O.J. ; Akinwale, O.B. ; Kehinde, L.O.</creator><creatorcontrib>Ayodele, K.P. ; Ogunlade, O. ; Olugbon, O.J. ; Akinwale, O.B. ; Kehinde, L.O.</creatorcontrib><description>There is no standard instrument for carrying out medical percussion even though the procedure has been in continuous use since 1761. This study developed one such instrument. It generates medical percussion sounds in a reproducible manner and accurately classifies them into one of three classes. Percussion signals were generated using a push-pull solenoid plessor applying mechanical impulses through a polyvinyl chloride plessimeter. Signals were acquired using a National Instruments USB 6251 data acquisition card at a rate of 8.192 kHz through an air-coupled omnidirectional electret microphone located 60 mm from the impact site. Signal acquisition, processing, and classification were controlled by an NVIDIA Jetson TX2 computational device. A complex Morlet wavelet was selected as the base wavelet for the wavelet decomposition using the maximum wavelet energy method. It was also used to generate a scalogram suitable for manual or automatic classification. Automatic classification was achieved using a MobileNetv2 convolutional neural network with 17 inverted residual layers on the basis of 224 × 224 x 1 images generated by downsampling each scalogram. Testing was carried out using five human subjects with impulses applied at three thoracic sites each to elicit dull, resonant, and tympanic signals respectively. Classifier training utilized the Adam algorithm with a learning rate of 0.001, and first and second moments of 0.9 and 0.999 respectively for 100 epochs, with early stopping. Mean subject-specific validation and test accuracies of 95.9±1.6% and 93.8±2.3% respectively were obtained, along with cross-subject validation and test accuracies of 94.9% and 94.0% respectively. These results compare very favorably with previously-reported systems for automatic generation and classification of percussion sounds. 
•A system has been developed for accurate, reproducible medical percussography.•A complex Morlet base wavelet leads to an output that can be interpreted manually or automatically.•A classifier based on standard MobileNetv2 architecture was used for automatic classification.•Mean subject-specific validation accuracy of 95.9±1.6% and subject-specific test accuracy of 93.8±2.3% were achieved.•These are higher than classification accuracies reported by previous studies.</description><identifier>ISSN: 0010-4825</identifier><identifier>EISSN: 1879-0534</identifier><identifier>DOI: 10.1016/j.compbiomed.2020.104100</identifier><identifier>PMID: 33171290</identifier><language>eng</language><publisher>United States: Elsevier Ltd</publisher><subject>Abdomen ; Acoustics ; Algorithms ; Artificial neural networks ; Classification ; Computer applications ; Convolutional neural network ; Data acquisition ; Energy methods ; Impulses ; Information processing ; Lungs ; Machine learning ; Medical percussion ; Methods ; MobileNetV2 ; Morlet wavelet ; Neural networks ; Percussion ; Percussograph ; Polyvinyl chloride ; Propagation ; Scalogram ; Signal classification ; Signal processing ; Solenoids ; Sound ; Thorax ; Wavelet analysis ; X-rays</subject><ispartof>Computers in biology and medicine, 2020-12, Vol.127, p.104100-104100, Article 104100</ispartof><rights>2020 Elsevier Ltd</rights><rights>Copyright © 2020 Elsevier Ltd. All rights reserved.</rights><rights>2020. Elsevier Ltd</rights><lds50>peer_reviewed</lds50><woscitedreferencessubscribed>false</woscitedreferencessubscribed><citedby>FETCH-LOGICAL-c402t-c648c9988a00725ab0fcae92eb8c120e8898bb646bec6a8f8cfaefef6e36069b3</citedby><cites>FETCH-LOGICAL-c402t-c648c9988a00725ab0fcae92eb8c120e8898bb646bec6a8f8cfaefef6e36069b3</cites></display><links><openurl>$$Topenurl_article</openurl><openurlfulltext>$$Topenurlfull_article</openurlfulltext><thumbnail>$$Tsyndetics_thumb_exl</thumbnail><linktohtml>$$Uhttps://www.sciencedirect.com/science/article/pii/S0010482520304315$$EHTML$$P50$$Gelsevier$$H</linktohtml><link.rule.ids>314,776,780,3537,27901,27902,65306</link.rule.ids><backlink>$$Uhttps://www.ncbi.nlm.nih.gov/pubmed/33171290$$D View this record in MEDLINE/PubMed$$Hfree_for_read</backlink></links><search><creatorcontrib>Ayodele, K.P.</creatorcontrib><creatorcontrib>Ogunlade, O.</creatorcontrib><creatorcontrib>Olugbon, O.J.</creatorcontrib><creatorcontrib>Akinwale, O.B.</creatorcontrib><creatorcontrib>Kehinde, L.O.</creatorcontrib><title>A medical percussion instrument using a wavelet-based method for archivable output and automatic classification</title><title>Computers in biology and medicine</title><addtitle>Comput Biol Med</addtitle><description>There is no standard instrument for carrying out medical percussion even though the procedure has been in continuous use since 1761. This study developed one such instrument. It generates medical percussion sounds in a reproducible manner and accurately classifies them into one of three classes. Percussion signals were generated using a push-pull solenoid plessor applying mechanical impulses through a polyvinyl chloride plessimeter. Signals were acquired using a National Instruments USB 6251 data acquisition card at a rate of 8.192 kHz through an air-coupled omnidirectional electret microphone located 60 mm from the impact site. Signal acquisition, processing, and classification were controlled by an NVIDIA Jetson TX2 computational device. 
A complex Morlet wavelet was selected as the base wavelet for the wavelet decomposition using the maximum wavelet energy method. It was also used to generate a scalogram suitable for manual or automatic classification. Automatic classification was achieved using a MobileNetv2 convolutional neural network with 17 inverted residual layers on the basis of 224 × 224 x 1 images generated by downsampling each scalogram. Testing was carried out using five human subjects with impulses applied at three thoracic sites each to elicit dull, resonant, and tympanic signals respectively. Classifier training utilized the Adam algorithm with a learning rate of 0.001, and first and second moments of 0.9 and 0.999 respectively for 100 epochs, with early stopping. Mean subject-specific validation and test accuracies of 95.9±1.6% and 93.8±2.3% respectively were obtained, along with cross-subject validation and test accuracies of 94.9% and 94.0% respectively. These results compare very favorably with previously-reported systems for automatic generation and classification of percussion sounds. •A system has been developed for accurate, reproducible medical percussography.•A complex Morlet base wavelet leads to an output that can be interpreted manually or automatically.•A classifier based on standard MobileNetv2 architecture was used for automatic classification.•Mean subject-specific validation accuracy of 95.9±1.6% and subject-specific test accuracy of 93.8±2.3% were achieved.•These are higher than classification accuracies reported by previous studies.</description><subject>Abdomen</subject><subject>Acoustics</subject><subject>Algorithms</subject><subject>Artificial neural networks</subject><subject>Classification</subject><subject>Computer applications</subject><subject>Convolutional neural network</subject><subject>Data acquisition</subject><subject>Energy methods</subject><subject>Impulses</subject><subject>Information processing</subject><subject>Lungs</subject><subject>Machine learning</subject><subject>Medical percussion</subject><subject>Methods</subject><subject>MobileNetV2</subject><subject>Morlet wavelet</subject><subject>Neural networks</subject><subject>Percussion</subject><subject>Percussograph</subject><subject>Polyvinyl chloride</subject><subject>Propagation</subject><subject>Scalogram</subject><subject>Signal classification</subject><subject>Signal processing</subject><subject>Solenoids</subject><subject>Sound</subject><subject>Thorax</subject><subject>Wavelet analysis</subject><subject>X-rays</subject><issn>0010-4825</issn><issn>1879-0534</issn><fulltext>true</fulltext><rsrctype>article</rsrctype><creationdate>2020</creationdate><recordtype>article</recordtype><sourceid>8G5</sourceid><sourceid>BENPR</sourceid><sourceid>GUQSH</sourceid><sourceid>M2O</sourceid><recordid>eNqFkc1u1DAURi0EokPbV0CW2LDJcO04jrMsFRSkSmxgbdnODfUoiYN_BvH2eDStKrFhZck-33evfAihDPYMmPxw2LuwbNaHBcc9B366FgzgBdkx1Q8NdK14SXYADBqheHdB3qR0AAABLbwmF23LesYH2JFwQ2uHd2amG0ZXUvJhpX5NOZYF10xL8utPauhvc8QZc2NNwrFm8kMY6RQiNdE9-KOxM9JQ8lYyNetITclhMdk76mZTS6c6ItfqK_JqMnPC68fzkvz4_On77Zfm_tvd19ub-8YJ4LlxUig3DEoZgJ53xsLkDA4crXKMAyo1KGulkBadNGpSbjI44SSxlSAH216S9-feLYZfBVPWi08O59msGErSXHSD5J1gQ0Xf_YMeQolr3a5SPROC91JWSp0pF0NKESe9Rb-Y-Ecz0Ccp-qCfpeiTFH2WUqNvHwcUe3p7Cj5ZqMDHM4D1R44eo07O4-qqmIgu6zH4_0_5CzQ6pLU</recordid><startdate>202012</startdate><enddate>202012</enddate><creator>Ayodele, K.P.</creator><creator>Ogunlade, O.</creator><creator>Olugbon, O.J.</creator><creator>Akinwale, O.B.</creator><creator>Kehinde, L.O.</creator><general>Elsevier 
Ltd</general><general>Elsevier Limited</general><scope>NPM</scope><scope>AAYXX</scope><scope>CITATION</scope><scope>3V.</scope><scope>7RV</scope><scope>7X7</scope><scope>7XB</scope><scope>88E</scope><scope>8AL</scope><scope>8AO</scope><scope>8FD</scope><scope>8FE</scope><scope>8FG</scope><scope>8FH</scope><scope>8FI</scope><scope>8FJ</scope><scope>8FK</scope><scope>8G5</scope><scope>ABUWG</scope><scope>AFKRA</scope><scope>ARAPS</scope><scope>AZQEC</scope><scope>BBNVY</scope><scope>BENPR</scope><scope>BGLVJ</scope><scope>BHPHI</scope><scope>CCPQU</scope><scope>DWQXO</scope><scope>FR3</scope><scope>FYUFA</scope><scope>GHDGH</scope><scope>GNUQQ</scope><scope>GUQSH</scope><scope>HCIFZ</scope><scope>JQ2</scope><scope>K7-</scope><scope>K9.</scope><scope>KB0</scope><scope>LK8</scope><scope>M0N</scope><scope>M0S</scope><scope>M1P</scope><scope>M2O</scope><scope>M7P</scope><scope>M7Z</scope><scope>MBDVC</scope><scope>NAPCQ</scope><scope>P5Z</scope><scope>P62</scope><scope>P64</scope><scope>PQEST</scope><scope>PQQKQ</scope><scope>PQUKI</scope><scope>PRINS</scope><scope>Q9U</scope><scope>7X8</scope></search><sort><creationdate>202012</creationdate><title>A medical percussion instrument using a wavelet-based method for archivable output and automatic classification</title><author>Ayodele, K.P. ; Ogunlade, O. ; Olugbon, O.J. ; Akinwale, O.B. ; Kehinde, L.O.</author></sort><facets><frbrtype>5</frbrtype><frbrgroupid>cdi_FETCH-LOGICAL-c402t-c648c9988a00725ab0fcae92eb8c120e8898bb646bec6a8f8cfaefef6e36069b3</frbrgroupid><rsrctype>articles</rsrctype><prefilter>articles</prefilter><language>eng</language><creationdate>2020</creationdate><topic>Abdomen</topic><topic>Acoustics</topic><topic>Algorithms</topic><topic>Artificial neural networks</topic><topic>Classification</topic><topic>Computer applications</topic><topic>Convolutional neural network</topic><topic>Data acquisition</topic><topic>Energy methods</topic><topic>Impulses</topic><topic>Information processing</topic><topic>Lungs</topic><topic>Machine learning</topic><topic>Medical percussion</topic><topic>Methods</topic><topic>MobileNetV2</topic><topic>Morlet wavelet</topic><topic>Neural networks</topic><topic>Percussion</topic><topic>Percussograph</topic><topic>Polyvinyl chloride</topic><topic>Propagation</topic><topic>Scalogram</topic><topic>Signal classification</topic><topic>Signal processing</topic><topic>Solenoids</topic><topic>Sound</topic><topic>Thorax</topic><topic>Wavelet analysis</topic><topic>X-rays</topic><toplevel>peer_reviewed</toplevel><toplevel>online_resources</toplevel><creatorcontrib>Ayodele, K.P.</creatorcontrib><creatorcontrib>Ogunlade, O.</creatorcontrib><creatorcontrib>Olugbon, O.J.</creatorcontrib><creatorcontrib>Akinwale, O.B.</creatorcontrib><creatorcontrib>Kehinde, L.O.</creatorcontrib><collection>PubMed</collection><collection>CrossRef</collection><collection>ProQuest Central (Corporate)</collection><collection>Nursing &amp; Allied Health Database</collection><collection>Health &amp; Medical Collection</collection><collection>ProQuest Central (purchase pre-March 2016)</collection><collection>Medical Database (Alumni Edition)</collection><collection>Computing Database (Alumni Edition)</collection><collection>ProQuest Pharma Collection</collection><collection>Technology Research Database</collection><collection>ProQuest SciTech Collection</collection><collection>ProQuest Technology Collection</collection><collection>ProQuest Natural Science Collection</collection><collection>Hospital Premium 
Collection</collection><collection>Hospital Premium Collection (Alumni Edition)</collection><collection>ProQuest Central (Alumni) (purchase pre-March 2016)</collection><collection>Research Library (Alumni Edition)</collection><collection>ProQuest Central (Alumni Edition)</collection><collection>ProQuest Central UK/Ireland</collection><collection>Advanced Technologies &amp; Aerospace Collection</collection><collection>ProQuest Central Essentials</collection><collection>Biological Science Collection</collection><collection>ProQuest Central</collection><collection>Technology Collection</collection><collection>Natural Science Collection</collection><collection>ProQuest One Community College</collection><collection>ProQuest Central Korea</collection><collection>Engineering Research Database</collection><collection>Health Research Premium Collection</collection><collection>Health Research Premium Collection (Alumni)</collection><collection>ProQuest Central Student</collection><collection>Research Library Prep</collection><collection>SciTech Premium Collection</collection><collection>ProQuest Computer Science Collection</collection><collection>Computer Science Database</collection><collection>ProQuest Health &amp; Medical Complete (Alumni)</collection><collection>Nursing &amp; Allied Health Database (Alumni Edition)</collection><collection>ProQuest Biological Science Collection</collection><collection>Computing Database</collection><collection>Health &amp; Medical Collection (Alumni Edition)</collection><collection>Medical Database</collection><collection>Research Library</collection><collection>Biological Science Database</collection><collection>Biochemistry Abstracts 1</collection><collection>Research Library (Corporate)</collection><collection>Nursing &amp; Allied Health Premium</collection><collection>Advanced Technologies &amp; Aerospace Database</collection><collection>ProQuest Advanced Technologies &amp; Aerospace Collection</collection><collection>Biotechnology and BioEngineering Abstracts</collection><collection>ProQuest One Academic Eastern Edition (DO NOT USE)</collection><collection>ProQuest One Academic</collection><collection>ProQuest One Academic UKI Edition</collection><collection>ProQuest Central China</collection><collection>ProQuest Central Basic</collection><collection>MEDLINE - Academic</collection><jtitle>Computers in biology and medicine</jtitle></facets><delivery><delcategory>Remote Search Resource</delcategory><fulltext>fulltext</fulltext></delivery><addata><au>Ayodele, K.P.</au><au>Ogunlade, O.</au><au>Olugbon, O.J.</au><au>Akinwale, O.B.</au><au>Kehinde, L.O.</au><format>journal</format><genre>article</genre><ristype>JOUR</ristype><atitle>A medical percussion instrument using a wavelet-based method for archivable output and automatic classification</atitle><jtitle>Computers in biology and medicine</jtitle><addtitle>Comput Biol Med</addtitle><date>2020-12</date><risdate>2020</risdate><volume>127</volume><spage>104100</spage><epage>104100</epage><pages>104100-104100</pages><artnum>104100</artnum><issn>0010-4825</issn><eissn>1879-0534</eissn><abstract>There is no standard instrument for carrying out medical percussion even though the procedure has been in continuous use since 1761. This study developed one such instrument. It generates medical percussion sounds in a reproducible manner and accurately classifies them into one of three classes. 
Percussion signals were generated using a push-pull solenoid plessor applying mechanical impulses through a polyvinyl chloride plessimeter. Signals were acquired using a National Instruments USB 6251 data acquisition card at a rate of 8.192 kHz through an air-coupled omnidirectional electret microphone located 60 mm from the impact site. Signal acquisition, processing, and classification were controlled by an NVIDIA Jetson TX2 computational device. A complex Morlet wavelet was selected as the base wavelet for the wavelet decomposition using the maximum wavelet energy method. It was also used to generate a scalogram suitable for manual or automatic classification. Automatic classification was achieved using a MobileNetv2 convolutional neural network with 17 inverted residual layers on the basis of 224 × 224 x 1 images generated by downsampling each scalogram. Testing was carried out using five human subjects with impulses applied at three thoracic sites each to elicit dull, resonant, and tympanic signals respectively. Classifier training utilized the Adam algorithm with a learning rate of 0.001, and first and second moments of 0.9 and 0.999 respectively for 100 epochs, with early stopping. Mean subject-specific validation and test accuracies of 95.9±1.6% and 93.8±2.3% respectively were obtained, along with cross-subject validation and test accuracies of 94.9% and 94.0% respectively. These results compare very favorably with previously-reported systems for automatic generation and classification of percussion sounds. •A system has been developed for accurate, reproducible medical percussography.•A complex Morlet base wavelet leads to an output that can be interpreted manually or automatically.•A classifier based on standard MobileNetv2 architecture was used for automatic classification.•Mean subject-specific validation accuracy of 95.9±1.6% and subject-specific test accuracy of 93.8±2.3% were achieved.•These are higher than classification accuracies reported by previous studies.</abstract><cop>United States</cop><pub>Elsevier Ltd</pub><pmid>33171290</pmid><doi>10.1016/j.compbiomed.2020.104100</doi><tpages>1</tpages></addata></record>
ISSN: 0010-4825
EISSN: 1879-0534
Source: Elsevier ScienceDirect Journals
Subjects:
Abdomen
Acoustics
Algorithms
Artificial neural networks
Classification
Computer applications
Convolutional neural network
Data acquisition
Energy methods
Impulses
Information processing
Lungs
Machine learning
Medical percussion
Methods
MobileNetV2
Morlet wavelet
Neural networks
Percussion
Percussograph
Polyvinyl chloride
Propagation
Scalogram
Signal classification
Signal processing
Solenoids
Sound
Thorax
Wavelet analysis
X-rays
URL: https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-02-11T08%3A51%3A22IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_cross&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=A%20medical%20percussion%20instrument%20using%20a%20wavelet-based%20method%20for%20archivable%20output%20and%20automatic%20classification&rft.jtitle=Computers%20in%20biology%20and%20medicine&rft.au=Ayodele,%20K.P.&rft.date=2020-12&rft.volume=127&rft.spage=104100&rft.epage=104100&rft.pages=104100-104100&rft.artnum=104100&rft.issn=0010-4825&rft.eissn=1879-0534&rft_id=info:doi/10.1016/j.compbiomed.2020.104100&rft_dat=%3Cproquest_cross%3E2459625419%3C/proquest_cross%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2471442766&rft_id=info:pmid/33171290&rft_els_id=S0010482520304315&rfr_iscdi=true