Two-dimensional video-based analysis of human gait using pose estimation

Human gait analysis is often conducted in clinical and basic research, but many common approaches (e.g., three-dimensional motion capture, wearables) are expensive, immobile, data-limited, and require expertise. Recent advances in video-based pose estimation suggest potential for gait analysis using...

Detailed description

Saved in:
Bibliographic details
Published in: PLoS computational biology 2021-04, Vol.17 (4), p.e1008935-e1008935
Main authors: Stenum, Jan, Rossi, Cristina, Roemmich, Ryan T
Format: Article
Language: eng
Subjects:
Online access: Full text
container_end_page e1008935
container_issue 4
container_start_page e1008935
container_title PLoS computational biology
container_volume 17
creator Stenum, Jan
Rossi, Cristina
Roemmich, Ryan T
description Human gait analysis is often conducted in clinical and basic research, but many common approaches (e.g., three-dimensional motion capture, wearables) are expensive, immobile, data-limited, and require expertise. Recent advances in video-based pose estimation suggest potential for gait analysis using two-dimensional video collected from readily accessible devices (e.g., smartphones). To date, several studies have extracted features of human gait using markerless pose estimation. However, we currently lack evaluation of video-based approaches using a dataset of human gait for a wide range of gait parameters on a stride-by-stride basis and a workflow for performing gait analysis from video. Here, we compared spatiotemporal and sagittal kinematic gait parameters measured with OpenPose (open-source video-based human pose estimation) against simultaneously recorded three-dimensional motion capture from overground walking of healthy adults. When assessing all individual steps in the walking bouts, we observed mean absolute errors between motion capture and OpenPose of 0.02 s for temporal gait parameters (i.e., step time, stance time, swing time and double support time) and 0.049 m for step lengths. Accuracy improved when spatiotemporal gait parameters were calculated as individual participant mean values: mean absolute error was 0.01 s for temporal gait parameters and 0.018 m for step lengths. The greatest difference in gait speed between motion capture and OpenPose was less than 0.10 m s⁻¹. Mean absolute errors of sagittal-plane hip, knee and ankle angles between motion capture and OpenPose were 4.0°, 5.6° and 7.4°. Our analysis workflow is freely available, involves minimal user input, and does not require prior gait analysis expertise. Finally, we offer suggestions and considerations for future applications of pose estimation for human gait analysis.
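The stride-by-stride comparison described in the abstract reduces to a mean absolute error over paired gait-parameter series. A minimal sketch of that computation, using hypothetical step times and step lengths (the values below are illustrative, not from the paper's dataset):

```python
import numpy as np

# Hypothetical per-step values (seconds and metres); the real motion-capture
# and OpenPose series are not reproduced here.
mocap_step_times = np.array([0.51, 0.49, 0.52, 0.50])
pose_step_times = np.array([0.53, 0.48, 0.54, 0.49])

mocap_step_lengths = np.array([0.62, 0.60, 0.63, 0.61])
pose_step_lengths = np.array([0.57, 0.65, 0.59, 0.66])

def mean_absolute_error(a, b):
    """Stride-by-stride mean absolute error between two parameter series."""
    return float(np.mean(np.abs(a - b)))

mae_time = mean_absolute_error(mocap_step_times, pose_step_times)
mae_length = mean_absolute_error(mocap_step_lengths, pose_step_lengths)
```

The same function applies to any of the reported parameters (stance time, swing time, double support time); averaging each participant's steps first, as the study does for its per-participant results, simply means calling it on mean values instead of raw series.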
doi_str_mv 10.1371/journal.pcbi.1008935
format Article
contributor Schneidman-Duhovny, Dina (editor)
publisher United States: Public Library of Science
startdate 2021-04-23
pmid 33891585
rights 2021 Stenum et al. This is an open access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
orcidid 0000-0003-0797-6455 ; 0000-0001-7883-1945 ; 0000-0002-0088-8703
oa free_for_read
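The sagittal-plane hip, knee and ankle angles reported in the abstract are derived from 2D keypoint coordinates. A minimal sketch of one common formulation, the interior angle at a joint between its two adjacent segments, using hypothetical pixel coordinates (this is an assumption about the general technique, not a reproduction of the paper's workflow):

```python
import numpy as np

def sagittal_angle(proximal, joint, distal):
    """Interior angle (degrees) at `joint` between the segments
    joint->proximal and joint->distal, from 2D keypoints."""
    u = np.asarray(proximal, float) - np.asarray(joint, float)
    v = np.asarray(distal, float) - np.asarray(joint, float)
    cos = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    # Clip guards against round-off pushing |cos| slightly above 1.
    return float(np.degrees(np.arccos(np.clip(cos, -1.0, 1.0))))

# Hypothetical (x, y) pixel coordinates for hip, knee and ankle keypoints.
hip, knee, ankle = (320, 240), (330, 340), (325, 440)
knee_angle = sagittal_angle(hip, knee, ankle)  # near 180° for a straight leg
```

Applied frame by frame to a pose-estimation output, this yields a joint-angle time series that can be compared against motion capture with the same mean-absolute-error measure used for the spatiotemporal parameters.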
fulltext fulltext
identifier ISSN: 1553-7358
ispartof PLoS computational biology, 2021-04, Vol.17 (4), p.e1008935-e1008935
issn 1553-7358
1553-734X
eissn 1553-7358
language eng
recordid cdi_plos_journals_2528201520
source DOAJ Directory of Open Access Journals; Public Library of Science (PLoS) Journals Open Access; EZB-FREE-00999 freely available EZB journals; PubMed Central
subjects Algorithms
Analysis
Automation
Biology and Life Sciences
Computer graphics
Datasets
Digital video
Engineering and Technology
Error analysis
Frames (data processing)
Gait
Gait recognition
Heels
Human locomotion
Human motion
Human performance
Kinematics
Laboratories
Medicine and Health Sciences
Methods
Motion capture
Parameters
Pose estimation
Research and Analysis Methods
Software
Three dimensional motion
Two dimensional analysis
Workflow
title Two-dimensional video-based analysis of human gait using pose estimation
url https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2024-12-22T10%3A43%3A20IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-gale_plos_&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Two-dimensional%20video-based%20analysis%20of%20human%20gait%20using%20pose%20estimation&rft.jtitle=PLoS%20computational%20biology&rft.au=Stenum,%20Jan&rft.date=2021-04-23&rft.volume=17&rft.issue=4&rft.spage=e1008935&rft.epage=e1008935&rft.pages=e1008935-e1008935&rft.issn=1553-7358&rft.eissn=1553-7358&rft_id=info:doi/10.1371/journal.pcbi.1008935&rft_dat=%3Cgale_plos_%3EA660614929%3C/gale_plos_%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2528201520&rft_id=info:pmid/33891585&rft_galeid=A660614929&rft_doaj_id=oai_doaj_org_article_9dedf2adcdc34bd1b410d95000e23c95&rfr_iscdi=true