A three-dimensional virtual mouse generates synthetic training data for behavioral analysis

We developed a three-dimensional (3D) synthetic animated mouse based on computed tomography scans that is actuated using animation and semirandom, joint-constrained movements to generate synthetic behavioral data with ground-truth label locations. Image-domain translation produced realistic synthetic videos used to train two-dimensional (2D) and 3D pose estimation models with accuracy similar to typical manual training datasets. The outputs from the 3D model-based pose estimation yielded better definition of behavioral clusters than 2D videos and may facilitate automated ethological classification. Bolaños et al. present a realistic three-dimensional virtual mouse model that can be animated and that facilitates the training of pose estimation algorithms.
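
The abstract describes a pipeline in which a rigged 3D mouse model is posed by semirandom, joint-constrained movements and rendered so that every frame carries exact keypoint labels. As a rough, hypothetical illustration of that idea only (not the authors' CT-based animation rig), the Python sketch below samples joint angles within made-up anatomical limits for a toy planar kinematic chain and records the resulting ground-truth keypoints; the segment lengths, joint limits, and function names are all assumptions for illustration.

```python
# Toy illustration of "semirandom, joint-constrained" pose sampling for
# synthetic ground-truth keypoints. This is NOT the published pipeline
# (which poses a CT-based 3D mouse model in an animation package); it only
# shows the general idea of drawing joint angles within limits and
# recording the resulting keypoint coordinates as labels.
import numpy as np

# Hypothetical planar kinematic chain: segment lengths (cm) and joint
# angle limits (radians), chosen arbitrarily for illustration.
SEGMENT_LENGTHS = [3.0, 2.0, 1.5]
JOINT_LIMITS = [(-0.6, 0.6), (-1.2, 0.4), (-0.8, 0.8)]

def sample_pose(rng):
    """Draw one joint-constrained pose and return its 2D keypoints."""
    angles = [rng.uniform(lo, hi) for lo, hi in JOINT_LIMITS]
    points = [np.zeros(2)]          # chain root at the origin
    heading = 0.0
    for length, angle in zip(SEGMENT_LENGTHS, angles):
        heading += angle            # accumulate joint rotations
        step = length * np.array([np.cos(heading), np.sin(heading)])
        points.append(points[-1] + step)
    return np.stack(points)         # (n_joints + 1, 2) ground-truth labels

def make_dataset(n_samples=10_000, seed=0):
    """Generate an array of synthetic poses with exact keypoint labels."""
    rng = np.random.default_rng(seed)
    return np.stack([sample_pose(rng) for _ in range(n_samples)])

if __name__ == "__main__":
    keypoints = make_dataset(n_samples=5)
    print(keypoints.shape)          # (5, 4, 2)
```

In the published workflow, such labels would accompany rendered frames, and image-domain translation would then make the renders resemble real video before the 2D and 3D pose estimators are trained.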

Bibliographic Details
Published in: Nature methods, 2021-04, Vol. 18 (4), p. 378-381
Main authors: Bolaños, Luis A.; Xiao, Dongsheng; Ford, Nancy L.; LeDue, Jeff M.; Gupta, Pankaj K.; Doebeli, Carlos; Hu, Hao; Rhodin, Helge; Murphy, Timothy H.
Format: Article
Language: English
Subjects: see the subject list below
Online access: Full text
DOI: 10.1038/s41592-021-01103-9
PMID: 33820989
Publisher: Nature Publishing Group US (New York)
ISSN: 1548-7091
EISSN: 1548-7105
Source: MEDLINE; Nature; Alma/SFX Local Collection
Subjects:
631/114/1564
631/1647/2198
631/1647/334/1874/345
631/1647/794
631/378/2632
Algorithms
Animal models
Animal models in research
Animals
Animation
Behavior
Behavior, Animal
Behavioral assessment
Bioinformatics
Biological Microscopy
Biological Techniques
Biomedical and Life Sciences
Biomedical Engineering/Biotechnology
Brief Communication
Cameras
Computed tomography
Computer simulation
CT imaging
Female
Imaging, Three-Dimensional - methods
Life Sciences
Lighting
Machine Learning
Mechanical properties
Methods
Mice
Mice, Inbred C57BL
Model accuracy
Mouse devices
Noise
Proteomics
Reinforcement learning (Machine learning)
Skin
Technology application
Three dimensional models
Training
Two dimensional models
Video
Virtual reality