Classification of multiple emotional states from facial expressions in head-fixed mice using a deep learning-based image analysis

Facial expressions are widely recognized as universal indicators of underlying internal states in most animal species, making them a non-invasive measure for assessing physical and mental conditions. Despite the advancement of artificial-intelligence-assisted tools for the automated analysis of voluminous facial-expression data in human subjects, comparable tools for mice remain limited. Given that mice are the most prevalent model animals for studying human health and disease, a comprehensive characterization of emotion-dependent patterns of facial expressions in mice could extend our knowledge of the basis of emotions and related disorders. Here, we present a framework for developing a deep-learning-powered tool for classifying facial expressions in head-fixed mice. We demonstrate that our machine-vision system accurately classified three different emotional states from lateral facial images of head-fixed mice. Moreover, we objectively determined how our classifier characterized the differences among the facial images using an interpretation technique called Gradient-weighted Class Activation Mapping (Grad-CAM). Importantly, the classifier appears to have discriminated the images by leveraging multiple facial features. Our approach is likely to facilitate the non-invasive decoding of a variety of emotions from facial images in head-fixed mice.
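The abstract names two concrete ingredients: a deep convolutional classifier that maps a lateral face image to one of three emotional states, and Grad-CAM to visualize which facial regions drive each prediction. The sketch below illustrates how such a pipeline could be wired up in PyTorch; the backbone (ResNet-18), class names, normalization constants, and file path are illustrative assumptions, not the authors' actual implementation.

```python
# Hypothetical sketch: a 3-class facial-expression classifier plus a minimal
# Grad-CAM pass. Class names, paths, and the backbone are assumptions.
import torch
import torch.nn.functional as F
from torchvision import models, transforms

CLASSES = ["neutral", "pleasant", "aversive"]  # assumed emotional states

# Transfer learning: swap the final layer of a pretrained ResNet-18
# for a 3-way classifier head (fine-tuning on labeled images omitted).
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = torch.nn.Linear(model.fc.in_features, len(CLASSES))
model.eval()

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

def grad_cam(model, x, target_class):
    """Gradient-weighted Class Activation Mapping on the last conv block."""
    activations, gradients = [], []

    def fwd_hook(module, inp, out):
        activations.append(out)          # feature maps from the forward pass

    def bwd_hook(module, grad_in, grad_out):
        gradients.append(grad_out[0])    # gradients w.r.t. those feature maps

    layer = model.layer4                 # last convolutional block of ResNet-18
    h1 = layer.register_forward_hook(fwd_hook)
    h2 = layer.register_full_backward_hook(bwd_hook)
    try:
        scores = model(x)                    # forward pass
        scores[0, target_class].backward()   # backprop the target class score
    finally:
        h1.remove()
        h2.remove()

    acts, grads = activations[0], gradients[0]
    weights = grads.mean(dim=(2, 3), keepdim=True)  # global-average-pool grads
    cam = F.relu((weights * acts).sum(dim=1))       # weighted sum, then ReLU
    cam = cam / (cam.max() + 1e-8)                  # normalize to [0, 1]
    return cam.squeeze(0).detach()

# Usage on a single lateral face image (path is a placeholder):
#   from PIL import Image
#   x = preprocess(Image.open("mouse_face.png").convert("RGB")).unsqueeze(0)
#   pred = model(x).argmax(dim=1).item()
#   heatmap = grad_cam(model, x, pred)   # coarse map of facial regions used
```

The resulting map has the spatial resolution of the last convolutional block (7x7 for a 224x224 input) and is typically upsampled onto the input image for display, which is how a classifier can be shown to rely on multiple facial features rather than a single region.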

Bibliographic Details
Published in: PLoS ONE, 2023-07, Vol. 18 (7), p. e0288930
Authors: Tanaka, Yudai; Nakata, Takuto; Hibino, Hiroshi; Nishiyama, Masaaki; Ino, Daisuke
Format: Article
Language: English
Online access: Full text
DOI: 10.1371/journal.pone.0288930
PMID: 37471381
Publisher: Public Library of Science, United States
Publication date: 2023-07-20
Contributor: Srinivasan, Kathiravan
Rights: © 2023 Tanaka et al. Open access under the Creative Commons Attribution License (CC BY 4.0).
ORCID: https://orcid.org/0000-0002-8112-0746
ISSN: 1932-6203; EISSN: 1932-6203
Source: MEDLINE; DOAJ Directory of Open Access Journals; Elektronische Zeitschriftenbibliothek - Frei zugängliche E-Journals; Public Library of Science (PLoS) Journals Open Access; PubMed Central; Free Full-Text Journals in Chemistry
Subjects:
Analysis
Animal species
Animals
Artificial Intelligence
Biology and Life Sciences
Classification
Datasets
Deep Learning
Emotion regulation
Emotional factors
Emotions
Emotions - physiology
Face
Facial Expression
Humans
Image analysis
Image processing
Machine learning
Machine vision
Medicine and Health Sciences
Mice
Physical Examination
Social Sciences
Surgery
Vision systems
URL: https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2024-12-16T01%3A33%3A04IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-gale_plos_&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Classification%20of%20multiple%20emotional%20states%20from%20facial%20expressions%20in%20head-fixed%20mice%20using%20a%20deep%20learning-based%20image%20analysis&rft.jtitle=PloS%20one&rft.au=Tanaka,%20Yudai&rft.date=2023-07-20&rft.volume=18&rft.issue=7&rft.spage=e0288930&rft.epage=e0288930&rft.pages=e0288930-e0288930&rft.issn=1932-6203&rft.eissn=1932-6203&rft_id=info:doi/10.1371/journal.pone.0288930&rft_dat=%3Cgale_plos_%3EA757855012%3C/gale_plos_%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2840217098&rft_id=info:pmid/37471381&rft_galeid=A757855012&rfr_iscdi=true