Cost-effective real-time recognition for human emotion-age-gender using deep learning with normalized facial cropping preprocess
Published in: | Multimedia tools and applications 2021-05, Vol.80 (13), p.19845-19866 |
---|---|
Main authors: | Lu, Ta-Te; Yeh, Sheng-Cheng; Wang, Chia-Hui; Wei, Min-Rou |
Format: | Article |
Language: | English |
Subjects: | Age; Age groups; Artificial neural networks; Computer Communication Networks; Computer Science; Convolution; Data Structures and Information Theory; Deep learning; Emotion recognition; Emotions; Face recognition; Feature extraction; Gender; Multimedia Information Systems; Preprocessing; Real time; Special Purpose and Application-Based Systems; Training |
Online access: | Full text |
container_end_page | 19866 |
---|---|
container_issue | 13 |
container_start_page | 19845 |
container_title | Multimedia tools and applications |
container_volume | 80 |
creator | Lu, Ta-Te; Yeh, Sheng-Cheng; Wang, Chia-Hui; Wei, Min-Rou |
description | Owing to technological advances, human face recognition is now widely applied across many fields. Some HCI-related applications, such as camera-ready chatbots and companion robots, require gathering additional information from the user’s face. In this paper, we developed a system called EAGR for emotion, age, and gender recognition, which perceives the user’s emotion, age, and gender from face detection. The EAGR system first applies normalized facial cropping (NFC) to preprocess the training data before data augmentation, then uses a convolutional neural network (CNN) to train three models for recognizing seven emotions (six basic emotions plus neutral), four age groups, and two genders. For better emotion recognition, NFC extracts facial features with the hair removed; for better age and gender recognition, it extracts facial features with the hair retained. Experiments were conducted on these three trained models. On a testing dataset normalized for head tilt by the proposed binocular line angle correction (BLAC), the optimal mean real-time recognition accuracies for seven emotions, four age groups, and two genders were 82.4%, 74.95%, and 96.65%, respectively. Furthermore, training time can be substantially reduced via NFC preprocessing. We therefore believe the EAGR system is cost-effective in recognizing human emotions, ages, and genders. It can be further applied in social applications to help HCI services provide more accurate feedback based on pluralistic facial classifications. |
doi_str_mv | 10.1007/s11042-021-10673-x |
format | Article |
fulltext | fulltext |
identifier | ISSN: 1380-7501 |
ispartof | Multimedia tools and applications, 2021-05, Vol.80 (13), p.19845-19866 |
issn | 1380-7501; 1573-7721 |
language | eng |
recordid | cdi_proquest_journals_2530267175 |
source | SpringerLink Journals - AutoHoldings |
subjects | Age; Age groups; Artificial neural networks; Computer Communication Networks; Computer Science; Convolution; Data Structures and Information Theory; Deep learning; Emotion recognition; Emotions; Face recognition; Feature extraction; Gender; Multimedia Information Systems; Preprocessing; Real time; Special Purpose and Application-Based Systems; Training |
title | Cost-effective real-time recognition for human emotion-age-gender using deep learning with normalized facial cropping preprocess |
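The abstract describes two preprocessing steps concretely enough to sketch: BLAC levels a tilted head by rotating the image so the line between the two eye centers is horizontal, and NFC crops the face region either with or without hair before feeding it to the per-task CNNs. The following is a minimal illustrative sketch in Python with OpenCV, not the authors' released code: the eye and face-box coordinates are assumed to come from any external face detector, and the crop margins and 64x64 target size are hypothetical placeholders rather than values from the paper.

```python
# Minimal sketch of the preprocessing ideas from the abstract:
# BLAC (binocular line angle correction) and NFC (normalized facial cropping).
# Not the authors' code; detector outputs and margins are assumed/hypothetical.
import cv2
import numpy as np

def blac_align(image, left_eye, right_eye):
    """Rotate the image so the binocular (eye-to-eye) line is horizontal."""
    (lx, ly), (rx, ry) = left_eye, right_eye
    angle = np.degrees(np.arctan2(ry - ly, rx - lx))  # tilt of the eye line
    center = ((lx + rx) / 2.0, (ly + ry) / 2.0)       # pivot at the eye midpoint
    rot = cv2.getRotationMatrix2D(center, angle, 1.0)
    h, w = image.shape[:2]
    return cv2.warpAffine(image, rot, (w, h))

def nfc_crop(image, face_box, keep_hair, size=(64, 64)):
    """Crop the detected face region and resize it to a fixed CNN input size.

    keep_hair=True  (age/gender models): expand the box upward/outward so hair stays.
    keep_hair=False (emotion model): keep the tight, hair-free face box.
    The 0.1/0.4 margins are illustrative guesses, not the paper's values.
    """
    x, y, w, h = face_box
    if keep_hair:
        x = max(0, x - int(0.1 * w))
        y = max(0, y - int(0.4 * h))
        w, h = int(1.2 * w), int(1.4 * h)
    face = image[y:y + h, x:x + w]  # numpy slicing clamps at image borders
    return cv2.resize(face, size)

# Example pipeline with made-up coordinates: align first, then crop per model.
# aligned       = blac_align(frame, left_eye=(120, 180), right_eye=(200, 178))
# emotion_input = nfc_crop(aligned, (90, 140, 160, 200), keep_hair=False)
# age_input     = nfc_crop(aligned, (90, 140, 160, 200), keep_hair=True)
```

The split into two crop variants mirrors the paper's rationale: hair is a distractor for expression cues but a useful signal for age and gender, so each of the three models sees the normalization that suits its task.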