The motion in emotion - A CERT based approach to the FERA emotion challenge

This paper assesses the performance of measures of facial expression dynamics derived from the Computer Expression Recognition Toolbox (CERT) for classifying emotions in the Facial Expression Recognition and Analysis (FERA) Challenge. The CERT system automatically estimates facial action intensity and head position using learned appearance-based models on single frames of video. CERT outputs were used to derive a representation of the intensity and motion in each video, consisting of the extremes of displacement, velocity and acceleration. Using this representation, emotion detectors were trained on the FERA training examples. Experiments on the released portion of the FERA dataset are presented, as well as results on the blind test. No consideration of subject identity was taken into account in the blind test. The F1 scores were well above the baseline criterion for success.
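To make the representation concrete, the following is a minimal sketch of the motion-feature idea described in the abstract, assuming per-frame CERT outputs are available as a NumPy array of shape (n_frames, n_channels), one intensity trace per facial-action or head-pose channel. The function name motion_features, the use of the clip mean as the displacement reference, and plain frame differencing for velocity and acceleration are illustrative assumptions, not the authors' exact pipeline.

import numpy as np

def motion_features(signals: np.ndarray) -> np.ndarray:
    # Per-channel extremes of displacement, velocity and acceleration,
    # concatenated into one fixed-length descriptor per video.
    displacement = signals - signals.mean(axis=0)  # deviation from clip mean (assumed reference)
    velocity = np.diff(signals, n=1, axis=0)       # first temporal difference
    acceleration = np.diff(signals, n=2, axis=0)   # second temporal difference
    feats = []
    for series in (displacement, velocity, acceleration):
        feats.append(series.max(axis=0))           # per-channel maximum
        feats.append(series.min(axis=0))           # per-channel minimum
    return np.concatenate(feats)

# Example: a 120-frame clip with 20 CERT channels yields a
# 6 * 20 = 120-dimensional feature vector for the emotion detectors.
video = np.random.randn(120, 20)
features = motion_features(video)  # shape: (120,)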

Detailed description

Bibliographic details
Main authors: Littlewort, G, Whitehill, J, Ting-Fan Wu, Butko, N, Ruvolo, P, Movellan, J, Bartlett, M
Format: Conference proceedings
Language: eng
Subjects:
Online access: Order full text
container_end_page 902
container_start_page 897
creator Littlewort, G
Whitehill, J
Ting-Fan Wu
Butko, N
Ruvolo, P
Movellan, J
Bartlett, M
description This paper assesses the performance of measures of facial expression dynamics derived from the Computer Expression Recognition Toolbox (CERT) for classifying emotions in the Facial Expression Recognition and Analysis (FERA) Challenge. The CERT system automatically estimates facial action intensity and head position using learned appearance-based models on single frames of video. CERT outputs were used to derive a representation of the intensity and motion in each video, consisting of the extremes of displacement, velocity and acceleration. Using this representation, emotion detectors were trained on the FERA training examples. Experiments on the released portion of the FERA dataset are presented, as well as results on the blind test. No consideration of subject identity was taken into account in the blind test. The F1 scores were well above the baseline criterion for success.
doi_str_mv 10.1109/FG.2011.5771370
format Conference Proceeding
fulltext fulltext_linktorsrc
identifier ISBN: 1424491401
ispartof Face and Gesture 2011, 2011, p.897-902
language eng
recordid cdi_ieee_primary_5771370
source IEEE Electronic Library (IEL) Conference Proceedings
subjects Acceleration
Detectors
Emotion recognition
Face recognition
Head
Training
title The motion in emotion - A CERT based approach to the FERA emotion challenge
url https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2024-12-18T08%3A39%3A38IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-ieee_6IE&rft_val_fmt=info:ofi/fmt:kev:mtx:book&rft.genre=proceeding&rft.atitle=The%20motion%20in%20emotion%20-%20A%20CERT%20based%20approach%20to%20the%20FERA%20emotion%20challenge&rft.btitle=Face%20and%20Gesture%202011&rft.au=Littlewort,%20G&rft.date=2011-03&rft.spage=897&rft.epage=902&rft.pages=897-902&rft.isbn=1424491401&rft.isbn_list=9781424491407&rft_id=info:doi/10.1109/FG.2011.5771370&rft_dat=%3Cieee_6IE%3E5771370%3C/ieee_6IE%3E%3Curl%3E%3C/url%3E&rft.eisbn=1424491398&rft.eisbn_list=9781424491414&rft.eisbn_list=9781424491391&rft.eisbn_list=142449141X&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_id=info:pmid/&rft_ieee_id=5771370&rfr_iscdi=true