Using automated computer vision and machine learning to code facial expressions of affect and arousal: Implications for emotion dysregulation research

As early as infancy, caregivers’ facial expressions shape children's behaviors, help them regulate their emotions, and encourage or dissuade their interpersonal agency. In childhood and adolescence, proficiencies in producing and decoding facial expressions promote social competence, whereas deficiencies characterize several forms of psychopathology. To date, however, studying facial expressions has been hampered by the labor-intensive, time-consuming nature of human coding. We describe a partial solution: automated facial expression coding (AFEC), which combines computer vision and machine learning to code facial expressions in real time. Although AFEC cannot capture the full complexity of human emotion, it codes positive affect, negative affect, and arousal—core Research Domain Criteria constructs—as accurately as humans, and it characterizes emotion dysregulation with greater specificity than other objective measures such as autonomic responding. We provide an example in which we use AFEC to evaluate emotion dynamics in mother–daughter dyads engaged in conflict. Among other findings, AFEC (a) shows convergent validity with a validated human coding scheme, (b) distinguishes among risk groups, and (c) detects developmental increases in positive dyadic affect correspondence as teen daughters age. Although more research is needed to realize the full potential of AFEC, findings demonstrate its current utility in research on emotion dysregulation.
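To make the approach concrete, the following is a minimal Python sketch, under stated assumptions, of the kind of pipeline the abstract describes: localize the face in each video frame, score the crop for positive affect, negative affect, and arousal, and then relate two people's score streams. The face detector uses OpenCV's bundled Haar cascade; score_face and dyadic_correspondence are hypothetical stand-ins for illustration only, not the authors' AFEC software or their dyadic analyses.

# Minimal sketch (assumptions noted above): frame-by-frame affect/arousal
# scoring plus a crude dyadic correspondence index.
import cv2
import numpy as np


def detect_face(gray, cascade):
    # Return the largest detected face as (x, y, w, h), or None if no face.
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None
    return max(faces, key=lambda box: box[2] * box[3])


def score_face(face_crop):
    # Hypothetical scorer returning (positive affect, negative affect, arousal).
    # A real AFEC pipeline would call a trained model here; this placeholder
    # returns neutral scores so the sketch runs end to end.
    return 0.0, 0.0, 0.0


def code_video(path):
    # Code one video; returns an (n_frames, 3) array of positive affect,
    # negative affect, and arousal, with NaN rows where no face was found.
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    cap = cv2.VideoCapture(path)
    scores = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        box = detect_face(gray, cascade)
        if box is None:
            scores.append((np.nan, np.nan, np.nan))
            continue
        x, y, w, h = box
        scores.append(score_face(gray[y:y + h, x:x + w]))
    cap.release()
    return np.asarray(scores, dtype=float)


def dyadic_correspondence(series_a, series_b):
    # Pearson correlation between two affect time series of equal length,
    # ignoring frames where either value is missing.
    a, b = np.asarray(series_a, dtype=float), np.asarray(series_b, dtype=float)
    mask = ~(np.isnan(a) | np.isnan(b))
    return float(np.corrcoef(a[mask], b[mask])[0, 1])

Running code_video on the mother's and the daughter's recordings of the same interaction and passing, say, the positive-affect columns to dyadic_correspondence yields one crude correspondence index per dyad; the article's actual validation against human coding and its developmental analyses are considerably more involved.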


Bibliographic details
Published in: Development and psychopathology, 2019-08, Vol. 31 (3), pp. 871-886
Main authors: Haines, Nathaniel; Bell, Ziv; Crowell, Sheila; Hahn, Hunter; Kamara, Dana; McDonough-Caplan, Heather; Shader, Tiffany; Beauchaine, Theodore P.
Format: Article
Language: English
Online access: Full text
DOI: 10.1017/S0954579419000312
Publisher: Cambridge University Press (New York, USA)
PMID: 30919792
ISSN: 0954-5794
EISSN: 1469-2198
Source: MEDLINE; Cambridge University Press Journals Complete
Subjects:
Adolescent
Adolescents
Affect - physiology
Arousal
Arousal - physiology
Artificial intelligence
Automation
Behavior
Child
Child development
Children
Computer vision
Emotional disorders
Emotional regulation
Emotions
Emotions - physiology
Facial Expression
Female
Humans
Learning algorithms
Machine Learning
Male
Mother-Child Relations
Psychopathology
Researchers
Risk groups
Software
Special Issue Articles