Neural correlates of facial expression processing during a detection task: An ERP study
Given finite attentional resources, the extent to which the emotional aspects of stimuli are processed automatically remains controversial. The present study examined the time course of automatic facial-expression processing by assessing the N170 and late positive potential (LPP) components of event-related potentials (ERPs) in a modified rapid serial visual presentation (RSVP) paradigm. Observers were required to confirm a specific house image and to detect whether a face image was presented at the end of a series of pictures. There was no significant main effect of emotion type on P1 amplitude, whereas happy and fearful expressions elicited larger N170 amplitudes than neutral expressions. Significantly different LPP amplitudes were elicited depending on the type of emotional facial expression (fear > happy > neutral). These results indicate that a processing priority for threat was absent, but that expressive faces were discriminated from neutral faces in this implicit emotional task at approximately 250 ms post-stimulus. Moreover, the three types of expression were discriminated during the later stages of processing. The encoding of emotional information in faces can thus be automated to a relatively high degree, even when attentional resources are mostly allocated to superficial analysis.
Saved in:
Published in: | PloS one 2017-03, Vol.12 (3), p.e0174016-e0174016 |
---|---|
Main authors: | Sun, Luxi; Ren, Jie; He, Weijie |
Format: | Article |
Language: | English |
Subjects: | |
Online access: | Full text |
container_end_page | e0174016 |
---|---|
container_issue | 3 |
container_start_page | e0174016 |
container_title | PloS one |
container_volume | 12 |
creator | Sun, Luxi Ren, Jie He, Weijie |
description | Given finite attentional resources, the extent to which the emotional aspects of stimuli are processed automatically remains controversial. The present study examined the time course of automatic facial-expression processing by assessing the N170 and late positive potential (LPP) components of event-related potentials (ERPs) in a modified rapid serial visual presentation (RSVP) paradigm. Observers were required to confirm a specific house image and to detect whether a face image was presented at the end of a series of pictures. There was no significant main effect of emotion type on P1 amplitude, whereas happy and fearful expressions elicited larger N170 amplitudes than neutral expressions. Significantly different LPP amplitudes were elicited depending on the type of emotional facial expression (fear > happy > neutral). These results indicate that a processing priority for threat was absent, but that expressive faces were discriminated from neutral faces in this implicit emotional task at approximately 250 ms post-stimulus. Moreover, the three types of expression were discriminated during the later stages of processing. The encoding of emotional information in faces can thus be automated to a relatively high degree, even when attentional resources are mostly allocated to superficial analysis. |
doi_str_mv | 10.1371/journal.pone.0174016 |
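The doi_str_mv field carries the article's DOI, which resolves to the publisher's landing page through the standard https://doi.org/ proxy. A minimal sketch of building that resolver URL from the record's DOI string (the helper name is illustrative, not part of the record):

```python
def doi_to_url(doi: str) -> str:
    """Return the canonical resolver URL for a DOI string.

    Assumes the standard https://doi.org/ proxy; the function name
    is a hypothetical helper, not part of any catalog API.
    """
    return "https://doi.org/" + doi.strip()

print(doi_to_url("10.1371/journal.pone.0174016"))
# https://doi.org/10.1371/journal.pone.0174016
```

An HTTP GET on that URL follows a redirect to the article page; services such as Crossref additionally support content negotiation on the same URL to return structured metadata instead of HTML.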
format | Article |
fulltext | fulltext |
identifier | ISSN: 1932-6203 |
ispartof | PloS one, 2017-03, Vol.12 (3), p.e0174016-e0174016 |
issn | 1932-6203 1932-6203 |
language | eng |
recordid | cdi_plos_journals_1882248182 |
source | MEDLINE; DOAJ Directory of Open Access Journals; Public Library of Science (PLoS) Journals Open Access; EZB-FREE-00999 freely available EZB journals; PubMed Central; Free Full-Text Journals in Chemistry |
subjects | Activation Amplitudes Analysis Analysis of Variance Anxiety Attention Auditory evoked potentials Automation Biology and Life Sciences Brain Business administration Classification Cognition Cognitive ability Colleges & universities Computer and Information Sciences Cortex (occipital) Cortex (temporal) Decoding Discrimination EEG Electroencephalography - methods Electrooculography - methods Electrophysiology Emotional behavior Emotions Emotions - physiology Engineering Event-related potentials Evoked potentials Evoked Potentials - physiology Extracellular signal-regulated kinase Face Facial Expression Fear Female Happiness Hemispheric laterality Human behavior Humans Laboratories Magnetoencephalography Male Medical imaging Medicine and Health Sciences Mental disorders Mental health Nervous system Neuroimaging Neurology Neurosciences Perception Perceptions Photic Stimulation Prefrontal cortex Priming Psychology Psychomotor Performance - physiology Rapid serial visual presentation Reaction Time - physiology Research and Analysis Methods Separation Social Sciences Temporal lobe Visual cortex Visual discrimination Visual perception Young Adult |
title | Neural correlates of facial expression processing during a detection task: An ERP study |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-04T00%3A26%3A10IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-gale_plos_&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Neural%20correlates%20of%20facial%20expression%20processing%20during%20a%20detection%20task:%20An%20ERP%20study&rft.jtitle=PloS%20one&rft.au=Sun,%20Luxi&rft.date=2017-03-28&rft.volume=12&rft.issue=3&rft.spage=e0174016&rft.epage=e0174016&rft.pages=e0174016-e0174016&rft.issn=1932-6203&rft.eissn=1932-6203&rft_id=info:doi/10.1371/journal.pone.0174016&rft_dat=%3Cgale_plos_%3EA487530203%3C/gale_plos_%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=1882248182&rft_id=info:pmid/28350800&rft_galeid=A487530203&rft_doaj_id=oai_doaj_org_article_f138a4f67bf24a1da6e27852e91a5a62&rfr_iscdi=true |
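The url field is an OpenURL (ANSI/NISO Z39.88-2004) context object: the rft.* key-value pairs in its query string carry the citation metadata (article title, journal, volume, pages). A minimal sketch, using Python's standard urllib.parse, of extracting those fields; the sample URL below is abbreviated from the record's url field, and the host is a placeholder:

```python
from urllib.parse import urlsplit, parse_qs

# Abbreviated OpenURL in the style of the record's url field (sample only;
# the real link target is the institution's SFX resolver).
openurl = (
    "https://sfx.example.org/resolver?url_ver=Z39.88-2004"
    "&rft.genre=article"
    "&rft.atitle=Neural%20correlates%20of%20facial%20expression%20processing"
    "&rft.jtitle=PloS%20one&rft.date=2017-03-28"
    "&rft.volume=12&rft.issue=3&rft.issn=1932-6203"
    "&rft_id=info:doi/10.1371/journal.pone.0174016"
)

# parse_qs decodes percent-escapes and groups repeated keys into lists.
fields = parse_qs(urlsplit(openurl).query)

# Keep only the referent (rft.) keys, flattening the single-value lists.
citation = {k.split(".", 1)[1]: v[0]
            for k, v in fields.items() if k.startswith("rft.")}

print(citation["jtitle"], citation["volume"], citation["issue"])
# PloS one 12 3
```

Note that rft_id (the DOI identifier) uses an underscore rather than a dot, so it is deliberately excluded from the referent dictionary here; a fuller parser would handle it separately.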