Multi-Modal Pain Intensity Recognition Based on the SenseEmotion Database
The subjective nature of pain makes it a very challenging phenomenon to assess. Most of the current pain assessment approaches rely on an individual's ability to recognise and report an observed pain episode. However, pain perception and expression are affected by numerous factors ranging from...
Saved in:
Published in: | IEEE transactions on affective computing 2021-07, Vol.12 (3), p.743-760 |
---|---|
Main authors: | Thiam, Patrick; Kessler, Viktor; Amirian, Mohammadreza; Bellmann, Peter; Layher, Georg; Zhang, Yan; Velana, Maria; Gruss, Sascha; Walter, Steffen; Traue, Harald C.; Schork, Daniel; Kim, Jonghwa; Andre, Elisabeth; Neumann, Heiko; Schwenker, Friedhelm |
Format: | Article |
Language: | eng |
Subjects: | |
Online access: | Order full text |
container_end_page | 760 |
---|---|
container_issue | 3 |
container_start_page | 743 |
container_title | IEEE transactions on affective computing |
container_volume | 12 |
creator | Thiam, Patrick; Kessler, Viktor; Amirian, Mohammadreza; Bellmann, Peter; Layher, Georg; Zhang, Yan; Velana, Maria; Gruss, Sascha; Walter, Steffen; Traue, Harald C.; Schork, Daniel; Kim, Jonghwa; Andre, Elisabeth; Neumann, Heiko; Schwenker, Friedhelm |
description | The subjective nature of pain makes it a very challenging phenomenon to assess. Most of the current pain assessment approaches rely on an individual's ability to recognise and report an observed pain episode. However, pain perception and expression are affected by numerous factors ranging from personality traits to physical and psychological health state. Hence, several approaches have been proposed for the automatic recognition of pain intensity, based on measurable physiological and audiovisual parameters. In the current paper, an assessment of several fusion architectures for the development of a multi-modal pain intensity classification system is performed. The contribution of the presented work is two-fold: (1) 3 distinctive modalities consisting of audio, video and physiological channels are assessed and combined for the classification of several levels of pain elicitation. (2) An extensive assessment of several fusion strategies is carried out in order to design a classification architecture that improves the performance of the pain recognition system. The assessment is based on the SenseEmotion Database and experimental validation demonstrates the relevance of the multi-modal classification approach, which achieves classification rates of respectively 83.39%, 59.53% and 43.89% in a 2-class, 3-class and 4-class pain intensity classification task. |
doi_str_mv | 10.1109/TAFFC.2019.2892090 |
format | Article |
fulltext | fulltext_linktorsrc |
identifier | ISSN: 1949-3045 |
ispartof | IEEE transactions on affective computing, 2021-07, Vol.12 (3), p.743-760 |
issn | 1949-3045 1949-3045 |
language | eng |
recordid | cdi_proquest_journals_2568777827 |
source | IEEE Electronic Library (IEL) |
subjects | Classification; Computer architecture; Electromyography; Feature extraction; multi-modal information fusion; multiple classifier systems; Pain; Pain intensity recognition; Physiology; Recognition; Reliability; signal processing; Video data |
title | Multi-Modal Pain Intensity Recognition Based on the SenseEmotion Database |
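The abstract describes combining audio, video and physiological channels and assessing several fusion strategies for pain intensity classification. The snippet below is only a minimal sketch of one such general idea, decision-level (late) fusion by averaging per-modality classifier probabilities; the synthetic features, the random-forest classifiers and the averaging rule are illustrative assumptions, not the fusion architectures evaluated in the paper.

```python
# Minimal sketch of decision-level (late) fusion across three modalities.
# NOT the authors' architecture: data, feature sizes, classifier choices and
# the probability-averaging rule are placeholders for illustration only.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(seed=0)
n_samples = 200

# Placeholder feature matrices for the audio, video and physiological channels,
# plus binary labels standing in for a 2-class (no pain vs. pain) task.
features = {
    "audio": rng.normal(size=(n_samples, 30)),
    "video": rng.normal(size=(n_samples, 50)),
    "physio": rng.normal(size=(n_samples, 20)),
}
labels = rng.integers(0, 2, size=n_samples)

# Train one classifier per modality on its own feature set.
models = {
    name: RandomForestClassifier(n_estimators=100, random_state=0).fit(X, labels)
    for name, X in features.items()
}

# Late fusion: average the per-modality class-probability estimates and take
# the arg-max as the fused decision.
fused_probs = np.mean(
    [models[name].predict_proba(X) for name, X in features.items()], axis=0
)
fused_pred = fused_probs.argmax(axis=1)
print("Fused (training) accuracy:", (fused_pred == labels).mean())
```

Averaging class probabilities is just one possible combination rule; the paper assesses several fusion strategies on the SenseEmotion Database, so this sketch should be read only as an illustration of where per-modality outputs could be combined.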