Personalised face neutralisation based on subspace bilinear regression
Expression face neutralisation helps to improve the performance of expressive face recognition with one single neutral sample in gallery per subject. For learning-based expression neutralisation, the virtual neutral face totally relies on training samples, which removes person-specific characters from the neutralised face. Bilinear kernel rank reduced regression (BKRRR) algorithm is designed in a virtual subspace to simultaneously and efficiently generate both virtual expressive and neutral images from training samples. An expression mask is then established using grey and gradient differences of the two images. The test expression image is transformed to neutral template by piece-wise affine warp (PAW). Using the virtual BKRRR neutral image as source, the PAW image as destination and the area covered by expression mask as clone area, an image fusion strategy based on Poisson equation is then designed, which achieves virtual neutralised face image with person-specific characters preserved. From experiments on the CMU Multi-PIE databases, it could be observed that the neutral faces synthesised by the proposed method could effectively approximate the real ground truth expressive faces, and greatly improve the performance of classic face recognition algorithms on expression variant problems.
Saved in:
Published in: | IET computer vision 2014-08, Vol.8 (4), p.329-337 |
Main authors: | Chen, Ying; Bai, Ruilin; Hua, Chunjian |
Format: | Article |
Language: | eng |
Subjects: | |
Online access: | Order full text |
container_end_page | 337 |
container_issue | 4 |
container_start_page | 329 |
container_title | IET computer vision |
container_volume | 8 |
creator | Chen, Ying; Bai, Ruilin; Hua, Chunjian |
description | Expression face neutralisation helps to improve the performance of expressive face recognition with one single neutral sample in gallery per subject. For learning-based expression neutralisation, the virtual neutral face totally relies on training samples, which removes person-specific characters from the neutralised face. Bilinear kernel rank reduced regression (BKRRR) algorithm is designed in a virtual subspace to simultaneously and efficiently generate both virtual expressive and neutral images from training samples. An expression mask is then established using grey and gradient differences of the two images. The test expression image is transformed to neutral template by piece-wise affine warp (PAW). Using the virtual BKRRR neutral image as source, the PAW image as destination and the area covered by expression mask as clone area, an image fusion strategy based on Poisson equation is then designed, which achieves virtual neutralised face image with person-specific characters preserved. From experiments on the CMU Multi-PIE databases, it could be observed that the neutral faces synthesised by the proposed method could effectively approximate the real ground truth expressive faces, and greatly improve the performance of classic face recognition algorithms on expression variant problems. |
doi_str_mv | 10.1049/iet-cvi.2013.0212 |
format | Article |
fullrecord | Chen, Ying; Bai, Ruilin; Hua, Chunjian. "Personalised face neutralisation based on subspace bilinear regression". IET computer vision, 2014-08, Vol.8 (4), p.329-337. Stevenage: The Institution of Engineering and Technology. DOI: 10.1049/iet-cvi.2013.0212 |
fulltext | fulltext_linktorsrc |
identifier | ISSN: 1751-9632 |
ispartof | IET computer vision, 2014-08, Vol.8 (4), p.329-337 |
issn | 1751-9632; 1751-9640 |
language | eng |
recordid | cdi_pascalfrancis_primary_28601360 |
source | Wiley Online Library Open Access |
subjects | Algorithms; Applied sciences; bilinear kernel rank reduced regression; BKRRR algorithm; CMU multiPIE databases; Computer vision; Exact sciences and technology; expressive face recognition; face recognition; gradient differences; gradient methods; grey differences; image colour analysis; image fusion; Image processing; Information, signal and communications theory; learning (artificial intelligence); learning-based expression neutralisation; Masks; Pattern recognition; PAW; Performance enhancement; personalised face neutralisation; piece-wise affine warp; Poisson equation; Regression; regression analysis; Signal processing; subspace bilinear regression; Subspaces; Telecommunications and information theory; Training; virtual neutral face |
title | Personalised face neutralisation based on subspace bilinear regression |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-18T20%3A52%3A09IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_24P&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Personalised%20face%20neutralisation%20based%20on%20subspace%20bilinear%20regression&rft.jtitle=IET%20computer%20vision&rft.au=Chen,%20Ying&rft.date=2014-08&rft.volume=8&rft.issue=4&rft.spage=329&rft.epage=337&rft.pages=329-337&rft.issn=1751-9632&rft.eissn=1751-9640&rft_id=info:doi/10.1049/iet-cvi.2013.0212&rft_dat=%3Cproquest_24P%3E3415616221%3C/proquest_24P%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=1557343548&rft_id=info:pmid/&rft_doaj_id=oai_doaj_org_article_2607e41a85eb40318aac7e0b727c0a50&rfr_iscdi=true |
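The abstract above describes building an expression mask from the grey and gradient differences between the virtual expressive and virtual neutral images. As a minimal NumPy sketch of that step only: the equal weighting of the two difference terms and the threshold value are illustrative assumptions, not values taken from the paper, and the toy images stand in for the BKRRR outputs.

```python
import numpy as np

def expression_mask(expressive, neutral, thresh=0.2):
    """Binary mask of expression-affected regions, combining grey-level and
    gradient differences between the two images (as in the abstract).
    The 0.5/0.5 weighting and thresh=0.2 are assumptions for illustration."""
    e = expressive.astype(float)
    n = neutral.astype(float)
    grey_diff = np.abs(e - n)                    # grey-level difference
    gey, gex = np.gradient(e)                    # gradients of each image
    gny, gnx = np.gradient(n)
    grad_diff = np.hypot(gex - gnx, gey - gny)   # gradient-magnitude difference
    score = 0.5 * grey_diff + 0.5 * grad_diff    # equal weighting (assumption)
    return score > thresh

# Toy stand-ins for the virtual neutral and virtual expressive images:
neutral = np.zeros((8, 8))
expressive = neutral.copy()
expressive[2:5, 2:5] = 1.0                       # simulated expression region
mask = expression_mask(expressive, neutral)
```

In the full pipeline the region flagged by such a mask would serve as the clone area for the Poisson-equation image fusion step, with the BKRRR neutral image as source and the piece-wise affine warped image as destination.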