Automatic Facial Feature Correspondence Based on Pose Estimation

Establishing facial feature correspondence across images of the same subject under different poses is an essential issue in the field of face image interpretation. Traditional approaches involve tedious landmark labeling and a time-consuming training process. Instead, this paper proposes an automatic facial feature correspondence method, which achieves accurate feature correspondence via sparse feature matching using unary and geometric constraints, and intelligent interpolation based on facial pose angle estimation. Experiments show the validity of the proposed method.

Detailed Description

Bibliographic Details
Authors: Ying Chen, Chunjian Hua
Format: Conference Proceeding
Language: English
container_end_page 156
container_issue
container_start_page 153
container_title
container_volume 3
creator Ying Chen
Chunjian Hua
description Establishing facial feature correspondence across images of the same subject under different poses is an essential issue in the field of face image interpretation. Traditional approaches involve tedious landmark labeling and a time-consuming training process. Instead, this paper proposes an automatic facial feature correspondence method, which achieves accurate feature correspondence via sparse feature matching using unary and geometric constraints, and intelligent interpolation based on facial pose angle estimation. Experiments show the validity of the proposed method.
doi_str_mv 10.1109/ETCS.2010.614
format Conference Proceeding
fulltext fulltext_linktorsrc
identifier ISBN: 1424463882; ISBN: 9781424463886; EISBN: 1424463890; EISBN: 9781424463893
ispartof 2010 Second International Workshop on Education Technology and Computer Science, 2010, Vol.3, p.153-156
issn
language eng
recordid cdi_ieee_primary_5459740
source IEEE Electronic Library (IEL) Conference Proceedings
subjects Active shape model
Computer vision
Detectors
Eyes
Face detection
Facial Feature annotation
Facial features
Feature extraction
graph matching
Interpolation
Mouth
Nose
title Automatic Facial Feature Correspondence Based on Pose Estimation
url https://ieeexplore.ieee.org/document/5459740