NEURAL NETWORK BASED FACIAL ANALYSIS USING FACIAL LANDMARKS AND ASSOCIATED CONFIDENCE VALUES

Systems and methods for more accurate and robust determination of subject characteristics from an image of the subject. One or more machine learning models receive as input an image of a subject, and output both facial landmarks and associated confidence values. Confidence values represent the degrees to which portions of the subject's face corresponding to those landmarks are occluded, i.e., the amount of uncertainty in the position of each landmark location. These landmark points and their associated confidence values, and/or associated information, may then be input to another set of one or more machine learning models which may output any facial analysis quantity or quantities, such as the subject's gaze direction, head pose, drowsiness state, cognitive load, or distraction state.
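
The abstract describes a two-stage pipeline: a first model predicts facial landmarks together with per-landmark confidence values (low confidence indicating occlusion or positional uncertainty), and a second model consumes those landmarks and confidences to estimate quantities such as gaze direction or head pose. The sketch below illustrates that structure in PyTorch; the module names (LandmarkNet, FacialAnalysisNet), layer sizes, 68-landmark layout, and the [0, 1] confidence range are illustrative assumptions, not details taken from the patent.

```python
# Illustrative sketch only: names, shapes, and layer choices are hypothetical.
import torch
import torch.nn as nn

class LandmarkNet(nn.Module):
    """Stage 1 (sketch): predicts 2D facial landmarks plus a per-landmark
    confidence reflecting how occluded/uncertain each landmark is."""
    def __init__(self, num_landmarks: int = 68, feat_dim: int = 128):
        super().__init__()
        self.num_landmarks = num_landmarks
        self.backbone = nn.Sequential(                      # stand-in for a CNN backbone
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, feat_dim), nn.ReLU(),
        )
        self.landmark_head = nn.Linear(feat_dim, num_landmarks * 2)   # (x, y) per landmark
        self.confidence_head = nn.Linear(feat_dim, num_landmarks)     # one confidence per landmark

    def forward(self, image):
        feat = self.backbone(image)
        landmarks = self.landmark_head(feat).view(-1, self.num_landmarks, 2)
        # Sigmoid keeps confidences in [0, 1]; low values ~ occluded/uncertain landmarks.
        confidence = torch.sigmoid(self.confidence_head(feat))
        return landmarks, confidence

class FacialAnalysisNet(nn.Module):
    """Stage 2 (sketch): consumes landmarks and confidences and predicts an
    analysis quantity, e.g. a 3-DoF head pose or gaze direction."""
    def __init__(self, num_landmarks: int = 68, out_dim: int = 3):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(num_landmarks * 3, 256), nn.ReLU(),   # (x, y, confidence) per landmark
            nn.Linear(256, out_dim),
        )

    def forward(self, landmarks, confidence):
        x = torch.cat([landmarks, confidence.unsqueeze(-1)], dim=-1).flatten(1)
        return self.mlp(x)

# Usage sketch on a dummy image batch.
image = torch.randn(1, 3, 224, 224)
landmarks, confidence = LandmarkNet()(image)
head_pose = FacialAnalysisNet()(landmarks, confidence)     # e.g. yaw/pitch/roll estimate
print(head_pose.shape)                                      # torch.Size([1, 3])
```

In this reading, feeding the confidences alongside the landmark coordinates is what lets the second stage discount occluded landmarks when estimating gaze, head pose, drowsiness, cognitive load, or distraction.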

Bibliographic Details
Main authors: Molchanov, Pavlo; Yadawadkar, Sujay; Avadhanam, Niranjan; Puri, Nishant; Sah, Shagan; Shetty, Rajath; Arar, Nuri Murat
Format: Patent (publication US2021182625A1, 2021-06-17)
Language: English
Subjects: CALCULATING; COMPUTING; COUNTING; PHYSICS
Online access: Full text via esp@cenet, https://worldwide.espacenet.com/publicationDetails/biblio?FT=D&date=20210617&DB=EPODOC&CC=US&NR=2021182625A1