Thermal and acoustic environment comfort evaluation method based on facial micro-expression recognition
The invention discloses a thermal and acoustic environment comfort evaluation method based on facial micro-expression recognition, and relates to the technical field of facial expression recognition. In this method, a visual-feature and facial-feature-point position-feature extraction module is designed within the model: a convolutional neural network built from a series of grouped convolutions extracts facial visual features, and a feature-point extraction network composed of fully connected layers extracts the facial feature points.
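The abstract describes the visual branch as a CNN built from a series of grouped convolutions. As a rough, non-authoritative illustration, the PyTorch sketch below shows what such a grouped-convolution feature extractor could look like; the layer widths, group counts, 112x112 input size, and 128-dimensional output are assumptions made for the example, not values disclosed in the patent.

```python
# Hypothetical sketch: channel counts, group counts, and input size are assumed
# for illustration; the patent abstract does not publish these details.
import torch
import torch.nn as nn

class GroupedConvVisualExtractor(nn.Module):
    """Small CNN built from grouped convolutions that maps a face crop to a
    visual feature vector, as the abstract describes in outline."""

    def __init__(self, feature_dim: int = 128):
        super().__init__()
        self.backbone = nn.Sequential(
            # Plain convolution first so the later grouped layers have channels to split.
            nn.Conv2d(3, 32, kernel_size=3, stride=2, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1, groups=4),
            nn.ReLU(inplace=True),
            nn.Conv2d(64, 128, kernel_size=3, stride=2, padding=1, groups=8),
            nn.ReLU(inplace=True),
            nn.AdaptiveAvgPool2d(1),            # global average pool -> (B, 128, 1, 1)
        )
        self.proj = nn.Linear(128, feature_dim)

    def forward(self, face: torch.Tensor) -> torch.Tensor:
        x = self.backbone(face).flatten(1)      # (B, 128)
        return self.proj(x)                     # (B, feature_dim)

if __name__ == "__main__":
    model = GroupedConvVisualExtractor()
    faces = torch.randn(2, 3, 112, 112)         # two assumed 112x112 RGB face crops
    print(model(faces).shape)                   # torch.Size([2, 128])
```

Grouped convolutions split the channels into independent groups, so each filter sees only a fraction of the input channels; this reduces parameters and compute relative to a standard convolution of the same width, which is presumably why a series of them is used for the visual branch.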
Saved in:
Main authors: | WANG CHENZHI ; XIA LIYANG ; ZHOU LIJIAN ; LIU JIAQI ; SUN JIE ; ZHANG ZHANWANG |
---|---|
Format: | Patent |
Language: | chi ; eng |
Subjects: | CALCULATING ; COMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS ; COMPUTING ; COUNTING ; PHYSICS |
Online access: | Order full text |
creationdate | 2024-04-16 |
---|---|
creator | WANG CHENZHI ; XIA LIYANG ; ZHOU LIJIAN ; LIU JIAQI ; SUN JIE ; ZHANG ZHANWANG |
description | The invention discloses a thermal and acoustic environment comfort evaluation method based on facial micro-expression recognition, and relates to the technical field of facial expression recognition. In this method, a visual-feature and facial-feature-point position-feature extraction module is designed within the model: a convolutional neural network built from a series of grouped convolutions extracts facial visual features, and a feature-point extraction network composed of fully connected layers extracts the facial feature points. Multi-source information fusion is then carried out on the extracted data; a feature-level fusion scheme is adopted to build an expression recognition model that fuses visual and position features, concatenating the visual features with the feature-point position features to form the final features for expression recognition. The thermal and acoustic environment comfort state is then evaluated from the recognized facial micro-expressions (see the feature-fusion sketch after the record fields). |
format | Patent |
fulltext | fulltext_linktorsrc |
language | chi ; eng |
recordid | cdi_epo_espacenet_CN117894054A |
source | esp@cenet |
subjects | CALCULATING ; COMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS ; COMPUTING ; COUNTING ; PHYSICS |
title | Thermal and acoustic environment comfort evaluation method based on facial micro-expression recognition |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-02-03T01%3A54%3A32IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-epo_EVB&rft_val_fmt=info:ofi/fmt:kev:mtx:patent&rft.genre=patent&rft.au=WANG%20CHENZHI&rft.date=2024-04-16&rft_id=info:doi/&rft_dat=%3Cepo_EVB%3ECN117894054A%3C/epo_EVB%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_id=info:pmid/&rfr_iscdi=true |
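The description states that the facial feature points come from a fully connected network and are fused with the visual features at feature level, i.e. by concatenation, before expression recognition, which in turn drives the comfort evaluation. The sketch below illustrates that fusion step under the same assumptions as the earlier extractor sketch; the landmark count (68), hidden sizes, and three-way comfort labels are hypothetical choices, not details disclosed in the abstract.

```python
# Hypothetical sketch: landmark count, layer sizes, and comfort classes are
# illustrative assumptions; only the concatenation-based feature-level fusion
# is taken from the patent abstract.
import torch
import torch.nn as nn

class LandmarkPositionExtractor(nn.Module):
    """Fully connected network that regresses 2-D feature-point coordinates
    from a flattened face crop; the coordinates double as position features."""

    def __init__(self, input_dim: int = 3 * 112 * 112, num_landmarks: int = 68):
        super().__init__()
        self.net = nn.Sequential(
            nn.Flatten(),
            nn.Linear(input_dim, 256),
            nn.ReLU(inplace=True),
            nn.Linear(256, num_landmarks * 2),  # (x, y) per feature point
        )

    def forward(self, face: torch.Tensor) -> torch.Tensor:
        return self.net(face)                   # (B, num_landmarks * 2)


class FusionComfortClassifier(nn.Module):
    """Feature-level fusion head: concatenates visual and feature-point position
    features, then predicts a discrete thermal/acoustic comfort state."""

    def __init__(self, visual_dim: int = 128, landmark_dim: int = 68 * 2,
                 num_comfort_states: int = 3):
        super().__init__()
        self.head = nn.Sequential(
            nn.Linear(visual_dim + landmark_dim, 64),
            nn.ReLU(inplace=True),
            # Assumed labels, e.g. uncomfortable / neutral / comfortable.
            nn.Linear(64, num_comfort_states),
        )

    def forward(self, visual_feat: torch.Tensor, position_feat: torch.Tensor) -> torch.Tensor:
        fused = torch.cat([visual_feat, position_feat], dim=1)  # feature-level fusion
        return self.head(fused)


if __name__ == "__main__":
    face = torch.randn(2, 3, 112, 112)                # two assumed 112x112 RGB face crops
    visual_feat = torch.randn(2, 128)                  # stands in for the grouped-conv extractor output
    position_feat = LandmarkPositionExtractor()(face)  # (2, 136)
    logits = FusionComfortClassifier()(visual_feat, position_feat)
    print(logits.shape)                                # torch.Size([2, 3])
```

In this reading, the comfort evaluation reduces to a classification over the fused feature vector; how the patent actually maps recognized micro-expressions to thermal and acoustic comfort scores is not specified in the abstract, so the three-state head above is only a placeholder.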