User emotion prediction method and device

The invention provides a user emotion prediction method and device. The method comprises the following steps: receiving interactive context information; obtaining emotion characteristics; extracting keywords from the interactive context information; obtaining the respective frequencies and/or weights of the keywords; and obtaining the user's emotional tendency according to those frequencies and/or weights.

Detailed description

Bibliographic details
Main author: CONG YUNDAN
Format: Patent
Language: chi ; eng
Subjects:
creator CONG YUNDAN
description The invention provides a user emotion prediction method and device. The method comprises the following steps: receiving interactive context information; obtaining emotion characteristics; extracting keywords from the interactive context information; obtaining the respective frequencies and/or weights of the keywords; and obtaining the user's emotional tendency according to those frequencies and/or weights. In each embodiment of the invention, the interactive context information comprises text information, image information, log information, video information, voice information, and the like. According to the user emotion prediction method and device, the user's emotion is predicted by analyzing the interaction context.
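The claimed pipeline (extract keywords, count their frequencies, apply weights, derive an emotional tendency) can be illustrated with a minimal sketch. This is not the patent's implementation: the patent does not disclose its emotion lexicon or scoring function, so the `EMOTION_WEIGHTS` table and the sign-of-score decision rule below are illustrative assumptions.

```python
import re
from collections import Counter

# Hypothetical keyword-to-weight lexicon; the patent does not specify
# its emotion vocabulary, so these entries are illustrative only.
EMOTION_WEIGHTS = {
    "great": 1.0,
    "happy": 1.0,
    "fine": 0.5,
    "slow": -0.5,
    "broken": -1.0,
    "angry": -1.0,
}

def predict_emotion(context_text: str) -> str:
    """Score interactive context text as frequency x weight per keyword,
    then map the total score to a coarse emotional tendency."""
    # Extract candidate keywords (lowercased word tokens).
    words = re.findall(r"[a-z']+", context_text.lower())
    # Count frequencies of the keywords that appear in the lexicon.
    freq = Counter(w for w in words if w in EMOTION_WEIGHTS)
    # Combine frequency and weight into a single tendency score.
    score = sum(EMOTION_WEIGHTS[w] * n for w, n in freq.items())
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

print(predict_emotion("The app is broken and I am angry, angry!"))  # negative
print(predict_emotion("Great session, I am happy with it"))         # positive
```

A real system following the abstract would also fuse non-text context (image, log, video, voice) into the emotion characteristics; the sketch covers only the text-keyword path.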
format Patent
fulltext fulltext_linktorsrc
language chi ; eng
recordid cdi_epo_espacenet_CN109933782A
source esp@cenet
subjects CALCULATING
COMPUTING
COUNTING
ELECTRIC DIGITAL DATA PROCESSING
PHYSICS
title User emotion prediction method and device