SYSTEM AND METHOD FOR EXTRACTING HIDDEN CUES IN INTERACTIVE COMMUNICATIONS

Disclosed herein are system, method, and computer program product embodiments for machine learning systems to process interactive communications between at least two participants. Speech and text, within the interactive communications, are analyzed using machine learning classifiers to extract prosodic, semantic and key phrase cues located within the interactive communications to identify changes to emotion, sentiments and key phrases. A summary of the interactive communications between a first participant and a second participant is generated at least in part based on the extracted prosodic, semantic and key phrase cues, and the summary is highlighted based on any of the changes to emotion, the sentiments or the key phrases.
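The abstract describes the pipeline only at a high level, so the sketch below is a minimal, hypothetical illustration of that flow: per-utterance prosodic, sentiment and key-phrase cues feed a summary in which turns showing cue changes are highlighted. The Utterance class, the threshold and word-list "classifiers", and the summarize function are invented stand-ins for illustration, not the trained classifiers or the specific method claimed in the patent.

# Illustrative sketch only: the abstract does not disclose concrete models,
# so simple heuristics stand in for the machine learning classifiers.
from dataclasses import dataclass

@dataclass
class Utterance:
    speaker: str           # e.g. "agent" or "customer"
    text: str              # transcribed speech or typed text
    pitch_variance: float  # stand-in for a prosodic feature from the audio

# Hypothetical cue lexicons standing in for trained classifiers.
NEGATIVE_WORDS = {"angry", "frustrated", "cancel", "terrible"}
POSITIVE_WORDS = {"great", "thanks", "resolved", "happy"}
KEY_PHRASES = {"refund", "account number", "escalate"}

def prosodic_cue(u: Utterance) -> str:
    # High pitch variance used as a crude proxy for emotional arousal.
    return "aroused" if u.pitch_variance > 0.5 else "calm"

def sentiment_cue(u: Utterance) -> str:
    words = set(u.text.lower().split())
    if words & NEGATIVE_WORDS:
        return "negative"
    if words & POSITIVE_WORDS:
        return "positive"
    return "neutral"

def key_phrase_cues(u: Utterance) -> list[str]:
    return [p for p in KEY_PHRASES if p in u.text.lower()]

def summarize(conversation: list[Utterance]) -> list[str]:
    # Build a turn-by-turn summary; mark turns where sentiment shifts
    # or a key phrase appears, annotated with the extracted cues.
    summary, prev_sentiment = [], None
    for u in conversation:
        sentiment, emotion = sentiment_cue(u), prosodic_cue(u)
        phrases = key_phrase_cues(u)
        line = f"{u.speaker}: {u.text}"
        if (prev_sentiment is not None and sentiment != prev_sentiment) or phrases:
            line = f">> {line}  [sentiment={sentiment}, emotion={emotion}, key phrases={phrases}]"
        summary.append(line)
        prev_sentiment = sentiment
    return summary

if __name__ == "__main__":
    convo = [
        Utterance("customer", "I am frustrated and want a refund", 0.8),
        Utterance("agent", "Thanks for your patience, that is resolved now", 0.2),
    ]
    print("\n".join(summarize(convo)))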

Bibliographic Details
Main authors: Amtrup, Jan; Can, Aysu Ezen
Format: Patent
Language: English
creator Amtrup, Jan
Can, Aysu Ezen
description Disclosed herein are system, method, and computer program product embodiments for machine learning systems to process interactive communications between at least two participants. Speech and text, within the interactive communications, are analyzed using machine learning classifiers to extract prosodic, semantic and key phrase cues located within the interactive communications to identify changes to emotion, sentiments and key phrases. A summary of the interactive communications between a first participant and a second participant is generated at least, in-part, based on the extracted prosodic, semantic and key phrase cues and the summary is highlighted based on any of the changes to emotion, the sentiments or the key phrases.
format Patent
language eng
recordid cdi_epo_espacenet_US2023298615A1
source esp@cenet
subjects ACOUSTICS
CALCULATING
COMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS
COMPUTING
COUNTING
ELECTRIC DIGITAL DATA PROCESSING
MUSICAL INSTRUMENTS
PHYSICS
SPEECH ANALYSIS OR SYNTHESIS
SPEECH OR AUDIO CODING OR DECODING
SPEECH OR VOICE PROCESSING
SPEECH RECOGNITION
title SYSTEM AND METHOD FOR EXTRACTING HIDDEN CUES IN INTERACTIVE COMMUNICATIONS
url https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-15T21%3A26%3A13IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-epo_EVB&rft_val_fmt=info:ofi/fmt:kev:mtx:patent&rft.genre=patent&rft.au=Amtrup,%20Jan&rft.date=2023-09-21&rft_id=info:doi/&rft_dat=%3Cepo_EVB%3EUS2023298615A1%3C/epo_EVB%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_id=info:pmid/&rfr_iscdi=true