Think out Loud: Emotion Deducing Explanation in Dialogues
Saved in:
Main Authors: (not listed)
Format: Article
Language: English
Subjects: (not listed)
Online Access: Order full text
Abstract: Humans convey emotions through daily dialogues, making emotion understanding a crucial step toward affective intelligence. To understand emotions in dialogues, machines are asked to recognize the emotion of an utterance (Emotion Recognition in Dialogues, ERD) and then, based on that emotion, to find the utterances that caused it (Emotion Cause Extraction in Dialogues, ECED). This setting performs ERD first and ECED second, ignoring the mutual complementarity of emotion and cause. To address this, several new tasks have been proposed to extract both simultaneously. Although research on these tasks has achieved excellent results, simply identifying emotion-related factors through classification modeling fails to capture, in an explainable way, the specific thinking process by which causes stimulate an emotion. This thinking process, reflected especially in the reasoning ability of Large Language Models (LLMs), remains under-explored. To this end, we propose a new task, "Emotion Deducing Explanation in Dialogues" (EDEN). EDEN recognizes emotions and causes through explicit reasoning: models must generate an explanation that first summarizes the causes, then analyzes the speakers' inner activities triggered by those causes using common sense, and finally infers the emotion accordingly. To support the study of EDEN, we construct two EDEN datasets through human annotation, building on existing ECED resources. We further evaluate different models on EDEN and find that LLMs are more competent than conventional PLMs. Moreover, EDEN helps LLMs achieve better recognition of emotions and causes, opening a new research direction for explainable emotion understanding in dialogues.
DOI: 10.48550/arxiv.2406.04758
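The abstract describes EDEN's explanation as a three-step thinking process (summarize causes, analyze the speaker's inner activities, then deduce the emotion). As a minimal sketch of what such an explicit-reasoning setup could look like, the following builds a prompt with those three steps; the paper does not publish a prompt template, so the function name, wording, and structure here are purely illustrative assumptions.

```python
# Hypothetical sketch: assembling an EDEN-style "think out loud" prompt.
# The three steps mirror the task description in the abstract; all names
# and phrasing are assumptions, not the paper's actual template.

def build_eden_prompt(dialogue: list[str], target_index: int) -> str:
    """Return a prompt asking a model for an EDEN-style explanation
    of the emotion in dialogue[target_index]."""
    transcript = "\n".join(
        f"[{i}] {utterance}" for i, utterance in enumerate(dialogue)
    )
    return (
        "Below is a dialogue. Explain the emotion of the target utterance "
        "by thinking out loud.\n\n"
        f"Dialogue:\n{transcript}\n\n"
        f"Target utterance: [{target_index}] {dialogue[target_index]}\n\n"
        "Step 1: Summarize the causal utterances that triggered the emotion.\n"
        "Step 2: Using common sense, analyze the speaker's inner activities "
        "in response to those causes.\n"
        "Step 3: Based on the analysis above, state the emotion label."
    )

prompt = build_eden_prompt(
    ["A: I finally got the job offer!", "B: That's wonderful news!"],
    target_index=1,
)
```

The prompt string would then be passed to whatever LLM is being evaluated; generating the explanation before the emotion label is what distinguishes EDEN from direct classification.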