Audio-Based Granularity-Adapted Emotion Classification
Published in: IEEE Transactions on Affective Computing, 2018-04, Vol. 9 (2), pp. 176-190
Main Authors:
Format: Article
Language: English
Online Access: Order full text
Abstract: This paper introduces a novel framework for combining the strengths of machine-based and human-based emotion classification. People's ability to tell similar emotions apart is known as emotional granularity, which can be high or low, and is measurable. This paper proposes granularity-adapted classification that can be used as a front-end to drive a recommender based on emotions recognized from speech. In this context, incorrect predictions of people's emotions could lead to poor recommendations, reducing user satisfaction. Instead of identifying a single emotion class, an adapted class is proposed: an aggregate of underlying emotion classes chosen based on granularity. In the recommendation context, the adapted class maps to a larger region in valence-arousal space, from which a list of potentially more similar content items is drawn and recommended to the user. To determine the effectiveness of adapted classes, we measured the emotional granularity of subjects and, for each subject, used their pairwise similarity judgments of emotion to compare the effectiveness of adapted classes versus single emotion classes taken from a baseline system. A customized Euclidean-based similarity metric is used to measure the relative proximity of emotion classes. Results show that granularity-adapted classification can improve the potential similarity by up to 9.6 percent.
ISSN: 1949-3045
DOI: 10.1109/TAFFC.2016.2598741
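
The abstract describes an adapted class as an aggregate of neighbouring emotion classes, chosen according to the subject's granularity and mapped to a larger region in valence-arousal space. The sketch below is a minimal illustration of that idea, not the authors' implementation: the valence-arousal coordinates, the granularity-to-radius mapping, and the use of a plain Euclidean distance (in place of the paper's customized Euclidean-based metric) are all assumptions.

```python
# Illustrative sketch only: class coordinates, the granularity-to-radius rule,
# and the plain Euclidean distance are assumptions, not the paper's settings.
from math import dist

# Assumed (valence, arousal) coordinates for a few baseline emotion classes.
VA = {
    "happy":   (0.8, 0.6),
    "excited": (0.7, 0.8),
    "calm":    (0.6, -0.4),
    "sad":     (-0.7, -0.4),
    "angry":   (-0.6, 0.7),
}

def va_distance(a: str, b: str) -> float:
    """Euclidean distance between two emotion classes in valence-arousal space."""
    return dist(VA[a], VA[b])

def adapted_class(predicted: str, granularity: float,
                  radius_low: float = 0.5, radius_high: float = 0.1) -> set[str]:
    """Aggregate the predicted class with nearby classes.

    A low-granularity subject (granularity near 0) gets a wide aggregation
    radius, i.e. a larger region in valence-arousal space; a high-granularity
    subject (near 1) gets a tight one. The linear radius rule is an assumption.
    """
    radius = radius_high + (1.0 - granularity) * (radius_low - radius_high)
    return {c for c in VA if va_distance(predicted, c) <= radius}

# Low granularity broadens the class; high granularity keeps it narrow.
print(adapted_class("happy", granularity=0.2))  # {'happy', 'excited'}
print(adapted_class("happy", granularity=0.9))  # {'happy'}
```

Under this sketch, a low-granularity subject's predicted emotion expands into a broader adapted class, so a recommender could draw candidate items from a wider valence-arousal region, in line with the approach summarized in the abstract.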