Using Automatic Facial Expression Classification for Contents Indexing Based on the Emotional Component


Bibliographic Details
Main Authors: Kowalik, Uwe; Aoki, Terumasa; Yasuda, Hiroshi
Format: Book Chapter
Language: English
Subjects:
Online Access: Full text
Description
Summary: Within the last decade, the development of new technologies in the multimedia sector has advanced at a stunning pace. Due to the availability of low-cost, high-capacity mass storage devices, private multimedia libraries containing digital video and audio items have recently gained popularity. Although attached meta-data such as title, actor's or actress's name and creation time eases the task of finding preferred content, it is still difficult to locate a specific part of a movie one enjoyed before by remembering its time code. In this paper we introduce the BROAFERENCE system, which provides a solution to this problem. We propose meta-data creation based on recorded user experience derived from facial expressions, capturing joy, sadness and anger events as well as interest focus data. In the following, the system layout, the functionality and the experiments conducted for system verification are introduced to the reader.
ISSN: 0302-9743; 1611-3349 (electronic)
DOI: 10.1007/11802167_53
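
To make the idea of emotion-based indexing described in the summary concrete, the sketch below shows one possible way such meta-data could be represented and queried: facial-expression events (joy, sadness, anger) recorded as time-coded entries, which can later be searched to jump back to scenes a viewer enjoyed. This is an illustrative assumption, not the BROAFERENCE implementation; all class, field and function names (EmotionEvent, find_segments, intensity, etc.) are hypothetical.

```python
from dataclasses import dataclass
from typing import List, Tuple

# Hypothetical emotion-event record: an event detected from the viewer's facial
# expression, tied to a span of the video via time codes. Field names are
# illustrative only, not taken from the BROAFERENCE system.
@dataclass
class EmotionEvent:
    start_s: float    # start of the event within the video, in seconds
    end_s: float      # end of the event, in seconds
    emotion: str      # "joy", "sadness" or "anger"
    intensity: float  # assumed normalized 0..1 strength of the expression


def find_segments(events: List[EmotionEvent], emotion: str,
                  min_intensity: float = 0.5) -> List[Tuple[float, float]]:
    """Return (start, end) time codes of segments where the viewer showed the
    requested emotion, so a remembered scene can be located without knowing
    its exact time code."""
    return [(e.start_s, e.end_s)
            for e in events
            if e.emotion == emotion and e.intensity >= min_intensity]


if __name__ == "__main__":
    recorded = [
        EmotionEvent(120.0, 135.5, "joy", 0.8),
        EmotionEvent(610.2, 622.0, "sadness", 0.6),
        EmotionEvent(1500.0, 1512.3, "joy", 0.4),
    ]
    # Jump back to the parts of the movie the viewer visibly enjoyed.
    print(find_segments(recorded, "joy"))  # [(120.0, 135.5)]
```

In this reading, the recorded user experience acts as an additional index over the video: instead of remembering a time code, the viewer queries for an emotional reaction and retrieves the matching segments.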