An Emotionally Intelligent System for Multimodal Sentiment Classification
Saved in:
Published in: Indian Journal of Science and Technology, 2024-11, Vol. 17 (42), p. 4386-4394
Main Authors: , ,
Format: Article
Language: English
Online Access: Full text
Abstract:

Objectives: To develop a multimodal sentiment classification model by analyzing the impact of biological signals and examining the concatenation of various modalities in a marketing scenario.

Methods: This paper proposes a new emotionally intelligent system for multimodal sentiment classification. First, a multimodal database is prepared by collecting text, speech, facial expression, posture, and biological signals for each individual in a user-machine interaction scenario. This database is preprocessed to remove unwanted noise and missing values, and is then split into training and testing sets. The training set is fed into a feature extraction phase that applies different methods to extract various types of features, such as texture, color, acoustic, linguistic, and biological-signal features. These features are trained independently by the Multi-Modal Deep Belief Network (MMDBN) model to generate the trained classifier, which is then used to assign the test set to sentiment classes such as positive, negative, or neutral. The effectiveness of the proposed MMDBN model is evaluated in MATLAB 2019b on the Interactive Emotional Dyadic Motion Capture (IEMOCAP) and Wearable Stress and Affect Detection (WESAD) datasets, each divided into 70% for training and 30% for testing. A comparative study applies existing sentiment classification models to the same datasets to evaluate MMDBN's prediction efficiency using metrics such as accuracy, precision, recall, and F-score.

Findings: The MMDBN model achieves 80.28% and 81.17% accuracy on the IEMOCAP and WESAD databases, respectively, compared with existing sentiment classification models such as the Deep Multi-View Attentive Network (DMVAN), the Multi-Channel Multimodal Joint Learning Method (MCMJLM), the Multi-Level Textual-Visual Alignment and Fusion Network (MTVAF), and Attention-Based Multimodal Sentiment Analysis and Emotion Recognition (AMSAER).

Novelty: This study introduces a novel emotional intelligence system for multimodal sentiment classification, integrating biological signals with other modalities to improve accuracy in marketing applications by combining linguistic, acoustic, visual, and biological data.

Keywords: Sentiment classification, Biological signals, Multimodal database, Deep Learning, Deep belief network
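The abstract describes the pipeline only at a high level (per-modality features, fusion, a 70/30 split, MMDBN training, and metric-based evaluation). The sketch below is a minimal, hypothetical Python illustration of that flow, not the authors' implementation (which was evaluated in MATLAB 2019b): the synthetic data, the modality dimensions, and the BernoulliRBM + LogisticRegression stand-in for the MMDBN are assumptions introduced here for illustration.

```python
# Hypothetical sketch of the described flow: per-modality features are fused,
# split 70/30, fed to an RBM-based stand-in classifier, and scored with
# accuracy, precision, recall, and F-score. All data is synthetic.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import BernoulliRBM
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import MinMaxScaler
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

rng = np.random.default_rng(0)
n_samples = 600

# Placeholder per-modality feature blocks (linguistic, acoustic, visual,
# posture, biological); real features would come from dedicated extractors.
modalities = {
    "linguistic": rng.normal(size=(n_samples, 50)),
    "acoustic":   rng.normal(size=(n_samples, 40)),
    "visual":     rng.normal(size=(n_samples, 60)),
    "posture":    rng.normal(size=(n_samples, 10)),
    "biological": rng.normal(size=(n_samples, 8)),
}
X = np.hstack(list(modalities.values()))   # simple feature-level fusion
y = rng.integers(0, 3, size=n_samples)     # 0=negative, 1=neutral, 2=positive

# 70% training / 30% testing, as described in the abstract.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.30, random_state=0, stratify=y)

# RBM + logistic regression as a stand-in for the trained MMDBN classifier.
model = Pipeline([
    ("scale", MinMaxScaler()),             # RBMs expect inputs in [0, 1]
    ("rbm", BernoulliRBM(n_components=64, learning_rate=0.05,
                         n_iter=20, random_state=0)),
    ("clf", LogisticRegression(max_iter=1000)),
])
model.fit(X_train, y_train)

# Score the held-out 30% with the same metrics named in the study.
y_pred = model.predict(X_test)
prec, rec, f1, _ = precision_recall_fscore_support(y_test, y_pred, average="macro")
print(f"accuracy={accuracy_score(y_test, y_pred):.3f} "
      f"precision={prec:.3f} recall={rec:.3f} f1={f1:.3f}")
```

The single stacked RBM here only gestures at a deep belief network; a faithful reproduction would pre-train several RBM layers per modality and fine-tune them jointly, details the record does not provide.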
ISSN: 0974-6846, 0974-5645
DOI: 10.17485/IJST/v17i42.2349