Universal model for facial expression detection using convolutional neural network
Main authors: , , , ,
Format: Conference proceedings
Language: English
Subjects:
Online access: Full text
Abstract: In an increasingly digital world, facial expressions are among the most important indicators of human emotion. With advances in technologies such as Artificial Intelligence (AI), machines can now process and interpret these expressions, enabling a wide range of applications across many domains. The remaining challenge is accounting for human diversity: each individual has distinct facial characteristics, so a single model that fits every individual is nearly impractical to build. One feasible approach is to organize models around facial distinctions between countries; such a model performs well within a specific country, but achieving universality remains a major challenge. As a solution, a collaborative methodology is proposed. The aim of this paper is to show that a universal facial expression detection model with high precision and accuracy can be created by combining datasets from different nationalities into a single dataset with a uniform distribution. The paper demonstrates this using datasets from two countries, the JAFFE and CK datasets, and generates a convolutional neural network-based model while optimizing the trade-offs between model complexity, training time, and required computational resources.
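The record contains only the abstract, not the authors' implementation. As an illustration of the approach described above, the following Python sketch merges two per-class image folders (hypothetically named data/jaffe and data/ck_plus) into a single uniformly distributed dataset and trains a small Keras CNN. The folder layout, image size, emotion labels, network depth, and hyperparameters are assumptions made for this example, not the configuration used in the paper.

```python
# Minimal sketch (not the authors' code): merge JAFFE and CK+ images into one
# uniformly distributed dataset and train a small CNN expression classifier.
# Paths, image size, class names, and hyperparameters are illustrative assumptions.
import os
import random
from collections import defaultdict

import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

EMOTIONS = ["anger", "disgust", "fear", "happiness", "neutral", "sadness", "surprise"]
IMG_SIZE = 48  # assumed working resolution


def collect_paths(root):
    """Gather (path, label) pairs from a folder layout root/<emotion>/<image>."""
    pairs = []
    for idx, emotion in enumerate(EMOTIONS):
        folder = os.path.join(root, emotion)
        if not os.path.isdir(folder):
            continue
        for name in os.listdir(folder):
            pairs.append((os.path.join(folder, name), idx))
    return pairs


def uniform_merge(*dataset_roots, seed=0):
    """Combine several datasets and downsample so every class has the same count."""
    by_class = defaultdict(list)
    for root in dataset_roots:
        for path, label in collect_paths(root):
            by_class[label].append(path)
    smallest = min(len(paths) for paths in by_class.values())
    rng = random.Random(seed)
    merged = []
    for label, paths in by_class.items():
        rng.shuffle(paths)
        merged.extend((p, label) for p in paths[:smallest])
    rng.shuffle(merged)
    return merged


def load_images(pairs):
    """Load grayscale images, scale pixel values to [0, 1], and stack into arrays."""
    xs, ys = [], []
    for path, label in pairs:
        img = tf.keras.utils.load_img(path, color_mode="grayscale",
                                      target_size=(IMG_SIZE, IMG_SIZE))
        xs.append(tf.keras.utils.img_to_array(img) / 255.0)
        ys.append(label)
    return np.stack(xs), np.array(ys)


def build_cnn(num_classes=len(EMOTIONS)):
    """Small CNN kept deliberately shallow to limit training time and compute."""
    return models.Sequential([
        layers.Input(shape=(IMG_SIZE, IMG_SIZE, 1)),
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dropout(0.5),
        layers.Dense(num_classes, activation="softmax"),
    ])


if __name__ == "__main__":
    pairs = uniform_merge("data/jaffe", "data/ck_plus")  # hypothetical folder names
    x, y = load_images(pairs)
    model = build_cnn()
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    model.fit(x, y, validation_split=0.2, epochs=20, batch_size=32)
```

Downsampling every class to the size of the smallest one is only one way to obtain the uniform distribution the abstract mentions; oversampling or data augmentation would be equally plausible choices.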
ISSN: 0094-243X, 1551-7616
DOI: 10.1063/5.0221449