Overcoming the limitations of conventional deep learning methods for gender classification and age prediction with transfer learning approach


Bibliographic Details
Main Authors: Jain, Bhawna; Anand, Khushi; Priya, Sonali; Bhargava, Khushi
Format: Conference Proceeding
Language: English
Online Access: Full text
Description
Summary: With the growth of online image databases, gender and age classification has become an active area of research, with applications in detection and investigation. With the emergence of computer vision, deep learning (DL) is increasingly applied to such problems owing to its wide range of real-world applications in image classification. This paper investigates the performance of two DL models, a Convolutional Neural Network (CNN) and a Residual Neural Network (ResNet18), analyzing their efficacy on gender and age classification tasks using facial images. The Adience dataset, a widely used benchmark for these tasks, is used to train and test both models. The study uses a 3-fold empirical evaluation of the CNN and ResNet18 models and reports results on three main assessment criteria: exact age accuracy, 1-off age accuracy, and gender detection accuracy. The ResNet18 model slightly outperformed the CNN model on both tasks, achieving a 94% exact match on age classification and 92% accuracy on gender prediction. The paper highlights the effectiveness of DL models for gender and age classification from facial images and the importance of model architecture and design in achieving high accuracy and robustness.
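The abstract names the techniques but not the implementation, so the following is a minimal sketch of the kind of transfer-learning setup it describes, assuming a PyTorch/torchvision implementation: an ImageNet-pretrained ResNet18 with its final layer replaced for the Adience label spaces, plus a helper for the 1-off age metric. The bucket count, weight choice, and the one_off_accuracy helper are illustrative assumptions, not the authors' published configuration.

import torch
import torch.nn as nn
from torchvision import models

NUM_AGE_BUCKETS = 8  # Adience groups ages into 8 ranges: (0-2) ... (60+)
NUM_GENDERS = 2

def build_classifier(num_classes: int) -> nn.Module:
    # Transfer learning: start from ImageNet-pretrained weights and swap
    # the final fully connected layer for the target label space.
    model = models.resnet18(weights=models.ResNet18_Weights.IMAGENET1K_V1)
    model.fc = nn.Linear(model.fc.in_features, num_classes)
    return model

age_model = build_classifier(NUM_AGE_BUCKETS)
gender_model = build_classifier(NUM_GENDERS)

def one_off_accuracy(logits: torch.Tensor, targets: torch.Tensor) -> float:
    # "1-off" accuracy: a prediction counts as correct if it falls in the
    # true age bucket or an immediately adjacent one (|pred - true| <= 1).
    preds = logits.argmax(dim=1)
    return (torch.abs(preds - targets) <= 1).float().mean().item()

Exact-match accuracy is the same computation with the tolerance removed (preds == targets); figures like the 94%/92% reported above would come from evaluating models of this kind across the three Adience folds.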
ISSN: 0094-243X; 1551-7616
DOI: 10.1063/5.0229634