Face recognition and facial gender classification using convolutional neural network

Bibliographic details
Published in: Menoufia Journal of Electronic Engineering Research, 2022-06, Vol. 31 (2), p. 1-10
Main author: Berbar, Muhammad Abduh
Format: Article
Language: English
Online access: Full text
Description
Abstract: Computational power in deep convolutional neural networks has made it possible to build robust classifiers for faces and facial gender for many security and computer vision problems. This paper proposes two convolutional neural network (CNN) models for face recognition and facial gender classification. The models consist of an image input layer, followed by three blocks of convolutional, normalization, activation, and max-pooling layers, and three fully connected layers. The performance of the proposed CNN solutions is evaluated on five publicly available face datasets: two greyscale datasets (Sheffield and AT&T) and three color datasets (Faces94, Ferret, and Celebrity Face Images from Kaggle). The achieved classification accuracy ranges between 99.0% and 100% on the Faces94, Ferret, Sheffield, and AT&T datasets, and between 93.6% and 95.0% on the Kaggle dataset. The proposed CNN can process and classify small face images: 32 × 32 pixels from the Faces94, Sheffield, and AT&T datasets and 100 × 100 pixels from the Ferret and Kaggle datasets. The obtained results show that the proposed CNN models are an effective solution for face recognition and facial gender classification, producing accuracy competitive with several state-of-the-art methods.
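As a reading aid, the sketch below reconstructs the architecture described in the abstract in PyTorch. It is illustrative, not the authors' code: only the overall layout (input layer, three convolutional/normalization/activation/max-pooling blocks, three fully connected layers) and the 32 × 32 greyscale input size come from the abstract; the choice of batch normalization and ReLU, the filter counts (32/64/128), kernel sizes, and fully connected widths are all assumptions.

```python
import torch
import torch.nn as nn

class FaceCNN(nn.Module):
    """Illustrative three-block CNN per the abstract's description.
    Hyperparameters (filter counts, kernel sizes, FC widths) are assumed."""

    def __init__(self, in_channels=1, num_classes=40):
        super().__init__()

        def block(cin, cout):
            # One block: convolution -> normalization -> activation -> max-pooling
            return nn.Sequential(
                nn.Conv2d(cin, cout, kernel_size=3, padding=1),
                nn.BatchNorm2d(cout),         # normalization layer (assumed batch norm)
                nn.ReLU(inplace=True),        # activation layer (assumed ReLU)
                nn.MaxPool2d(kernel_size=2),  # max-pooling layer, halves spatial size
            )

        self.features = nn.Sequential(
            block(in_channels, 32),
            block(32, 64),
            block(64, 128),
        )
        # A 32x32 input is halved three times -> 4x4 feature maps
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(128 * 4 * 4, 256),  # fully connected layer 1
            nn.ReLU(inplace=True),
            nn.Linear(256, 128),          # fully connected layer 2
            nn.ReLU(inplace=True),
            nn.Linear(128, num_classes),  # fully connected layer 3 (output)
        )

    def forward(self, x):
        return self.classifier(self.features(x))

# Usage: 40 identities for recognition on AT&T; set num_classes=2 for gender classification
model = FaceCNN(in_channels=1, num_classes=40)
logits = model(torch.randn(8, 1, 32, 32))  # batch of 32x32 greyscale face images
print(logits.shape)  # torch.Size([8, 40])
```

For the 100 × 100-pixel inputs used with the Ferret and Kaggle datasets, the same block structure applies; only the flattened feature size feeding the first fully connected layer changes.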
ISSN: 1687-1189
2682-3535
DOI: 10.21608/mjeer.2022.137937.1056