A REAL TIME FACE RECOGNITION SYSTEM USING ALEXNET DEEP CONVOLUTIONAL NETWORK TRANSFER LEARNING MODEL


Bibliographic Details
Published in: Journal of engineering studies and research 2021-10, Vol. 27 (2), p. 82-88
Main Authors: OMOTOSHO, LAWRENCE O., OGUNDOYIN, IBRAHIM K., OYENIYI, JOSHUA O., OYENIRAN, OLUWASHINA A.
Format: Article
Language: English
Online Access: Full text
Description
Abstract: In the field of deep learning, facial recognition belongs to the computer vision category. It has been widely used for authentication and identification in applications such as access control systems, security, and attendance management. In deep learning, transfer learning is a method of reusing a neural network model that was first trained on a problem similar to the one being solved. The most commonly used face recognition methods are based mainly on template matching, geometric features, algebraic approaches, and deep learning. The advantage of template matching is that it is easy to implement; its disadvantage is that it has difficulty handling pose and scale changes effectively. The most important issue, regardless of the method used in a face recognition system, is dimensionality and computational complexity, especially when operating on large databases. In this paper, we applied a transfer learning model based on the AlexNet deep convolutional network to develop a real-time face recognition system that is robust to face pose and illumination, reduces dimensionality and computational complexity, and improves recognition accuracy. The system achieves a recognition accuracy of 98.95%.
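The transfer-learning idea described in the abstract (reuse a network pretrained on a related problem, then retrain only a new classifier head for the target task) can be sketched as follows. This is a minimal, hypothetical illustration: a frozen random projection stands in for AlexNet's pretrained convolutional layers (the paper's actual AlexNet pipeline is not reproduced here), and only a new logistic-regression head is trained on toy "face" data.

```python
import numpy as np

rng = np.random.default_rng(0)

def frozen_features(x, W_frozen):
    """Stand-in for the frozen pretrained layers: a fixed non-linear
    projection whose weights are never updated during training."""
    return np.maximum(0.0, x @ W_frozen)  # ReLU activation

# Toy data: two "identities", 20 samples each, 16-dimensional inputs.
X = rng.normal(size=(40, 16))
y = np.array([0] * 20 + [1] * 20)
X[y == 1] += 2.0  # shift class 1 so the classes are separable

W_frozen = rng.normal(size=(16, 32))  # "pretrained" weights, kept frozen
F = frozen_features(X, W_frozen)      # extracted features

# Train only the new classifier head (logistic regression) by gradient descent;
# the feature extractor's weights W_frozen are never touched.
w = np.zeros(32)
b = 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(F @ w + b)))  # sigmoid probabilities
    grad = p - y                            # gradient of the logistic loss
    w -= 0.1 * F.T @ grad / len(y)
    b -= 0.1 * grad.mean()

preds = (1.0 / (1.0 + np.exp(-(F @ w + b))) > 0.5).astype(int)
accuracy = (preds == y).mean()
print(f"training accuracy of the new head: {accuracy:.2f}")
```

Because the pretrained features already separate the classes, training the small head converges quickly; this is the efficiency argument the abstract makes for transfer learning over training a deep network from scratch.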
ISSN:2068-7559
2344-4932
DOI:10.29081/jesr.v27i2.277