On the use of ear and profile faces for distinguishing identical twins and nontwins

Bibliographic details
Published in: Expert Systems, 2020-02, Vol. 37 (1), p. n/a, Article 12389
Authors: Toygar, Önsen; Alqaralleh, Esraa; Afaneh, Ayman
Format: Article
Language: English
Online access: Full text
Abstract: This study aims to measure the efficiency of the ear and profile face in distinguishing identical twins under identification and verification modes. In addition to distinguishing identical twins by ear and profile face separately, we propose to fuse these traits with all possible binary combinations of left ear, left profile face, right ear, and right profile face. Fusion is implemented by score-level fusion and decision-level fusion techniques in the proposed method; feature-level fusion is additionally used for comparison. All experiments in this paper are also carried out on nontwin individuals, and the recognition performance of twins and nontwins is compared. Local binary patterns, local phase quantization, and binarized statistical image features are used as texture-based descriptors in the feature extraction process. Images captured under both controlled and uncontrolled lighting are tested. Ear and profile images from the ND-TWINS-2009-2010 dataset are used in the experiments. The experimental results show that the proposed method is more accurate and reliable than using ear or profile face images separately. For recognizing identical twins, the proposed method achieves recognition rates of 100% and 99.45%, and equal error rates of 0.54% and 1.63%, under controlled and uncontrolled illumination conditions, respectively.
ISSN: 0266-4720, 1468-0394
DOI: 10.1111/exsy.12389
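The abstract names two of the building blocks of the proposed pipeline: a texture descriptor (local binary patterns) per trait, and score-level fusion of the resulting match scores. A minimal sketch of both ideas follows; this is an illustrative assumption, not the authors' code, and uses a basic 3x3 LBP with a simple weighted sum rule for fusion (the paper does not specify its fusion weights).

```python
# Minimal sketch (not the paper's implementation): a basic 3x3 LBP
# descriptor producing a normalized 256-bin histogram as the feature
# vector, plus sum-rule score-level fusion of two modality scores
# (e.g. ear and profile face). All names here are illustrative.

def lbp_code(img, r, c):
    """8-neighbour LBP code for pixel (r, c) of a 2-D grayscale list.

    Each neighbour contributes one bit: 1 if its intensity is >= the
    centre pixel, 0 otherwise.
    """
    center = img[r][c]
    neighbours = [img[r - 1][c - 1], img[r - 1][c], img[r - 1][c + 1],
                  img[r][c + 1], img[r + 1][c + 1], img[r + 1][c],
                  img[r + 1][c - 1], img[r][c - 1]]
    return sum((1 << i) for i, n in enumerate(neighbours) if n >= center)

def lbp_histogram(img):
    """Normalized 256-bin histogram of LBP codes over interior pixels."""
    hist = [0] * 256
    for r in range(1, len(img) - 1):
        for c in range(1, len(img[0]) - 1):
            hist[lbp_code(img, r, c)] += 1
    total = sum(hist) or 1
    return [h / total for h in hist]

def fuse_scores(score_ear, score_profile, w=0.5):
    """Sum-rule score-level fusion; w weights the ear match score."""
    return w * score_ear + (1 - w) * score_profile
```

In a full system, histograms from each trait would be compared against gallery templates (e.g. by histogram intersection or chi-square distance) to produce per-trait match scores, which are then combined by `fuse_scores` before the final accept/reject or identity decision.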