Triple-Type Feature Extraction for Palmprint Recognition
Published in: Sensors (Basel, Switzerland), 2021-07, Vol. 21 (14), p. 4896
Format: Article
Language: English
Online access: Full text
Abstract: Palmprint recognition has received tremendous research interest due to its outstanding user-friendliness, including its non-invasive nature and good hygiene. Most recent palmprint recognition studies, such as deep-learning methods, learn discriminative features from palmprint images, which usually requires a large number of labeled samples to achieve reasonably good recognition performance. However, palmprint samples are usually limited because they are relatively difficult to collect in quantity, making most existing deep-learning-based methods ineffective. In this paper, we propose a heuristic palmprint recognition method that extracts three types of palmprint features without requiring any training samples. We first extract the most important inherent features of a palmprint, namely the texture, gradient, and direction features, and encode them into triple-type feature codes. Then, we use the block-wise histograms of the triple-type feature codes to form triple feature descriptors for palmprint representation. Finally, we employ weighted matching-score level fusion to calculate the similarity between the triple-type feature descriptors of two compared palmprint images. Extensive experimental results on three widely used palmprint databases clearly show the promising effectiveness of the proposed method. (An illustrative code sketch of the descriptor and fusion stages follows this record.)
ISSN: 1424-8220
DOI: 10.3390/s21144896
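The abstract describes a three-stage pipeline: per-pixel feature codes, block-wise histograms of those codes, and weighted matching-score level fusion. The following is a minimal sketch of the latter two stages, with a simple gradient-orientation quantizer standing in for the paper's encoders. The abstract does not specify the actual texture, gradient, and direction encodings, bin counts, block size, distance measure, or fusion weights, so every function name and parameter value below is an assumption, not the authors' implementation.

```python
import numpy as np

def gradient_code(img, n_bins=8):
    """Quantize gradient orientation into integer codes in [0, n_bins).
    A stand-in encoder only; the paper's texture/gradient/direction
    encodings are not described in the abstract."""
    gy, gx = np.gradient(img.astype(float))
    ang = np.arctan2(gy, gx) % np.pi                   # orientation in [0, pi)
    return np.minimum((ang / np.pi * n_bins).astype(int), n_bins - 1)

def blockwise_histogram(code_map, n_bins=8, block=(16, 16)):
    """Concatenate normalized per-block histograms of a feature-code map."""
    h, w = code_map.shape
    feats = []
    for r in range(0, h - block[0] + 1, block[0]):
        for c in range(0, w - block[1] + 1, block[1]):
            patch = code_map[r:r + block[0], c:c + block[1]]
            hist, _ = np.histogram(patch, bins=n_bins, range=(0, n_bins))
            feats.append(hist / patch.size)            # normalize by block area
    return np.concatenate(feats)

def chi2_similarity(p, q, eps=1e-10):
    """Turn the chi-square distance between two descriptors into a similarity."""
    d = 0.5 * np.sum((p - q) ** 2 / (p + q + eps))
    return 1.0 / (1.0 + d)

def fused_score(descs_a, descs_b, weights=(0.4, 0.3, 0.3)):
    """Weighted matching-score level fusion over the three descriptor types.
    The weights are illustrative; in practice they would be tuned empirically."""
    scores = [chi2_similarity(a, b) for a, b in zip(descs_a, descs_b)]
    return float(sum(w * s for w, s in zip(weights, scores)))

# Hypothetical usage: each palm image yields three code maps (texture,
# gradient, direction); here one stand-in encoder fills all three slots.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    img_a, img_b = rng.random((128, 128)), rng.random((128, 128))
    descs_a = [blockwise_histogram(gradient_code(img_a))] * 3
    descs_b = [blockwise_histogram(gradient_code(img_b))] * 3
    print(f"fused similarity: {fused_score(descs_a, descs_b):.4f}")
```

Fusing at the matching-score level, rather than concatenating descriptors, keeps the three feature types independent, so each encoder could use its own bin count or distance measure without affecting the others.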