Loose L1/2 regularised sparse representation for face recognition

Bibliographic details
Published in: IET Computer Vision, 2015-04, Vol. 9 (2), p. 251-258
Authors: Zhong, Dexing; Xie, Zichao; Li, Yanrui; Han, Jiuqiang
Format: Article
Language: English
Online access: Full text
Abstract: Sparse representation (or sparse coding) has been applied to frontal face recognition. Two representative methods are sparse representation-based classification (SRC) and collaborative representation-based classification (CRC), in which the query face image is represented by a sparse linear combination of all the training samples. The difference between SRC and CRC is that the former imposes an L1-norm constraint on the coding vector to guarantee sparsity, while the latter uses an L2-norm constraint. In this paper, we propose a novel loose L1/2 regularised sparse representation (SR) for face recognition, named L1/2 classification (LHC), which is inspired by L1/2 regularisation. Additionally, an iterative Tikhonov regularisation (ITR) is proposed to solve LHC more efficiently than the original algorithm. Using ITR, the balance between collaborative representation (CR) and SR can be tuned by the number of iterations. Owing to the sparser L1/2 regularisation and the iterative solution mechanism, LHC achieves better performance. Extensive experiments on three benchmark face databases demonstrate that LHC is more effective than state-of-the-art SR-based methods for frontal face recognition.
ISSN: 1751-9632, 1751-9640
DOI: 10.1049/iet-cvi.2014.0114
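
As a rough illustration of the pipeline described in the abstract, the Python sketch below approximates an L1/2-regularised coding step with iteratively reweighted Tikhonov (ridge) solves and then classifies by class-wise reconstruction residual, SRC-style. The reweighting rule, the parameters (lam, n_iter, eps) and the helper names (lhc_code, classify) are illustrative assumptions, not the exact ITR update or settings from the paper.

import numpy as np

def lhc_code(A, y, lam=0.01, n_iter=10, eps=1e-6):
    # Approximate argmin_x ||y - A x||^2 + lam * ||x||_{1/2}^{1/2}
    # by a sequence of weighted ridge (Tikhonov) problems (IRLS-style sketch,
    # an assumption standing in for the paper's ITR solver).
    x = np.linalg.lstsq(A, y, rcond=None)[0]            # initial dense code
    for _ in range(n_iter):
        # For an L_q penalty with q = 1/2, the IRLS weight is |x_i|^(q-2) = |x_i|^(-3/2)
        w = (np.abs(x) + eps) ** (-1.5)
        # Tikhonov-regularised step: (A^T A + lam * diag(w)) x = A^T y
        x = np.linalg.solve(A.T @ A + lam * np.diag(w), A.T @ y)
    return x

def classify(A, labels, y):
    # Assign y to the class whose training samples best reconstruct it.
    x = lhc_code(A, y)
    best_label, best_res = None, np.inf
    for c in np.unique(labels):
        mask = (labels == c)
        res = np.linalg.norm(y - A[:, mask] @ x[mask])  # class-wise residual
        if res < best_res:
            best_label, best_res = c, res
    return best_label

# Example usage with synthetic data (columns of A are vectorised training faces):
#   A = np.random.randn(64, 20); labels = np.repeat(np.arange(4), 5)
#   y = A[:, 3] + 0.01 * np.random.randn(64)
#   print(classify(A, labels, y))

Fewer iterations keep the code close to a collaboratively represented (ridge) solution, while more iterations drive it toward a sparser code, mirroring the CR/SR trade-off the abstract attributes to ITR.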