Markov Network-Based Unified Classifier for Face Recognition
| Published in: | IEEE Transactions on Image Processing, 2015-11, Vol. 24 (11), pp. 4263-4275 |
|---|---|
| Main authors: | , |
| Format: | Article |
| Language: | English |
Abstract: In this paper, we propose a novel unifying framework using a Markov network to learn the relationships among multiple classifiers. In face recognition, we assume that we have several complementary classifiers available, and assign observation nodes to the features of a query image and hidden nodes to those of gallery images. Under the Markov assumption, we connect each hidden node to its corresponding observation node and to the hidden nodes of neighboring classifiers. For each observation-hidden node pair, we collect the set of gallery candidates most similar to the observation instance, and capture the relationship between the hidden nodes in terms of a similarity matrix among the retrieved gallery images. Posterior probabilities in the hidden nodes are computed using the belief propagation algorithm, and we use the marginal probability as the new similarity value of the classifier. The novelty of our proposed framework lies in the method that considers classifier dependence using the results of each neighboring classifier. We present extensive evaluation results for two different protocols, known and unknown image variation tests, using four publicly available databases: 1) the Face Recognition Grand Challenge ver. 2.0; 2) XM2VTS; 3) BANCA; and 4) Multi-PIE. The results show that our framework consistently yields improved recognition rates in various situations.
ISSN: 1057-7149, 1941-0042
DOI: 10.1109/TIP.2015.2460464
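
To make the fusion scheme described in the abstract more concrete, below is a minimal sketch of belief-propagation-based classifier fusion. It assumes a chain-structured Markov network over the classifiers, a shared gallery label space for all hidden nodes, and softmax-style potentials built from similarity scores; these choices, along with all variable and function names, are illustrative assumptions and not the authors' exact formulation, which retrieves a separate candidate set per classifier.

```python
# Hedged sketch: fusing several face classifiers with a chain-structured
# Markov network and sum-product belief propagation. Chain topology, shared
# label space, and softmax potentials are simplifying assumptions.
import numpy as np

def unary_potentials(similarities, temperature=1.0):
    """Turn one classifier's query-to-gallery similarity scores (shape [G])
    into a positive potential over gallery identities via a softmax."""
    s = np.asarray(similarities, dtype=float) / temperature
    s -= s.max()                      # numerical stability
    p = np.exp(s)
    return p / p.sum()

def pairwise_potential(gallery_sim, temperature=1.0):
    """Compatibility between the hidden labels of two neighboring classifiers,
    built from a gallery-to-gallery similarity matrix (shape [G, G])."""
    m = np.asarray(gallery_sim, dtype=float) / temperature
    m -= m.max()
    return np.exp(m)

def chain_belief_propagation(phis, psis):
    """Exact sum-product on a chain of C hidden nodes.

    phis : list of C unary potentials, each of shape [G]
    psis : list of C-1 pairwise potentials; psis[i] (shape [G, G]) links
           classifier i and classifier i+1
    Returns the marginal distribution over gallery identities at each node,
    used as the fused similarity values.
    """
    C = len(phis)
    G = phis[0].shape[0]

    # Forward messages: alpha[i] arrives at node i from its left neighbor.
    alpha = [np.ones(G) for _ in range(C)]
    for i in range(1, C):
        incoming = phis[i - 1] * alpha[i - 1]
        alpha[i] = psis[i - 1].T @ incoming
        alpha[i] /= alpha[i].sum()            # normalize to avoid underflow

    # Backward messages: beta[i] arrives at node i from its right neighbor.
    beta = [np.ones(G) for _ in range(C)]
    for i in range(C - 2, -1, -1):
        incoming = phis[i + 1] * beta[i + 1]
        beta[i] = psis[i] @ incoming
        beta[i] /= beta[i].sum()

    # Node marginals: unary potential times both incoming messages.
    marginals = []
    for i in range(C):
        b = phis[i] * alpha[i] * beta[i]
        marginals.append(b / b.sum())
    return marginals

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    G, C = 50, 3                               # 50 gallery identities, 3 classifiers
    raw_scores = rng.random((C, G))            # per-classifier similarities (toy data)
    gallery_sim = rng.random((G, G))           # gallery-to-gallery similarity (toy data)
    gallery_sim = 0.5 * (gallery_sim + gallery_sim.T)

    phis = [unary_potentials(s) for s in raw_scores]
    psis = [pairwise_potential(gallery_sim) for _ in range(C - 1)]
    fused = chain_belief_propagation(phis, psis)

    # Combine the per-node marginals and rank gallery identities by the result.
    combined = np.mean(fused, axis=0)
    print("top match:", int(np.argmax(combined)))
```

Because the pairwise potentials couple neighboring hidden nodes through gallery-to-gallery similarity, each classifier's marginal is reweighted by the evidence of its neighbors, which is the dependence-aware fusion effect the abstract describes; the chain here keeps message passing exact, whereas a loopier topology would call for iterative (loopy) belief propagation.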