Semi-supervised geometric mean of Kullback-Leibler divergences for subspace selection

Bibliographic Details
Main authors: Si-Bao Chen, Hai-Xian Wang, Xing-Yi Zhang, Bin Luo
Format: Conference paper
Language: English
Description
Abstract: Subspace selection is widely adopted in many areas of pattern recognition. A recent method, maximizing the geometric mean of Kullback-Leibler (KL) divergences of class pairs (MGMD), is a successful approach to subspace selection that can significantly reduce the class-separation problem. However, in many applications labeled data are very limited, while unlabeled data can be obtained easily. With inadequate labeled data, the estimated divergences of class pairs are unstable. To take advantage of unlabeled data for subspace selection, semi-supervised MGMD (SSMGMD) is proposed, which uses the graph Laplacian as a normalization. A quasi-Newton method is adopted to solve the resulting optimization problem. Experiments on synthetic data and real image data show the validity of SSMGMD.
DOI:10.1109/FSKD.2011.6019712
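
Since the abstract describes the method only at a high level, the following is a minimal, illustrative sketch of an SSMGMD-style objective, under the assumption of Gaussian class-conditional densities in the projected subspace: the log of the geometric mean of pairwise KL divergences (equivalently, the mean of their logarithms) is maximized, a graph-Laplacian smoothness term is built from both labeled and unlabeled data, and a quasi-Newton (BFGS) optimizer is used. All identifiers (gaussian_kl, knn_laplacian, ssmgmd_objective, lam) are hypothetical, and the Laplacian enters here as an additive penalty rather than the normalization used in the paper.

```python
# A minimal, illustrative sketch of an SSMGMD-style objective (all names
# hypothetical); the exact formulation in the paper may differ.
import numpy as np
from scipy.optimize import minimize
from scipy.spatial.distance import cdist

def gaussian_kl(mu0, S0, mu1, S1):
    """Closed-form KL( N(mu0, S0) || N(mu1, S1) )."""
    k = len(mu0)
    S1_inv = np.linalg.inv(S1)
    diff = mu1 - mu0
    return 0.5 * (np.trace(S1_inv @ S0) + diff @ S1_inv @ diff - k
                  + np.log(np.linalg.det(S1) / np.linalg.det(S0)))

def knn_laplacian(X, k=5, sigma=1.0):
    """Unnormalized Laplacian of a Gaussian-weighted, symmetrized kNN graph."""
    D2 = cdist(X, X, 'sqeuclidean')
    A = np.exp(-D2 / (2.0 * sigma ** 2))
    np.fill_diagonal(A, 0.0)
    nn = np.argsort(D2, axis=1)[:, 1:k + 1]          # k nearest, self excluded
    mask = np.zeros_like(A)
    mask[np.repeat(np.arange(len(X)), k), nn.ravel()] = 1.0
    A *= np.maximum(mask, mask.T)                    # keep only kNN edges
    return np.diag(A.sum(axis=1)) - A

def ssmgmd_objective(w, X_lab, y, X_all, L, d, r, lam=0.1):
    """Negative mean log of pairwise KL divergences in the projected
    subspace (i.e., negative log geometric mean), plus a Laplacian
    smoothness penalty. NOTE: the paper uses the Laplacian as a
    normalization; an additive penalty is an assumption made here."""
    W = w.reshape(d, r)
    classes = np.unique(y)
    logs = []
    for i, ci in enumerate(classes):
        for cj in classes[i + 1:]:
            Pi, Pj = X_lab[y == ci] @ W, X_lab[y == cj] @ W
            kl = gaussian_kl(Pi.mean(0), np.cov(Pi.T) + 1e-6 * np.eye(r),
                             Pj.mean(0), np.cov(Pj.T) + 1e-6 * np.eye(r))
            logs.append(np.log(max(kl, 1e-12)))
    Z = X_all @ W
    return -np.mean(logs) + lam * np.trace(Z.T @ L @ Z)  # tr(W'X'LXW)

# Usage: quasi-Newton (BFGS) optimization over a flattened projection matrix,
# on synthetic 3-class data with extra unlabeled samples.
rng = np.random.default_rng(0)
d, r = 10, 2
X_lab = rng.normal(size=(60, d))
y = np.repeat([0, 1, 2], 20)
X_lab[y == 1, 0] += 3.0                              # separate the classes
X_lab[y == 2, 1] += 3.0
X_all = np.vstack([X_lab, rng.normal(size=(100, d))])
L = knn_laplacian(X_all)
res = minimize(ssmgmd_objective, rng.normal(size=d * r),
               args=(X_lab, y, X_all, L, d, r), method='BFGS')
W_opt = res.x.reshape(d, r)                          # learned subspace basis
```

Maximizing the geometric mean of m pairwise divergences is equivalent to maximizing (1/m)·Σ log D_ij, which the objective above minimizes with a sign flip; taking logs keeps every class pair's divergence away from zero instead of letting already well-separated pairs dominate, which is how MGMD reduces the class-separation problem.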