Discriminating classes collapsing for Globality and Locality Preserving Projections

Bibliographic Details
Main Authors: Wei Wang, Baogang Hu, Zengfu Wang
Format: Conference Paper
Language: English
Description
Summary: In this paper, a novel approach, namely Globality and Locality Preserving Projections (GLPP), is proposed for dimensionality reduction. The method combines the ideas behind Locality Preserving (LP), Discriminating Power (DP) and Maximally Collapsing Metric Learning (MCML) into a unified model. Several distinguishing features result from this integrated design. First, the method takes into account both global and local information of the data set; a new formula for calculating the conditional probabilities removes the locality distortions of MCML. Second, discrimination information is used to form a projection matrix that collapses data points of the same class closer together while pushing points of different classes further apart. Third, the proposed method yields a supervised convex optimization problem, a critical feature in data processing. Furthermore, the implementation maps GLPP onto a Graphics Processing Unit (GPU) architecture, making it suitable for large-scale data sets. Numerical studies on a variety of data sets confirm that GLPP consistently outperforms up-to-date methods, delivering high classification accuracy, good visualization and sharply reduced running time.
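For background on the "collapsing classes" idea the abstract builds on, the sketch below shows the standard MCML construction: conditional probabilities p(j|i) from pairwise distances under a Mahalanobis metric, and the KL divergence to an ideal distribution that is uniform over same-class points and zero elsewhere. This is a minimal illustration of MCML only, not the paper's GLPP formula; all function names and the metric `A` are assumptions for the example.

```python
import numpy as np

def conditional_probs(X, A):
    """MCML-style p(j|i): a softmax over negative squared distances
    d_A(x_i, x_j) = (x_i - x_j)^T A (x_i - x_j), excluding j == i.
    A must be symmetric positive definite."""
    L = np.linalg.cholesky(A)          # A = L L^T, so d_A = ||L^T x_i - L^T x_j||^2
    Z = X @ L
    D = np.square(Z[:, None, :] - Z[None, :, :]).sum(axis=-1)
    np.fill_diagonal(D, np.inf)        # exp(-inf) = 0 removes the j == i term
    P = np.exp(-D)
    return P / P.sum(axis=1, keepdims=True)

def collapsing_kl(P, y):
    """KL(p0 || p), where the ideal p0(j|i) is uniform over points with
    the same label as x_i. Driving this to zero 'collapses' each class."""
    same = (y[:, None] == y[None, :]).astype(float)
    np.fill_diagonal(same, 0.0)
    P0 = same / same.sum(axis=1, keepdims=True)
    mask = P0 > 0
    return float((P0[mask] * np.log(P0[mask] / P[mask])).sum())

# Tiny usage example: two well-separated classes give a small divergence.
X = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0], [5.1, 5.0]])
y = np.array([0, 0, 1, 1])
P = conditional_probs(X, np.eye(2))    # Euclidean metric as the simplest choice
kl = collapsing_kl(P, y)
```

MCML minimizes this KL divergence over positive semidefinite metrics `A`, which is what makes the problem convex; the abstract's claim is that GLPP inherits this convexity while correcting MCML's locality distortions.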
ISSN: 2161-4393, 2161-4407
DOI: 10.1109/IJCNN.2012.6252372