Locality-Aware Discriminative Subspace Learning for Image Classification
Saved in:
Published in: IEEE Transactions on Instrumentation and Measurement, 2022, Vol. 71, pp. 1-14
Main Authors:
Format: Article
Language: English
Abstract: Projection learning is an effective and widely used technique for extracting discriminative features for pattern recognition and classification. In projection learning, it is essential to preserve the global and local structure of the data while extracting discriminative features. However, transforming the source data directly to a rigid target, i.e., the strict binary label matrix, via a projection matrix may discard some intrinsic information. We propose a locality-aware discriminative subspace learning (LADSL) method to address these limitations. In LADSL, the original data are transformed into a latent space instead of a restrictive label space. The latent space seamlessly integrates the original visual features and the class labels to improve classification performance. The projection matrix and the classification parameters are jointly optimized to supervise the discriminative subspace learning. In addition, LADSL exploits an adaptive local structure to preserve the nearest-neighbor relationships among data samples while learning projections that achieve superior classification performance. Experiments on various face and object recognition datasets, with comparisons against state-of-the-art methods, validate the effectiveness of the proposed LADSL method.
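The abstract describes LADSL only at a high level, but its three ingredients, a relaxed latent target space in place of strict binary labels, a jointly learned projection, and a locality-preserving neighbor graph, can be illustrated with a short sketch. The alternating solver below is a minimal illustration, not the paper's actual algorithm: the objective, the epsilon-dragging-style target relaxation, the heat-kernel k-NN graph, and the hyperparameters alpha, beta, and k are all assumptions made for the example.

```python
import numpy as np

def knn_laplacian(X, k=5):
    """Unnormalized Laplacian of a k-nearest-neighbor heat-kernel graph.

    Encodes the local neighbor structure the projection should preserve;
    LADSL's adaptive graph construction may differ (assumption).
    """
    n = X.shape[0]
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T      # pairwise squared distances
    np.fill_diagonal(d2, np.inf)                        # exclude self-neighbors
    idx = np.argsort(d2, axis=1)[:, :k]                 # k nearest neighbors per row
    sigma2 = np.mean(np.take_along_axis(d2, idx, axis=1))
    S = np.zeros((n, n))
    rows = np.repeat(np.arange(n), k)
    S[rows, idx.ravel()] = np.exp(-d2[rows, idx.ravel()] / sigma2)
    S = 0.5 * (S + S.T)                                 # symmetrize affinities
    return np.diag(S.sum(axis=1)) - S                   # L = D - S

def ladsl_sketch(X, y, alpha=0.1, beta=0.1, k=5, n_iter=10):
    """Alternating minimization of a locality-regularized regression objective,
        min_{W,T} ||XW - T||_F^2 + alpha*tr(W^T X^T L X W) + beta*||W||_F^2,
    where the latent targets T start from one-hot labels and are relaxed by an
    epsilon-dragging step (an assumption standing in for LADSL's latent space).
    """
    n, d = X.shape
    classes = np.unique(y)
    Y = (y[:, None] == classes[None, :]).astype(float)    # one-hot label matrix
    B = np.where(Y == 1.0, 1.0, -1.0)                     # allowed dragging directions
    L = knn_laplacian(X, k)
    A = X.T @ X + alpha * X.T @ L @ X + beta * np.eye(d)  # fixed d x d normal matrix
    T = Y.copy()
    for _ in range(n_iter):
        W = np.linalg.solve(A, X.T @ T)                   # closed-form projection update
        M = np.maximum(B * (X @ W - Y), 0.0)              # nonnegative label slack
        T = Y + B * M                                     # relaxed latent targets
    return W, classes

# Toy usage: project samples and predict the class with the largest response.
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 20))
y = np.repeat([0, 1, 2], 20)
X[y == 1] += 1.5
X[y == 2] -= 1.5
W, classes = ladsl_sketch(X, y)
pred = classes[np.argmax(X @ W, axis=1)]
print("training accuracy:", (pred == y).mean())
```

Both subproblems have closed forms here (a ridge-type linear solve for W and an element-wise clamp for T), so each iteration is cheap; the adaptive local structure described in the abstract would presumably also update the graph affinities during the iterations, which this sketch omits.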
ISSN: 0018-9456 (print), 1557-9662 (electronic)
DOI: 10.1109/TIM.2022.3187735