Unsupervised domain adaptation based on adaptive local manifold learning
Published in: Computers & Electrical Engineering, 2022-05, Vol. 100, p. 107941, Article 107941
Main authors: , , , ,
Format: Article
Language: English
Subjects:
Online access: Full text
Summary: Transfer learning is known to be an effective method for dealing with domain shift. When the same task is shared across different domains, it is usually called domain adaptation. The problem of distribution differences is inevitable in domain adaptation work. Subspace methods can transform the data into a new feature representation, which helps to reduce these distribution differences, and many researchers have explored subspace learning extensively for domain adaptation. The weakness of many existing subspace-based domain adaptation methods is that they either ignore local manifold information or suffer from parameter selection in the local manifold regularization term, which may limit the effectiveness of cross-domain image classification. Therefore, a novel transfer learning method termed unsupervised domain adaptation based on adaptive local manifold learning (UDA-ALML) is proposed in this paper, mainly for cross-domain image classification. To preserve the structural information of the original data, the proposed method combines sparse representation, manifold learning, and low-rank representation to learn the transformation matrix. Specifically, the weight matrix in the traditional local manifold regularization term is replaced by the reconstruction coefficient matrix. Extensive experiments show that the method performs remarkably well in cross-domain image recognition.
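The key idea in the summary can be sketched as follows; this is a reconstruction from the abstract only, and the symbols $A$, $X$, $W$, $Z$, $\lambda$ and the exact objective are assumptions rather than the paper's own notation. A traditional local manifold regularizer penalizes

$$\sum_{i,j} W_{ij}\,\bigl\lVert A^{\top}x_i - A^{\top}x_j\bigr\rVert_2^2 \;=\; \operatorname{tr}\!\bigl(A^{\top} X L X^{\top} A\bigr), \qquad L = D - W,$$

where $A$ is the learned transformation matrix, $X$ the data matrix, $W$ a nearest-neighbor affinity matrix (whose neighborhood size and kernel width are exactly the parameters that must be selected), $D$ its degree matrix, and $L$ the graph Laplacian. An adaptive variant of the kind described here would instead obtain a reconstruction coefficient matrix $Z$ from a sparse, low-rank self-representation of the data, for example

$$\min_{Z}\ \lVert Z\rVert_{*} + \lambda\lVert Z\rVert_{1} \quad \text{s.t.}\quad X = XZ,$$

and use $Z$ (or a symmetrized $(|Z| + |Z^{\top}|)/2$) in place of the hand-tuned $W$, so the graph weights are learned from the data rather than fixed by a chosen neighborhood parameter.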
ISSN: 0045-7906, 1879-0755
DOI: 10.1016/j.compeleceng.2022.107941