Adaptive Graph Completion Based Incomplete Multi-View Clustering


Bibliographic Details
Published in: IEEE Transactions on Multimedia, 2021, Vol. 23, pp. 2493-2504
Authors: Wen, Jie; Yan, Ke; Zhang, Zheng; Xu, Yong; Wang, Junqian; Fei, Lunke; Zhang, Bob
Format: Article
Language: English
Description
Abstract: In real-world applications, the collected multi-view data are often incomplete, i.e., some views of samples are absent. Existing clustering methods for incomplete multi-view data all focus on obtaining a common representation or graph from the available views, but neglect the hidden information of the missing views and the information imbalance among different views. To solve these problems, a novel method, called adaptive graph completion based incomplete multi-view clustering (AGC_IMC), is proposed in this paper. Specifically, AGC_IMC develops a joint framework for graph completion and consensus representation learning, which mainly contains three components, i.e., within-view preservation, between-view inferring, and consensus representation learning. To reduce the negative influence of information imbalance, AGC_IMC introduces adaptive weights to balance the importance of different views during consensus representation learning. Importantly, AGC_IMC has the potential to recover the similarity graphs of all views with the optimal cluster structure, which encourages it to obtain a more discriminative consensus representation. Experimental results on five well-known datasets show that AGC_IMC significantly outperforms the state-of-the-art methods.
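To make the adaptive-weighting idea from the abstract concrete, the following is a minimal, self-contained sketch of fusing per-view similarity graphs (with missing samples) into a weighted consensus graph, where each view's weight is adapted according to how well it agrees with the current consensus. This is an illustrative toy scheme only; the function name, the masking convention, and the inverse-residual weighting rule are assumptions for illustration, not the actual AGC_IMC optimization from the paper.

```python
import numpy as np

def consensus_graph(graphs, masks, n_iter=20):
    """Toy illustration of adaptive view weighting for incomplete
    multi-view similarity graphs (NOT the AGC_IMC algorithm itself).

    graphs: list of (n, n) similarity matrices, one per view.
    masks:  list of (n,) 0/1 vectors; 1 = sample observed in that view.
    Returns the fused consensus graph S and the learned view weights w.
    """
    V = len(graphs)
    n = graphs[0].shape[0]
    w = np.ones(V) / V                          # start with uniform view weights
    for _ in range(n_iter):
        # Consensus graph: weighted average over views, restricted to
        # entries where both samples are observed in that view.
        num = np.zeros((n, n))
        den = np.zeros((n, n))
        for g, m, wv in zip(graphs, masks, w):
            obs = np.outer(m, m)                # 1 where both samples present
            num += wv * obs * g
            den += wv * obs
        S = num / np.maximum(den, 1e-12)
        # Adaptive weights: views agreeing with the consensus get more weight.
        err = np.array([np.linalg.norm(np.outer(m, m) * (g - S))
                        for g, m in zip(graphs, masks)])
        w = 1.0 / (err + 1e-12)
        w /= w.sum()
    return S, w
```

With two clean views and one noisy view, the alternating updates drive the noisy view's weight down, so the consensus is dominated by the views that agree with each other — a simplified analogue of how adaptive weights counteract information imbalance across views.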
ISSN:1520-9210
1941-0077
DOI:10.1109/TMM.2020.3013408