Multi-View Attributed Graph Clustering

Full Description

Bibliographic Details
Published in: IEEE Transactions on Knowledge and Data Engineering, 2023-02, Vol. 35 (2), p. 1872-1880
Main Authors: Lin, Zhiping; Kang, Zhao; Zhang, Lizong; Tian, Ling
Format: Article
Language: English
Description
Abstract: Multi-view graph clustering has been intensively investigated in recent years. However, existing methods are still limited in two main aspects. On the one hand, most of them cannot handle data that have both node attributes and graphs. Nowadays, multi-view attributed graph data are ubiquitous, and the need for effective clustering methods is growing. On the other hand, many state-of-the-art algorithms are either shallow or deep models. Shallow methods may seriously restrict the capacity for modeling complex data, while deep approaches often involve a large number of parameters and are expensive to train in terms of running time and memory. In this paper, we propose a novel multi-view attributed graph clustering (MAGC) framework, which exploits both node attributes and graphs. Our novelty lies in three aspects. First, instead of deep neural networks, we apply a graph filtering technique to obtain a smooth node representation. Second, the original graph can be noisy or incomplete and is not directly applicable, so we learn a consensus graph from the data by considering the heterogeneous views. Third, high-order relations are explored in a flexible way by designing a new regularizer. Extensive experiments demonstrate the superiority of our method in terms of effectiveness and efficiency.
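To illustrate the graph-filtering idea mentioned in the abstract, the minimal sketch below smooths node attributes with a k-th order low-pass filter built from the symmetrically normalized graph Laplacian. The filter form (I - 0.5·L_s)^k, the function name low_pass_filter, and the toy data are assumptions for exposition only; the paper's exact filter and its multi-view consensus-graph learning step may differ.

```python
import numpy as np

def low_pass_filter(adjacency: np.ndarray, features: np.ndarray, k: int = 2) -> np.ndarray:
    """Smooth node features with a k-th order low-pass graph filter.

    Illustrative sketch: uses the filter (I - 0.5 * L_s)^k, where L_s is the
    symmetrically normalized Laplacian of the graph with self-loops added.
    """
    n = adjacency.shape[0]
    a_hat = adjacency + np.eye(n)                              # add self-loops
    d_inv_sqrt = np.diag(1.0 / np.sqrt(a_hat.sum(axis=1)))    # D^{-1/2}
    laplacian = np.eye(n) - d_inv_sqrt @ a_hat @ d_inv_sqrt   # normalized Laplacian
    filter_mat = np.eye(n) - 0.5 * laplacian                  # low-pass filter
    smoothed = features.copy()
    for _ in range(k):                                         # apply filter k times
        smoothed = filter_mat @ smoothed
    return smoothed

# Toy usage: 4 nodes with 3-dimensional attributes (hypothetical data).
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = np.random.rand(4, 3)
H = low_pass_filter(A, X, k=2)   # smoothed representation, e.g. as input to clustering
```

In a multi-view setting, such a filter would typically be applied per view before the views are fused; the fusion and consensus-graph learning steps are not shown here.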
ISSN: 1041-4347, 1558-2191
DOI: 10.1109/TKDE.2021.3101227