Graph structure learning layer and its graph convolution clustering application
Published in: Neural Networks, 2023-08, Vol. 165, pp. 1010–1020
Main authors: , , , , , ,
Format: Article
Language: English
Subjects:
Online access: Full text
Abstract: To learn embedding representations of graph-structured data corrupted by noise and outliers, existing graph structure learning networks usually follow a two-step paradigm: constructing a "good" graph structure and then performing message passing for signals supported on the learned graph. However, data corrupted by noise may make the learned graph structure unreliable. In this paper, we propose an adaptive graph convolutional clustering network that alternately adjusts the graph structure and the node representations layer by layer with back-propagation. Specifically, we design a Graph Structure Learning layer before each Graph Convolutional layer to learn a sparse graph structure from the node representations, where the graph structure is implicitly determined by the solution to an optimal self-expression problem. This is one of the first works to use an optimization process as a graph network layer, in contrast to the fixed function operations of traditional deep learning layers. An efficient iterative optimization algorithm is given to solve the optimal self-expression problem in the Graph Structure Learning layer. Experimental results show that the proposed method can effectively defend against the negative effects of inaccurate graph structures. The code is available at https://github.com/HeXiax/SSGNN.
ISSN: 0893-6080, 1879-2782
DOI: 10.1016/j.neunet.2023.06.024
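
The abstract describes a Graph Structure Learning layer that derives a sparse graph from the current node representations by solving a self-expression problem with an iterative algorithm, and a Graph Convolutional layer that then propagates features over that learned graph. The following is a minimal sketch of that idea, assuming a PyTorch setting; the module names, the ISTA-style update, and all hyperparameters are illustrative assumptions, not the authors' SSGNN implementation (see the linked repository for the official code).

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


def soft_threshold(x, tau):
    """Elementwise soft-thresholding, the proximal operator of the l1 norm."""
    return torch.sign(x) * torch.clamp(torch.abs(x) - tau, min=0.0)


class SelfExpressionGSL(nn.Module):
    """Illustrative graph structure learning layer (assumption, not the paper's exact solver).

    Approximately solves min_C 0.5 * ||Z - C Z||_F^2 + lam * ||C||_1 with a zero
    diagonal via a few ISTA iterations, then returns a symmetric normalized adjacency.
    """

    def __init__(self, lam=0.1, n_iters=10):
        super().__init__()
        self.lam = lam
        self.n_iters = n_iters

    def forward(self, z):
        n = z.size(0)
        gram = z @ z.t()                                   # Z Z^T, reused in the gradient
        step = 1.0 / (torch.linalg.norm(gram, ord=2) + 1e-8)  # 1 / Lipschitz constant
        c = torch.zeros(n, n, device=z.device, dtype=z.dtype)
        eye = torch.eye(n, device=z.device, dtype=z.dtype)
        for _ in range(self.n_iters):
            grad = c @ gram - gram                         # gradient of 0.5 * ||Z - C Z||_F^2
            c = soft_threshold(c - step * grad, step * self.lam)
            c = c * (1.0 - eye)                            # keep the diagonal at zero
        a = 0.5 * (c.abs() + c.abs().t())                  # symmetric affinity matrix
        a = a + eye                                        # add self-loops before normalization
        d_inv_sqrt = a.sum(dim=1).clamp(min=1e-8).pow(-0.5)
        return d_inv_sqrt[:, None] * a * d_inv_sqrt[None, :]  # D^{-1/2} A D^{-1/2}


class GSLConvBlock(nn.Module):
    """One 'learn the graph, then convolve' block, as the two-step idea in the abstract."""

    def __init__(self, in_dim, out_dim, lam=0.1, n_iters=10):
        super().__init__()
        self.gsl = SelfExpressionGSL(lam=lam, n_iters=n_iters)
        self.lin = nn.Linear(in_dim, out_dim, bias=False)

    def forward(self, z):
        a_hat = self.gsl(z)                # graph implied by the self-expression solution
        return F.relu(a_hat @ self.lin(z)) # standard GCN propagation on the learned graph


if __name__ == "__main__":
    x = torch.randn(50, 32)                # 50 nodes with 32-dimensional features
    block = GSLConvBlock(32, 16)
    print(block(x).shape)                  # torch.Size([50, 16])
```

Because the inner ISTA updates are differentiable, back-propagation can flow through the learned graph into the preceding layers, which matches the abstract's point of alternately adjusting graph structure and node representations layer by layer.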