Rethinking Independent Cross-Entropy Loss For Graph-Structured Data
Saved in:
Main authors: | , , , , , |
---|---|
Format: | Article |
Language: | eng |
Keywords: | |
Online access: | Order full text |
Summary: | ICML 2024. Graph neural networks (GNNs) have exhibited prominent performance in learning
graph-structured data. For the node classification task, based on the i.i.d.
assumption among node labels, traditional supervised learning simply sums
up the cross-entropy losses of the independent training nodes and applies the
average loss to optimize the GNN's weights. But unlike other data formats,
the nodes are naturally connected. It is found that the independent
distribution modeling of node labels restricts GNNs' capability to generalize
over the entire graph and to defend against adversarial attacks. In this work, we propose
a new framework, termed joint-cluster supervised learning, to model the joint
distribution of each node with its corresponding cluster. We learn the joint
distribution of node and cluster labels conditioned on their representations,
and train GNNs with the obtained joint loss. In this way, the data-label
reference signals extracted from the local cluster explicitly strengthen the
discrimination ability on the target node. Extensive experiments
demonstrate that our joint-cluster supervised learning can effectively bolster
GNNs' node classification accuracy. Furthermore, benefiting from
reference signals that may be free from malicious interference, our learning
paradigm significantly protects node classification from being affected by
adversarial attacks. |
---|---|
DOI: | 10.48550/arxiv.2405.15564 |
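The joint loss described in the summary can be sketched roughly as follows. This is a minimal illustrative assumption, not the paper's exact formulation: it assumes per-class logits for a node and for its cluster, scores every (node class, cluster class) pair by an outer sum, and takes the negative log-likelihood of the observed label pair under a softmax over all pairs.

```python
import numpy as np

def joint_cluster_cross_entropy(node_logits, cluster_logits,
                                node_label, cluster_label):
    """Hypothetical sketch of a joint node-cluster loss.

    Instead of treating the node label as independent, every
    (node class, cluster class) pair is scored jointly, so the
    cluster's label acts as a reference signal for the node.
    """
    # Outer sum of logits gives a C x C score matrix over label pairs.
    joint_scores = node_logits[:, None] + cluster_logits[None, :]

    # Numerically stable log-softmax over all C*C joint outcomes.
    flat = joint_scores.ravel()
    shifted = flat - flat.max()
    log_probs = shifted - np.log(np.exp(shifted).sum())
    log_joint = log_probs.reshape(joint_scores.shape)

    # Negative log-likelihood of the observed (node, cluster) label pair.
    return -log_joint[node_label, cluster_label]
```

Averaging this quantity over the training nodes (each paired with its cluster) would replace the usual sum of independent cross-entropy terms when optimizing the GNN's weights.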