Towards Few-shot Self-explaining Graph Neural Networks
Saved in:

Main Authors: , , , , ,
Format: Article
Language: English
Subjects:
Online Access: Order full text
Abstract: Recent advancements in Graph Neural Networks (GNNs) have spurred an upsurge of research dedicated to enhancing the explainability of GNNs, particularly in critical domains such as medicine. A promising approach is the self-explaining method, which outputs explanations along with predictions. However, existing self-explaining models require a large amount of training data, rendering them impractical in few-shot scenarios. To address this challenge, in this paper we propose a Meta-learned Self-Explaining GNN (MSE-GNN), a novel framework that generates explanations to support predictions in few-shot settings. MSE-GNN adopts a two-stage self-explaining structure consisting of an explainer and a predictor. Specifically, the explainer first imitates the human attention mechanism to select an explanation subgraph, so that attention is naturally paid to regions containing important characteristics. The predictor then mimics the decision-making process, making predictions based on the generated explanation. Moreover, with a novel meta-training process and a mechanism designed to exploit task information, MSE-GNN achieves remarkable performance on new few-shot tasks. Extensive experimental results on four datasets demonstrate that MSE-GNN achieves superior performance on prediction tasks while generating higher-quality explanations than existing methods. The code is publicly available at https://github.com/jypeng28/MSE-GNN.
DOI: 10.48550/arxiv.2408.07340
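For readers who want a concrete picture of the two-stage self-explaining structure described in the abstract, the sketch below shows one way an explainer/predictor pipeline could be wired together with PyTorch Geometric. It is a minimal illustration under stated assumptions, not the authors' MSE-GNN implementation (see the linked repository): the module names, the soft node-masking strategy, and the omission of the meta-training loop are all simplifications made for brevity.

```python
# Minimal sketch of a two-stage self-explaining GNN (explainer -> subgraph -> predictor).
# All names, dimensions, and the soft-masking scheme are illustrative assumptions;
# the actual MSE-GNN code lives at https://github.com/jypeng28/MSE-GNN.
import torch
import torch.nn as nn
from torch_geometric.nn import GCNConv, global_mean_pool


class Explainer(nn.Module):
    """Scores each node; high scores mark the explanation subgraph."""

    def __init__(self, in_dim: int, hid_dim: int):
        super().__init__()
        self.conv = GCNConv(in_dim, hid_dim)
        self.scorer = nn.Linear(hid_dim, 1)

    def forward(self, x, edge_index):
        h = torch.relu(self.conv(x, edge_index))
        # Soft node mask in (0, 1); thresholding it would yield a discrete subgraph.
        return torch.sigmoid(self.scorer(h))


class Predictor(nn.Module):
    """Classifies the graph from the masked (explanation) node features."""

    def __init__(self, in_dim: int, hid_dim: int, num_classes: int):
        super().__init__()
        self.conv = GCNConv(in_dim, hid_dim)
        self.head = nn.Linear(hid_dim, num_classes)

    def forward(self, x, edge_index, batch):
        h = torch.relu(self.conv(x, edge_index))
        return self.head(global_mean_pool(h, batch))


class SelfExplainingGNN(nn.Module):
    """Two-stage pipeline: explanation first, prediction from the explanation."""

    def __init__(self, in_dim: int, hid_dim: int, num_classes: int):
        super().__init__()
        self.explainer = Explainer(in_dim, hid_dim)
        self.predictor = Predictor(in_dim, hid_dim, num_classes)

    def forward(self, x, edge_index, batch):
        mask = self.explainer(x, edge_index)                    # stage 1: explanation
        logits = self.predictor(mask * x, edge_index, batch)    # stage 2: prediction
        return logits, mask
```

In a few-shot setting, a model of this shape would typically be wrapped in an episodic meta-training loop (e.g., MAML-style inner/outer updates over tasks); that loop, and the task-information mechanism mentioned in the abstract, are deliberately left out of this sketch.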