Local2Global: A distributed approach for scaling representation learning on graphs
Saved in:
Main authors: , , , ,
Format: Article
Language: eng
Subjects:
Online access: Order full text
Summary: We propose a decentralised "local2global" approach to graph
representation learning that can be used a priori to scale any embedding
technique. Our local2global approach proceeds by first dividing the input
graph into overlapping subgraphs (or "patches") and training local
representations for each patch independently. In a second step, we combine
the local representations into a globally consistent representation by
estimating the set of rigid motions that best align the local
representations using information from the patch overlaps, via group
synchronization. A key distinguishing feature of local2global relative to
existing work is that patches are trained independently, without the need
for the often costly parameter synchronization during distributed training.
This allows local2global to scale to large-scale industrial applications,
where the input graph may not even fit into memory and may be stored in a
distributed manner. We apply local2global to data sets of different sizes
and show that our approach achieves a good trade-off between scale and
accuracy on edge reconstruction and semi-supervised classification. We also
consider the downstream task of anomaly detection and show how one can use
local2global to highlight anomalies in cybersecurity networks.
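The core of the second step described in the abstract, estimating the rigid motion that best aligns two patch embeddings on their overlapping nodes, can be sketched with orthogonal Procrustes analysis. This is a minimal illustration, not the authors' implementation: the helper name `align_patch` is hypothetical, the two embedding matrices are assumed to list the overlap nodes in the same row order, and the full method additionally synchronizes these pairwise motions across all patches via group synchronization, which this sketch omits.

```python
import numpy as np

def align_patch(X_src, X_ref):
    """Estimate the rigid motion (orthogonal transform + translation)
    that best aligns X_src to X_ref on their shared overlap nodes,
    via orthogonal Procrustes. Rows are embeddings of the overlap
    nodes, in the same order in both patches."""
    mu_src = X_src.mean(axis=0)
    mu_ref = X_ref.mean(axis=0)
    # Cross-covariance of the centred overlap embeddings.
    M = (X_src - mu_src).T @ (X_ref - mu_ref)
    U, _, Vt = np.linalg.svd(M)
    R = U @ Vt                    # optimal orthogonal alignment
    t = mu_ref - mu_src @ R       # translation applied after R
    return R, t

# Toy check: a patch embedding that was rotated and shifted
# relative to the reference should be mapped back onto it.
rng = np.random.default_rng(0)
X_ref = rng.normal(size=(10, 2))
theta = 0.7
R_true = np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])
X_src = X_ref @ R_true.T + np.array([3.0, -1.0])

R, t = align_patch(X_src, X_ref)
assert np.allclose(X_src @ R + t, X_ref, atol=1e-8)
```

In the paper's setting, such pairwise alignments are computed for every pair of overlapping patches and then reconciled globally, so that each patch's local embedding is mapped into one consistent coordinate system.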
DOI: 10.48550/arxiv.2201.04729