Spatial-Temporal Graph Learning with Adversarial Contrastive Adaptation
Format: Article
Language: English
Abstract: Spatial-temporal graph learning has emerged as a promising solution for
modeling structured spatial-temporal data and learning region representations
for various urban sensing tasks, such as crime forecasting and traffic flow
prediction. However, most existing models are vulnerable to the quality of the
generated region graph, since an inaccurate graph structure corrupts the
information aggregated along it. The noise and incompleteness that pervade
real-life spatial-temporal data make it challenging to generate high-quality
region representations. To address this challenge, we propose GraphST, a new
spatial-temporal graph learning model that enables effective self-supervised
learning. GraphST follows an adversarial contrastive learning paradigm that
automates the distillation of crucial multi-view self-supervised information
for robust spatial-temporal graph augmentation. We empower GraphST to
adaptively identify hard samples for better self-supervision, enhancing the
discrimination ability and robustness of the learned representations. In
addition, we introduce a cross-view contrastive learning paradigm to model the
inter-dependencies across view-specific region representations and to preserve
the underlying relation heterogeneity. We demonstrate the superiority of
GraphST on various spatial-temporal prediction tasks with real-life datasets.
Our model implementation is available at https://github.com/HKUDS/GraphST.
DOI: 10.48550/arxiv.2306.10683
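
The abstract describes a cross-view contrastive objective with adaptive hard-sample mining but does not spell out the loss. As a rough, minimal PyTorch sketch of what such an objective can look like, the snippet below contrasts region embeddings across two views with an InfoNCE-style loss and up-weights hard negatives. The function name, the `beta` weighting scheme, and the view names are illustrative assumptions, not the released GraphST code; see the repository linked above for the actual implementation.

```python
import torch
import torch.nn.functional as F

def cross_view_contrastive_loss(z_a, z_b, temperature=0.2, beta=1.0):
    """Hypothetical sketch, not the GraphST implementation.

    Contrast each region across two views: the same region in the other
    view is the positive, all other regions are negatives, and negatives
    the encoders currently confuse with the anchor (high similarity) get
    larger weights, i.e. a simple form of hard-sample emphasis.

    z_a, z_b : (num_regions, dim) view-specific region embeddings.
    beta     : assumed hardness coefficient (> 0); larger values focus
               the loss more sharply on the hardest negatives.
    """
    z_a = F.normalize(z_a, dim=-1)
    z_b = F.normalize(z_b, dim=-1)
    sim = z_a @ z_b.t() / temperature        # (N, N) cross-view similarities
    n = sim.size(0)

    pos = sim.diagonal()                     # positive pairs: matching regions
    diag = torch.eye(n, dtype=torch.bool, device=sim.device)
    neg = sim.masked_fill(diag, float('-inf'))  # exclude positives

    # Hardness weights: softmax over each row's negatives, rescaled so the
    # average weight is 1. Treated as constants (no_grad) so gradients flow
    # through the similarities, not through the weighting itself.
    with torch.no_grad():
        w = torch.softmax(beta * neg, dim=1) * (n - 1)

    denom = pos.exp() + (w * neg.exp()).sum(dim=1)
    return -(pos - denom.log()).mean()


# Toy usage: 64 regions with 32-d embeddings from two hypothetical
# view encoders (e.g. a POI view and a human-mobility view).
z_poi = torch.randn(64, 32, requires_grad=True)
z_flow = torch.randn(64, 32, requires_grad=True)
loss = cross_view_contrastive_loss(z_poi, z_flow)
loss.backward()
```

Computing the hardness weights under `no_grad` is a common design choice that keeps the encoder from trivially gaming the weighting; the paper's adversarial augmentation presumably produces the views and hard samples in a more elaborate way than this fixed reweighting.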