DropConn: Dropout Connection Based Random GNNs for Molecular Property Prediction
Published in: IEEE Transactions on Knowledge and Data Engineering, 2024-02, Vol. 36 (2), pp. 1-13
Format: Article
Language: English
Abstract: Recently, molecular data mining has attracted considerable attention owing to its great application potential in material and drug discovery. However, this mining task faces a challenge posed by the scarcity of labeled molecular graphs. To overcome this challenge, we introduce a novel data augmentation and a semi-supervised, confidence-aware consistency regularization training framework for molecular property prediction. The core of our framework is a data augmentation strategy on molecular graphs, named DropConn (Dropout Connection). DropConn generates pseudo molecular graphs by softening the hard connections of chemical bonds (as edges), where the soft weights are calculated from edge features so that the adaptive interactions between different atoms can be incorporated. In addition, to enhance the model's generalization ability, a consistency regularization training strategy is proposed to take full advantage of massive unlabeled data. Furthermore, DropConn can serve as a plugin that can be seamlessly added to many existing models. Extensive experiments under both the non-pre-training and the fine-tuning settings demonstrate that DropConn obtains superior performance (up to 8.22%) over state-of-the-art methods on molecular property prediction tasks. The code is available at https://github.com/THUDM/DropConn.
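The abstract describes the core DropConn operation as replacing hard (0/1) bond connections with soft weights computed from edge features. The paper's exact scoring function is not given in this record, so the following is only a minimal sketch of that idea: a hypothetical `soften_connections` helper that scores each bond with a sigmoid over its edge features and masks non-edges; the parameter names and the toy molecule are assumptions, not the authors' implementation.

```python
import numpy as np

def soften_connections(adj, edge_feats, w, b=0.0):
    """Replace hard (0/1) bond connections with adaptive soft weights.

    adj        : (N, N) binary adjacency matrix of the molecular graph
    edge_feats : (N, N, F) per-edge feature tensor (e.g. bond-type encodings)
    w, b       : parameters of a hypothetical linear edge-scoring function
    """
    # Score every potential edge from its features and squash to (0, 1).
    scores = 1.0 / (1.0 + np.exp(-(edge_feats @ w + b)))
    # Keep only existing bonds: non-edges stay at weight 0.
    return adj * scores

# Toy molecule: 3 atoms in a chain (2 bonds), 2-dimensional edge features.
adj = np.array([[0, 1, 0],
                [1, 0, 1],
                [0, 1, 0]], dtype=float)
feats = np.random.default_rng(0).normal(size=(3, 3, 2))
soft_adj = soften_connections(adj, feats, w=np.array([0.5, -0.3]))
```

A GNN message-passing layer could then aggregate over `soft_adj` instead of `adj`, so that each pseudo graph weights atom interactions adaptively rather than dropping bonds outright.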
ISSN: 1041-4347, 1558-2191
DOI: 10.1109/TKDE.2023.3290032