Geometric Transformer for End-to-End Molecule Properties Prediction
Format: Article
Language: English
Abstract: Transformers have become the methods of choice in many applications thanks to their ability to represent complex interactions between elements. However, extending the Transformer architecture to non-sequential data such as molecules, and enabling its training on small datasets, remains a challenge. In this work, we introduce a Transformer-based architecture for molecule property prediction that is able to capture the geometry of the molecule. We replace the classical positional encoder with an initial encoding of the molecule geometry, together with a learned gated self-attention mechanism. We further propose an augmentation scheme for molecular data that avoids the overfitting induced by the overparameterized architecture. The proposed framework outperforms state-of-the-art methods while relying on machine learning alone, i.e. it incorporates no domain knowledge from quantum chemistry and uses no extended geometric inputs besides the pairwise atomic distances.
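
The abstract does not spell out how the pairwise distances enter the attention computation. As a rough illustration of one plausible reading, the sketch below (PyTorch) gates standard multi-head attention weights with a learned function of the distance matrix. The class and layer names (`GatedGeometricAttention`, `dist_mlp`) and the sigmoid-gate formulation are assumptions made for illustration, not the authors' implementation.

```python
# Hypothetical sketch of distance-gated self-attention; names and the
# gating formulation are illustrative assumptions, not the paper's code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class GatedGeometricAttention(nn.Module):
    """Multi-head self-attention gated by pairwise atomic distances."""

    def __init__(self, dim: int, num_heads: int = 4):
        super().__init__()
        assert dim % num_heads == 0
        self.num_heads = num_heads
        self.head_dim = dim // num_heads
        self.qkv = nn.Linear(dim, 3 * dim)
        self.proj = nn.Linear(dim, dim)
        # Assumption: a small MLP maps each raw pairwise distance to one
        # gating logit per attention head.
        self.dist_mlp = nn.Sequential(
            nn.Linear(1, 32), nn.ReLU(), nn.Linear(32, num_heads)
        )

    def forward(self, x: torch.Tensor, dist: torch.Tensor) -> torch.Tensor:
        # x: (batch, atoms, dim); dist: (batch, atoms, atoms) distances
        b, n, d = x.shape
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        q = q.view(b, n, self.num_heads, self.head_dim).transpose(1, 2)
        k = k.view(b, n, self.num_heads, self.head_dim).transpose(1, 2)
        v = v.view(b, n, self.num_heads, self.head_dim).transpose(1, 2)
        scores = q @ k.transpose(-2, -1) / self.head_dim ** 0.5  # (b,h,n,n)
        # Learned gate in [0, 1] from the distance matrix, one per head.
        gate = torch.sigmoid(self.dist_mlp(dist.unsqueeze(-1)))  # (b,n,n,h)
        gate = gate.permute(0, 3, 1, 2)                          # (b,h,n,n)
        attn = F.softmax(scores, dim=-1) * gate
        # Renormalize so the gated weights still sum to 1 per query atom.
        attn = attn / attn.sum(dim=-1, keepdim=True).clamp_min(1e-9)
        out = (attn @ v).transpose(1, 2).reshape(b, n, d)
        return self.proj(out)

# Usage: distances derived from (hypothetical) 3D coordinates.
layer = GatedGeometricAttention(dim=64, num_heads=4)
x = torch.randn(2, 9, 64)           # 2 molecules, 9 atoms, 64-dim features
coords = torch.randn(2, 9, 3)
dist = torch.cdist(coords, coords)  # pairwise atomic distances
out = layer(x, dist)                # (2, 9, 64)
```

Multiplying the attention weights by a gate, rather than adding a distance bias to the logits, lets the learned term fully suppress interactions between distant atoms; the mechanism in the paper itself may differ.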
DOI: 10.48550/arxiv.2110.13721