Predicting Chemical Properties using Self-Attention Multi-task Learning based on SMILES Representation
Saved in:
Main authors: | , |
---|---|
Format: | Article |
Language: | eng |
Abstract: | In the computational prediction of chemical compound properties, molecular
descriptors and fingerprints encoded into low-dimensional vectors are used. The
selection of proper molecular descriptors and fingerprints is both important
and challenging, as the performance of such models depends heavily on the chosen
descriptors. To overcome this challenge, natural language processing models that
take the simplified molecular-input line-entry system (SMILES) as input have been
studied, and several transformer-variant models achieved superior results compared
with conventional methods. In this study, we explore the structural differences
among transformer-variant models and propose a new self-attention-based model. The
representation learning performance of the self-attention module was evaluated in
a multi-task learning setting on imbalanced chemical datasets. The experimental
results show that our model achieves competitive performance on several benchmark
datasets. The source code of our experiment is available at
https://github.com/arwhirang/sa-mtl, and the dataset is available from the same
URL. |
DOI: | 10.48550/arxiv.2010.11272 |
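The abstract describes a shared self-attention encoder over SMILES strings with multiple task-specific outputs. The following is a minimal, hypothetical PyTorch sketch of that general idea (character-level SMILES encoding, a shared self-attention encoder, and one head per task); the class names, character set, and hyperparameters are illustrative assumptions, not the authors' SA-MTL implementation, which is available in the linked repository.

```python
# Hypothetical sketch of a self-attention multi-task model over SMILES.
# This is NOT the SA-MTL code from the paper; see the GitHub repository above.
import torch
import torch.nn as nn

# Illustrative character vocabulary for SMILES strings (assumption).
CHARSET = list("#%()+-./0123456789=@ABCDEFGHIKLMNOPRSTVXYZ[\\]abcdefgilmnoprstu")
PAD, UNK = 0, 1
CHAR2IDX = {c: i + 2 for i, c in enumerate(CHARSET)}  # 0 = padding, 1 = unknown

def encode_smiles(smiles: str, max_len: int = 128) -> torch.Tensor:
    """Map a SMILES string to a fixed-length tensor of character indices."""
    ids = [CHAR2IDX.get(c, UNK) for c in smiles[:max_len]]
    ids += [PAD] * (max_len - len(ids))
    return torch.tensor(ids, dtype=torch.long)

class SelfAttentionMultiTask(nn.Module):
    """Shared self-attention encoder with one classification head per task."""

    def __init__(self, vocab_size: int, num_tasks: int,
                 d_model: int = 128, n_heads: int = 4, n_layers: int = 2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model, padding_idx=PAD)
        layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads,
            dim_feedforward=4 * d_model, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        # One binary-classification head per property-prediction task.
        self.heads = nn.ModuleList(
            [nn.Linear(d_model, 1) for _ in range(num_tasks)])

    def forward(self, token_ids: torch.Tensor) -> torch.Tensor:
        mask = token_ids.eq(PAD)                                  # (batch, seq)
        h = self.encoder(self.embed(token_ids), src_key_padding_mask=mask)
        h = h.masked_fill(mask.unsqueeze(-1), 0.0)
        # Mean-pool over non-padding positions, then score every task head.
        pooled = h.sum(dim=1) / (~mask).sum(dim=1, keepdim=True)
        return torch.cat([head(pooled) for head in self.heads], dim=-1)

# Example: score one molecule (caffeine) against 12 hypothetical tasks.
model = SelfAttentionMultiTask(vocab_size=len(CHARSET) + 2, num_tasks=12)
x = encode_smiles("Cn1cnc2c1c(=O)n(C)c(=O)n2C").unsqueeze(0)
logits = model(x)  # shape (1, 12): one logit per task
```

In a multi-task setting such as the one evaluated in the abstract, each head would typically be trained with a per-task loss that skips molecules lacking a label for that task, which is one common way to handle the label imbalance mentioned above.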