Drug–target interaction predication via multi-channel graph neural networks
Published in: Briefings in Bioinformatics, 2022-01, Vol. 23 (1)
Format: Article
Language: English
Online access: Order full text
Abstract:
Drug–target interaction (DTI) prediction is an important step in drug discovery. Although many methods exist for predicting drug targets, they are limited by their reliance on discrete or manually engineered feature representations. In recent years, deep learning methods have been applied to DTI prediction to address these limitations. However, most existing deep learning methods do not fuse topological structure and semantic information when learning representations of drug–protein pairs (DPPs). Moreover, when learning DPP node representations in the DPP network, the differing influences of neighboring nodes are ignored. In this paper, a new model, DTI-MGNN, based on multi-channel graph convolutional networks and graph attention, is proposed for DTI prediction. Two independent graph attention networks learn the interactions between nodes, with different strengths, for the topology graph and the feature graph, while a graph convolutional network with shared weight matrices learns the information common to the two graphs. By combining topological structure and semantic features, DTI-MGNN improves the representation learning ability for DPPs and obtains state-of-the-art results on public datasets. Specifically, DTI-MGNN achieves high accuracy in identifying DTIs (area under the receiver operating characteristic curve of 0.9665).
ISSN: 1467-5463, 1477-4054
DOI: 10.1093/bib/bbab346
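The abstract describes a three-channel encoder: one graph attention network for the topology graph, one for the feature graph, and a graph convolutional network with shared weights that captures information common to both, with the channels combined into a DPP representation. The sketch below is a minimal illustration of that idea using PyTorch Geometric; the layer sizes, the averaging of the shared-GCN outputs, the concatenation-based fusion, and the classifier head are illustrative assumptions, not the authors' published implementation.

```python
# Minimal multi-channel GNN sketch in the spirit of DTI-MGNN (assumptions noted above).
import torch
import torch.nn as nn
from torch_geometric.nn import GATConv, GCNConv


class MultiChannelDPPEncoder(nn.Module):
    """Encodes drug-protein pair (DPP) nodes from a topology graph and a feature graph."""

    def __init__(self, in_dim: int, hidden_dim: int):
        super().__init__()
        # Channel 1: graph attention on the topology graph (structure-specific information).
        self.gat_topo = GATConv(in_dim, hidden_dim, heads=1)
        # Channel 2: graph attention on the feature (semantic) graph.
        self.gat_feat = GATConv(in_dim, hidden_dim, heads=1)
        # Channel 3: a single GCN with shared weights applied to BOTH graphs,
        # capturing information common to topology and semantics.
        self.gcn_shared = GCNConv(in_dim, hidden_dim)
        # Classifier head over the fused DPP representation (illustrative assumption).
        self.classifier = nn.Sequential(
            nn.Linear(3 * hidden_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, 1),
        )

    def forward(self, x, edge_index_topo, edge_index_feat):
        # Graph-specific channels learned by independent attention networks.
        h_topo = torch.relu(self.gat_topo(x, edge_index_topo))
        h_feat = torch.relu(self.gat_feat(x, edge_index_feat))
        # Shared-weight GCN run on both graphs, averaged as the common channel.
        h_common = 0.5 * (
            torch.relu(self.gcn_shared(x, edge_index_topo))
            + torch.relu(self.gcn_shared(x, edge_index_feat))
        )
        # Fuse the three channels and predict an interaction probability per DPP node.
        h = torch.cat([h_topo, h_feat, h_common], dim=-1)
        return torch.sigmoid(self.classifier(h)).squeeze(-1)
```

As a usage sketch, `x` would hold initial DPP node features while `edge_index_topo` and `edge_index_feat` are the edge lists of the topology and feature graphs; how those graphs are constructed and how the channels are weighted against each other follow the paper, not this simplified example.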