Drug-Target Interaction Prediction Using Multi-Head Self-Attention and Graph Attention Network

Cited by: 209
Authors
Cheng, Zhongjian [1 ]
Yan, Cheng [1 ,2 ]
Wu, Fang-Xiang [3 ]
Wang, Jianxin [1 ]
Affiliations
[1] Cent South Univ, Sch Comp Sci & Engn, Hunan Prov Key Lab Bioinformat, Changsha 410083, Peoples R China
[2] Qiannan Normal Univ Nationalities, Sch Comp & Informat, Duyun 558000, Guizhou, Peoples R China
[3] Univ Saskatchewan, Dept Mech Engn, Div Biomed Engn, Saskatoon, SK S7N 5A9, Canada
Funding
National Natural Science Foundation of China;
Keywords
Proteins; Drugs; Predictive models; Amino acids; Feature extraction; Compounds; Biological system modeling; Drug-target interactions; multi-head self-attention; graph attention network;
DOI
10.1109/TCBB.2021.3077905
Chinese Library Classification (CLC)
Q5 [Biochemistry];
Discipline codes
071010 ; 081704 ;
Abstract
Identifying drug-target interactions (DTIs) is an important step in new drug discovery and drug repositioning. Accurate DTI predictions can improve the efficiency of drug discovery and development. Although rapid advances in deep learning have generated various computational methods, it remains worthwhile to investigate how to design efficient networks for predicting DTIs. In this study, we propose an end-to-end deep learning method (called MHSADTI) to predict DTIs based on the graph attention network and the multi-head self-attention mechanism. First, the characteristics of drugs and proteins are extracted by the graph attention network and the multi-head self-attention mechanism, respectively. Then, attention scores are used to identify which amino acid subsequences in a protein are more important to the drug when predicting their interaction. Finally, after obtaining the feature vectors of drugs and proteins, we predict DTIs with a fully connected layer. MHSADTI takes advantage of the self-attention mechanism to capture long-range contextual relationships in amino acid sequences and to make DTI predictions interpretable. More effective molecular characteristics are also obtained by the attention mechanism in graph attention networks. Multiple cross-validation experiments are conducted to assess the performance of MHSADTI. Experiments on four datasets (human, C. elegans, DUD-E and DrugBank) show that our method outperforms state-of-the-art methods in terms of AUC, precision, recall, AUPR and F1-score. In addition, case studies further demonstrate that our method can provide effective visualizations that interpret the prediction results with biological insight.
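The pipeline described in the abstract (a graph attention network over the drug's molecular graph, multi-head self-attention over the protein's amino-acid sequence, then a fully connected prediction layer on the concatenated feature vectors) can be sketched as follows. This is a minimal NumPy illustration with random toy data and hypothetical dimensions, not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def graph_attention_layer(H, A, W, a):
    """One GAT layer: node features H (n, d_in), adjacency A (n, n),
    weight W (d_in, d_out), attention vector a (2*d_out,)."""
    Wh = H @ W                                   # projected node features (n, d_out)
    d_out = Wh.shape[1]
    # pairwise logits e_ij = LeakyReLU(a^T [Wh_i || Wh_j]), split into two dot products
    e = (Wh @ a[:d_out])[:, None] + (Wh @ a[d_out:])[None, :]
    e = np.where(e > 0, e, 0.2 * e)              # LeakyReLU
    e = np.where(A > 0, e, -1e9)                 # mask non-neighbors
    alpha = softmax(e, axis=1)                   # attention coefficients per node
    return np.maximum(alpha @ Wh, 0)             # attention-weighted aggregation + ReLU

def multi_head_self_attention(X, heads, d_k):
    """Plain multi-head self-attention over a sequence X (L, d)."""
    L, d = X.shape
    outs = []
    for _ in range(heads):
        Wq, Wk, Wv = (rng.standard_normal((d, d_k)) / np.sqrt(d) for _ in range(3))
        Q, K, V = X @ Wq, X @ Wk, X @ Wv
        att = softmax(Q @ K.T / np.sqrt(d_k), axis=1)   # (L, L) attention scores
        outs.append(att @ V)
    return np.concatenate(outs, axis=1)          # (L, heads * d_k)

# Toy drug: a 5-atom molecular graph with random atom features.
n_atoms, d_atom = 5, 8
A = np.eye(n_atoms)                              # self-loops
A[0, 1] = A[1, 0] = A[1, 2] = A[2, 1] = 1        # a small bond chain
H = rng.standard_normal((n_atoms, d_atom))
W = rng.standard_normal((d_atom, 16)) / np.sqrt(d_atom)
a = rng.standard_normal(32)
drug_vec = graph_attention_layer(H, A, W, a).mean(axis=0)       # (16,)

# Toy protein: a 20-residue sequence of 8-dim amino-acid embeddings.
seq = rng.standard_normal((20, 8))
prot_vec = multi_head_self_attention(seq, heads=4, d_k=4).mean(axis=0)  # (16,)

# Fully connected prediction layer on the concatenated feature vectors.
feat = np.concatenate([drug_vec, prot_vec])      # (32,)
w_fc = rng.standard_normal(32) / np.sqrt(32)
p_interact = 1.0 / (1.0 + np.exp(-(feat @ w_fc)))  # interaction probability
```

In the paper's method, the per-residue attention scores (here, the rows of `att`) are what identify the amino-acid subsequences most relevant to a given drug; this sketch only mean-pools them for brevity.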
Pages: 2208-2218
Page count: 11
Related papers
50 items total
  • [41] Joint extraction of entities and relations based on character graph convolutional network and Multi-Head Self-Attention Mechanism
    Meng, Zhao
    Tian, Shengwei
    Yu, Long
    Lv, Yalong
    JOURNAL OF EXPERIMENTAL & THEORETICAL ARTIFICIAL INTELLIGENCE, 2021, 33 (02) : 349 - 362
  • [42] Multi-modal multi-head self-attention for medical VQA
    Joshi, Vasudha
    Mitra, Pabitra
    Bose, Supratik
    MULTIMEDIA TOOLS AND APPLICATIONS, 2023, 83 (14) : 42585 - 42608
  • [43] A HYBRID TEXT NORMALIZATION SYSTEM USING MULTI-HEAD SELF-ATTENTION FOR MANDARIN
    Zhang, Junhui
    Pan, Junjie
    Yin, Xiang
    Li, Chen
    Liu, Shichao
    Zhang, Yang
    Wang, Yuxuan
    Ma, Zejun
    2020 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH, AND SIGNAL PROCESSING, 2020, : 6694 - 6698
  • [45] Detecting and Extracting of Adverse Drug Reaction Mentioning Tweets with Multi-Head Self-Attention
    Ge, Suyu
    Qi, Tao
    Wu, Chuhan
    Huang, Yongfeng
    SOCIAL MEDIA MINING FOR HEALTH APPLICATIONS (#SMM4H) WORKSHOP & SHARED TASK, 2019, : 96 - 98
  • [46] Multi-Head Attention and Knowledge Graph Based Dual Target Graph Collaborative Filtering Network
    Xu Yu
    Qinglong Peng
    Feng Jiang
    Junwei Du
    Hongtao Liang
    Jinhuan Liu
    Neural Processing Letters, 2023, 55 : 9155 - 9177
  • [48] Click-Through Rate Prediction of Multi-Head Self-Attention in Hyperbolic Space
    Han Y.-L.
    Wang X.-Y.
    Beijing Youdian Daxue Xuebao/Journal of Beijing University of Posts and Telecommunications, 2021, 44 (05): : 127 - 132
  • [49] Modality attention fusion model with hybrid multi-head self-attention for video understanding
    Zhuang, Xuqiang
    Liu, Fang'ai
    Hou, Jian
    Hao, Jianhua
    Cai, Xiaohong
    PLOS ONE, 2022, 17 (10):
  • [50] An Effective Hyperspectral Image Classification Network Based on Multi-Head Self-Attention and Spectral-Coordinate Attention
    Zhang, Minghua
    Duan, Yuxia
    Song, Wei
    Mei, Haibin
    He, Qi
    JOURNAL OF IMAGING, 2023, 9 (07)