Predicting disease genes based on multi-head attention fusion

Cited by: 3
Authors
Zhang, Linlin [1 ]
Lu, Dianrong [2 ]
Bi, Xuehua [3 ]
Zhao, Kai [2 ]
Yu, Guanglei [3 ]
Quan, Na [2 ]
Affiliations
[1] Xinjiang Univ, Coll Software Engn, Urumqi, Peoples R China
[2] Xinjiang Univ, Coll Informat Sci & Engn, Urumqi, Peoples R China
[3] Xinjiang Med Univ, Med Engn & Technol Coll, Urumqi, Peoples R China
Keywords
Pathogenic gene prediction; Heterogeneous network; Multi-head attention; Graph representation learning; GENOME-WIDE ASSOCIATION; HETEROGENEOUS NETWORKS; PRIORITIZATION; INTEGRATION; CANCER;
DOI
10.1186/s12859-023-05285-1
Chinese Library Classification
Q5 [Biochemistry];
Discipline codes
071010; 081704;
Abstract
Background: The identification of disease-related genes is of great significance for the diagnosis and treatment of human disease. Most studies have focused on developing efficient and accurate computational methods to predict disease-causing genes. Due to the sparsity and complexity of biomedical data, it remains a challenge to develop an effective multi-feature fusion model for identifying disease genes. Results: This paper proposes an approach for predicting pathogenic genes based on multi-head attention fusion (MHAGP). First, heterogeneous biological information networks of disease genes are constructed by integrating multiple biomedical knowledge databases. Second, two graph representation learning algorithms are used to capture feature vectors of gene-disease pairs from the network, and the features are fused by introducing multi-head attention. Finally, a multi-layer perceptron model is used to predict gene-disease associations. Conclusions: The MHAGP model outperforms all other methods in comparative experiments. Case studies also show that MHAGP is able to predict genes potentially associated with diseases. In the future, more biological entity association data, such as gene-drug and disease phenotype-gene ontology associations, can be added to expand the information in heterogeneous biological networks and achieve more accurate predictions. In addition, MHAGP is readily extensible and can be applied to related tasks such as gene-drug and drug-disease association prediction.
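The abstract describes fusing two graph-derived feature vectors for each gene-disease pair with multi-head attention before a multi-layer perceptron classifier. As an illustration only (the paper's actual architecture, dimensions, and learned weights are not given in this record), a minimal NumPy sketch of multi-head attention over two feature sources might look like the following; the function name `multi_head_fusion`, the random projection matrices standing in for learned weights, and the mean pooling at the end are all assumptions:

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_fusion(feat_a, feat_b, n_heads=4, seed=0):
    """Fuse two feature vectors of a gene-disease pair via multi-head attention.

    feat_a, feat_b: 1-D arrays of equal length d (d divisible by n_heads),
    e.g. embeddings from two graph representation learning algorithms.
    Returns a fused vector of length d.
    """
    d = feat_a.shape[0]
    assert d % n_heads == 0, "feature length must divide evenly into heads"
    d_h = d // n_heads
    rng = np.random.default_rng(seed)
    # Random projections stand in for the learned Q/K/V weight matrices.
    Wq, Wk, Wv = (rng.standard_normal((d, d)) / np.sqrt(d) for _ in range(3))

    # Treat the two feature sources as a length-2 "sequence" to attend over.
    X = np.stack([feat_a, feat_b])            # (2, d)
    Q, K, V = X @ Wq, X @ Wk, X @ Wv          # (2, d) each

    heads = []
    for h in range(n_heads):
        sl = slice(h * d_h, (h + 1) * d_h)
        # Scaled dot-product attention within this head's slice.
        scores = Q[:, sl] @ K[:, sl].T / np.sqrt(d_h)   # (2, 2)
        heads.append(softmax(scores) @ V[:, sl])        # (2, d_h)

    fused = np.concatenate(heads, axis=-1)    # (2, d)
    return fused.mean(axis=0)                 # pooled fused representation
```

In a trained model the projections would be learned jointly with the downstream perceptron, and the pooling step could equally be concatenation; mean pooling is used here only to keep the sketch's output dimension equal to the input dimension.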
Pages: 15
Related papers
50 items in total
  • [31] AGCNAF: predicting disease-gene associations using GCN and multi-head attention to fuse the similarity features
    Ma, Jinlong
    Qin, Tian
    Zhai, Meijing
    Cai, Liangliang
    ENGINEERING RESEARCH EXPRESS, 2024, 6 (04):
  • [32] Self Multi-Head Attention for Speaker Recognition
    India, Miquel
    Safari, Pooyan
    Hernando, Javier
    INTERSPEECH 2019, 2019, : 4305 - 4309
  • [33] DOUBLE MULTI-HEAD ATTENTION FOR SPEAKER VERIFICATION
    India, Miquel
    Safari, Pooyan
    Hernando, Javier
    2021 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP 2021), 2021, : 6144 - 6148
  • [34] Learning Sentences Similarity By Multi-Head Attention
    Wang, Ming Yang
    Li, Chen Jiang
    Sun, Jian Dong
    Xu, Wei Ran
    Gao, Sheng
    Zhang, Ya Hao
    Wang, Pu
    Li, Jun Liang
    PROCEEDINGS OF 2018 INTERNATIONAL CONFERENCE ON NETWORK INFRASTRUCTURE AND DIGITAL CONTENT (IEEE IC-NIDC), 2018, : 16 - 19
  • [35] VIDEO SUMMARIZATION WITH ANCHORS AND MULTI-HEAD ATTENTION
    Sung, Yi-Lin
    Hong, Cheng-Yao
    Hsu, Yen-Chi
    Liu, Tyng-Luh
    2020 IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING (ICIP), 2020, : 2396 - 2400
  • [36] Classification of Heads in Multi-head Attention Mechanisms
    Huang, Feihu
    Jiang, Min
    Liu, Fang
    Xu, Dian
    Fan, Zimeng
    Wang, Yonghao
    KNOWLEDGE SCIENCE, ENGINEERING AND MANAGEMENT, KSEM 2022, PT III, 2022, 13370 : 681 - 692
  • [37] Diversifying Multi-Head Attention in the Transformer Model
    Ampazis, Nicholas
    Sakketou, Flora
    MACHINE LEARNING AND KNOWLEDGE EXTRACTION, 2024, 6 (04): : 2618 - 2638
  • [38] Finding the Pillars of Strength for Multi-Head Attention
    Ni, Jinjie
    Mao, Rui
    Yang, Zonglin
    Lei, Han
    Cambria, Erik
    PROCEEDINGS OF THE 61ST ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2023): LONG PAPERS, VOL 1, 2023, : 14526 - 14540
  • [39] Improving Multi-head Attention with Capsule Networks
    Gu, Shuhao
    Feng, Yang
    NATURAL LANGUAGE PROCESSING AND CHINESE COMPUTING (NLPCC 2019), PT I, 2019, 11838 : 314 - 326
  • [40] Abstractive Text Summarization with Multi-Head Attention
    Li, Jinpeng
    Zhang, Chuang
    Chen, Xiaojun
    Cao, Yanan
    Liao, Pengcheng
    Zhang, Peng
    2019 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2019,