DeepCAC: a deep learning approach on DNA transcription factors classification based on multi-head self-attention and concatenate convolutional neural network

Cited: 2
Authors
Zhang, Jidong [1 ]
Liu, Bo [2 ]
Wu, Jiahui [1 ]
Wang, Zhihan [1 ]
Li, Jianqiang [1 ]
Affiliations
[1] Beijing Univ Technol, Fac Informat Technol, Beijing 100124, Peoples R China
[2] Massey Univ, Sch Math & Computat Sci, Auckland 0745, New Zealand
Keywords
Bioinformatics; Attention mechanism; DNA transcription factors sequence; Convolutional neural networks; Binding proteins; RNA
DOI
10.1186/s12859-023-05469-9
Chinese Library Classification
Q5 [Biochemistry];
Discipline codes
071010; 081704
Abstract
Understanding gene expression requires the accurate classification and identification of transcription factors, a task supported by high-throughput sequencing technologies. However, these techniques suffer from inherent limitations such as long processing times and high costs. To address these challenges, bioinformatics has increasingly turned to deep learning for analyzing gene sequences. Nevertheless, the pursuit of better experimental results has led to the inclusion of numerous complex analysis modules, producing models with ever-growing parameter counts. To overcome these limitations, we propose DeepCAC, a novel approach for analyzing DNA transcription factor sequences that combines deep convolutional neural networks with a multi-head self-attention mechanism. The convolutional layers effectively capture local hidden features in the sequences, while the multi-head self-attention mechanism strengthens the identification of hidden features with long-distance dependencies. This design reduces the overall number of model parameters while exploiting the representational power of multi-head self-attention on sequence data. Experiments on labeled data demonstrate that this approach significantly improves performance while requiring fewer parameters than existing methods, and validate its effectiveness in accurately predicting DNA transcription factor sequences.
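The abstract describes a pipeline of parallel convolutional branches whose outputs are concatenated and passed through multi-head self-attention. As an illustrative sketch only, not the authors' actual DeepCAC implementation, the NumPy toy below shows how a one-hot-encoded DNA sequence can flow through two convolutional branches with different kernel widths, channel-wise concatenation, multi-head self-attention, and a final softmax classifier; every layer size, weight, and variable name here is invented for illustration.

```python
import numpy as np

def one_hot(seq):
    """Encode a DNA string as an (L, 4) one-hot matrix (A, C, G, T)."""
    idx = {"A": 0, "C": 1, "G": 2, "T": 3}
    x = np.zeros((len(seq), 4))
    for i, base in enumerate(seq):
        x[i, idx[base]] = 1.0
    return x

def conv1d(x, w, b):
    """Valid 1-D convolution with ReLU. x: (L, C_in), w: (k, C_in, C_out)."""
    k = w.shape[0]
    out = np.empty((x.shape[0] - k + 1, w.shape[2]))
    for i in range(out.shape[0]):
        out[i] = np.tensordot(x[i:i + k], w, axes=([0, 1], [0, 1])) + b
    return np.maximum(out, 0.0)

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_self_attention(x, wq, wk, wv, wo, n_heads):
    """Scaled dot-product self-attention with n_heads heads. x: (L, d)."""
    L, d = x.shape
    dh = d // n_heads
    # Project, then split the model dimension across heads: (heads, L, dh).
    q = (x @ wq).reshape(L, n_heads, dh).transpose(1, 0, 2)
    k = (x @ wk).reshape(L, n_heads, dh).transpose(1, 0, 2)
    v = (x @ wv).reshape(L, n_heads, dh).transpose(1, 0, 2)
    att = softmax(q @ k.transpose(0, 2, 1) / np.sqrt(dh), axis=-1)
    out = (att @ v).transpose(1, 0, 2).reshape(L, d)  # merge heads
    return out @ wo

rng = np.random.default_rng(0)
x = one_hot("ACGTTGCAACGTGGATCCAT")           # toy 20-bp sequence

# Two convolutional branches with different kernel widths, then
# channel-wise concatenation (cropped to the shorter branch length).
b3 = conv1d(x, 0.1 * rng.normal(size=(3, 4, 8)), np.zeros(8))  # (18, 8)
b5 = conv1d(x, 0.1 * rng.normal(size=(5, 4, 8)), np.zeros(8))  # (16, 8)
n = min(b3.shape[0], b5.shape[0])
hidden = np.concatenate([b3[:n], b5[:n]], axis=1)              # (16, 16)

# Multi-head self-attention over the concatenated feature map.
d = hidden.shape[1]
wq, wk, wv, wo = (0.1 * rng.normal(size=(d, d)) for _ in range(4))
attended = multi_head_self_attention(hidden, wq, wk, wv, wo, n_heads=4)

# Mean-pool over positions, then a 2-class softmax head (TF vs. non-TF).
pooled = attended.mean(axis=0)
probs = softmax(pooled @ (0.1 * rng.normal(size=(d, 2))))
print(probs)
```

The shorter-kernel branch yields a longer feature map, so both branches are cropped to a common length before concatenation; a real implementation would more likely use same-padding so the branches align without cropping.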
Pages: 15
Related papers (50 total)
  • [1] Zhang, Yaojie; Xu, Bing; Zhao, Tiejun. Convolutional multi-head self-attention on memory for aspect sentiment classification. IEEE/CAA Journal of Automatica Sinica, 2020, 7 (04): 1038-1044
  • [2] Hu, Zhangfang; Chen, Libujie; Luo, Yuan; Zhou, Jingfan. EEG-based emotion recognition using convolutional recurrent neural network with multi-head self-attention. Applied Sciences-Basel, 2022, 12 (21)
  • [3] Ren, Hang; Liu, Shaogang; Qiu, Bo; Guo, Hong; Zhao, Dan. A novel intelligent fault diagnosis method of bearing based on multi-head self-attention convolutional neural network. AI EDAM-Artificial Intelligence for Engineering Design Analysis and Manufacturing, 2024, 38
  • [4] Xiao, Xi; Zhang, Dianyan; Hu, Guangwu; Jiang, Yong; Xia, Shutao. CNN-MHSA: a Convolutional Neural Network and multi-head self-attention combined approach for detecting phishing websites. Neural Networks, 2020, 125: 303-312
  • [5] Zhang, Chun-Xiang; Zhang, Yu-Long; Gao, Xue-Yao. Multi-head self-attention gated-dilated convolutional neural network for word sense disambiguation. IEEE Access, 2023, 11: 14202-14210
  • [6] Ma, Shuang; Wang, Haifeng; Yu, Zhihao; Du, Luyao; Zhang, Ming; Fu, Qingxi. AttenEpilepsy: a 2D convolutional network model based on multi-head self-attention. Engineering Analysis with Boundary Elements, 2024, 169
  • [7] Wu, Chuhan; Wu, Fangzhao; Ge, Suyu; Qi, Tao; Huang, Yongfeng; Xie, Xing. Neural news recommendation with multi-head self-attention. Proceedings of EMNLP-IJCNLP 2019: 6389-6394
  • [8] Wang, Yue; Yang, Guanci; Li, Shaobo; Li, Yang; He, Ling; Liu, Dan. Arrhythmia classification algorithm based on multi-head self-attention mechanism. Biomedical Signal Processing and Control, 2023, 79