DeepCAC: a deep learning approach on DNA transcription factors classification based on multi-head self-attention and concatenate convolutional neural network

Cited by: 5
Authors
Zhang, Jidong [1 ]
Liu, Bo [2 ]
Wu, Jiahui [1 ]
Wang, Zhihan [1 ]
Li, Jianqiang [1 ]
Affiliations
[1] Beijing Univ Technol, Fac Informat Technol, Beijing 100124, Peoples R China
[2] Massey Univ, Sch Math & Computat Sci, Auckland 0745, New Zealand
Keywords
Bioinformatics; Attention mechanism; DNA transcription factors sequence; Convolutional neural networks; BINDING PROTEINS; RNA;
DOI
10.1186/s12859-023-05469-9
CLC Number
Q5 [Biochemistry];
Subject Classification Codes
071010; 081704;
Abstract
Understanding gene expression processes requires the accurate classification and identification of transcription factors, a task supported by high-throughput sequencing technologies. However, these techniques suffer from inherent limitations such as time consumption and high cost. To address these challenges, the field of bioinformatics has increasingly turned to deep learning for analyzing gene sequences. Nevertheless, the pursuit of better experimental results has led to the inclusion of ever more complex analysis modules, producing models with a growing number of parameters. To overcome these limitations, a novel approach for analyzing DNA transcription factor sequences, named DeepCAC, is proposed. This method combines deep convolutional neural networks with a multi-head self-attention mechanism. The convolutional neural networks effectively capture local hidden features in the sequences, while the multi-head self-attention mechanism improves the identification of hidden features with long-distance dependencies. This design reduces the overall number of model parameters while exploiting the representational power of multi-head self-attention on sequence data. Training on labeled data shows that the approach significantly improves performance while requiring fewer parameters than existing methods. Additionally, its effectiveness is validated in accurately predicting DNA transcription factor sequences.
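To illustrate the architecture the abstract describes (parallel convolutional feature extractors whose outputs are concatenated, followed by multi-head self-attention), a minimal PyTorch sketch is given below. This is not the authors' released implementation; the channel counts, kernel sizes, number of attention heads, and the two-class output are illustrative assumptions and do not reproduce the published DeepCAC configuration.

```python
# Minimal sketch (assumed hyperparameters, not the published DeepCAC code):
# concatenated 1-D CNN branches for local motifs + multi-head self-attention
# for long-distance dependencies over one-hot encoded DNA sequences.
import torch
import torch.nn as nn


class CNNAttentionClassifier(nn.Module):
    def __init__(self, seq_channels=4, branch_channels=32,
                 kernel_sizes=(3, 5, 7), num_heads=4, num_classes=2):
        super().__init__()
        # Parallel convolution branches with different receptive-field widths;
        # their feature maps are concatenated along the channel dimension.
        self.branches = nn.ModuleList(
            nn.Sequential(
                nn.Conv1d(seq_channels, branch_channels, k, padding=k // 2),
                nn.ReLU(),
            )
            for k in kernel_sizes
        )
        embed_dim = branch_channels * len(kernel_sizes)
        # Multi-head self-attention relates distant sequence positions.
        self.attn = nn.MultiheadAttention(embed_dim, num_heads, batch_first=True)
        self.norm = nn.LayerNorm(embed_dim)
        self.classifier = nn.Linear(embed_dim, num_classes)

    def forward(self, x):
        # x: (batch, 4, seq_len) one-hot DNA
        feats = torch.cat([branch(x) for branch in self.branches], dim=1)
        feats = feats.transpose(1, 2)        # (batch, seq_len, embed_dim)
        attended, _ = self.attn(feats, feats, feats)
        feats = self.norm(feats + attended)  # residual connection
        pooled = feats.mean(dim=1)           # global average pooling
        return self.classifier(pooled)


if __name__ == "__main__":
    model = CNNAttentionClassifier()
    dummy = torch.randn(8, 4, 200)           # 8 sequences of length 200
    print(model(dummy).shape)                 # torch.Size([8, 2])
```

Concatenating several kernel widths lets the convolutional stage cover motifs of different lengths with relatively few parameters, while the attention stage supplies the long-range context, which is the trade-off the abstract emphasizes.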
Pages: 15
Related Papers
50 records in total (items [31]-[40] shown)
  • [31] DeepMHADTA: Prediction of Drug-Target Binding Affinity Using Multi-Head Self-Attention and Convolutional Neural Network
    Deng, Lei
    Zeng, Yunyun
    Liu, Hui
    Liu, Zixuan
    Liu, Xuejun
    CURRENT ISSUES IN MOLECULAR BIOLOGY, 2022, 44 (05) : 2287 - 2299
  • [32] CPMA: Spatio-Temporal Network Prediction Model Based on Convolutional Parallel Multi-head Self-attention
    Liu, Tiantian
    You, Xin
    Ma, Ming
    ADVANCED INTELLIGENT COMPUTING TECHNOLOGY AND APPLICATIONS, PT II, ICIC 2024, 2024, 14876 : 113 - 124
  • [33] Improved Convolutional Neural Network Based on Multi-head Attention Mechanism for Industrial Process Fault Classification
    Cui, Wenzhi
    Deng, Xiaogang
    Zhang, Zheng
    PROCEEDINGS OF 2020 IEEE 9TH DATA DRIVEN CONTROL AND LEARNING SYSTEMS CONFERENCE (DDCLS'20), 2020, : 918 - 922
  • [34] Joint extraction of entities and relations based on character graph convolutional network and Multi-Head Self-Attention Mechanism
    Meng, Zhao
    Tian, Shengwei
    Yu, Long
    Lv, Yalong
    JOURNAL OF EXPERIMENTAL & THEORETICAL ARTIFICIAL INTELLIGENCE, 2021, 33 (02) : 349 - 362
  • [35] Multi-label Text Classification Based on BiGRU and Multi-Head Self-Attention Mechanism
    Luo, Tongtong
    Shi, Nan
    Jin, Meilin
    Qin, Aolong
    Tang, Jiacheng
    Wang, Xihan
    Gao, Quanli
    Shao, Lianhe
    2024 3RD INTERNATIONAL CONFERENCE ON IMAGE PROCESSING AND MEDIA COMPUTING, ICIPMC 2024, 2024, : 204 - 210
  • [36] Improved Multi-Head Self-Attention Classification Network for Multi-View Fetal Echocardiography Recognition
    Zhang, Yingying
    Zhu, Haogang
    Wang, Yan
    Wang, Jingyi
    He, Yihua
    2023 45TH ANNUAL INTERNATIONAL CONFERENCE OF THE IEEE ENGINEERING IN MEDICINE & BIOLOGY SOCIETY, EMBC, 2023,
  • [37] A multi-head self-attention deep learning approach for detection and recommendation of neuromagnetic high frequency oscillations in epilepsy
    Zhao, Xiangyu
    Peng, Xueping
    Niu, Ke
    Li, Hailong
    He, Lili
    Yang, Feng
    Wu, Ting
    Chen, Duo
    Zhang, Qiusi
    Ouyang, Menglin
    Guo, Jiayang
    Pan, Yijie
    FRONTIERS IN NEUROINFORMATICS, 2022, 16
  • [38] A Point Cloud Classification Method and Its Applications Based on Multi-Head Self-Attention
    Liu, Xue-Jun
    Wang, Wen-Hui
    Yan, Yong
    Cui, Zhong-Ji
    Sha, Yun
    Jiang, Yi-Nan
    Journal of Computers (Taiwan), 2023, 34 (04) : 163 - 173
  • [39] Attention as Relation: Learning Supervised Multi-head Self-Attention for Relation Extraction
    Liu, Jie
    Chen, Shaowei
    Wang, Bingquan
    Zhang, Jiaxin
    Li, Na
    Xu, Tong
    PROCEEDINGS OF THE TWENTY-NINTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2020, : 3787 - 3793
  • [40] HADNet: A Novel Lightweight Approach for Abnormal Sound Detection on Highway Based on 1D Convolutional Neural Network and Multi-Head Self-Attention Mechanism
    Liang, Cong
    Chen, Qian
    Li, Qiran
    Wang, Qingnan
    Zhao, Kang
    Tu, Jihui
    Jafaripournimchahi, Ammar
    ELECTRONICS, 2024, 13 (21)