Multiple sequence alignment based on deep reinforcement learning with self-attention and positional encoding

Cited: 0
Authors
Liu, Yuhang [1 ]
Yuan, Hao [1 ]
Zhang, Qiang [1 ]
Wang, Zixuan [2 ]
Xiong, Shuwen [1 ]
Wen, Naifeng [3 ]
Zhang, Yongqing [1 ]
Affiliations
[1] Chengdu Univ Informat Technol, Sch Comp Sci, Chengdu 610225, Peoples R China
[2] Sichuan Univ, Coll Elect & Informat Engn, Chengdu 610065, Peoples R China
[3] Dalian Minzu Univ, Sch Mech & Elect Engn, Dalian 116600, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
T-COFFEE;
DOI
10.1093/bioinformatics/btad636
Chinese Library Classification
Q5 [Biochemistry];
Subject classification codes
071010; 081704;
Abstract
Motivation: Multiple sequence alignment (MSA) is a central task in sequence analysis and an active area of research. However, because MSA is an NP-complete problem, no definitive solution exists, and the accuracy of existing methods still leaves room for improvement.

Results: We propose DPAMSA (Deep reinforcement learning with Positional encoding and self-Attention for MSA), a deep-reinforcement-learning method that improves alignment accuracy. Inspired by translation techniques in natural language processing, we introduce self-attention and positional encoding to improve accuracy and reliability. First, positional encoding encodes each position in the sequence so that nucleotide position information is not lost. Second, a self-attention module extracts the key features of the sequences. These features are then fed into a multi-layer perceptron, which computes the insertion position of each gap from the features. In addition, we design a novel reinforcement learning environment that converts classic progressive alignment into progressive column alignment, generating each column's sub-alignment in turn; the sub-alignments are finally merged into the complete alignment. Extensive experiments on several datasets validate our method's effectiveness for MSA, outperforming some state-of-the-art methods in terms of the Sum-of-pairs and Column scores.

Availability and implementation: The method is implemented in Python and available as open-source software at https://github.com/ZhangLab312/DPAMSA.
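The pipeline the abstract describes (positional encoding injected into sequence embeddings, self-attention extracting features, and a multi-layer perceptron scoring gap-insertion positions) can be sketched as below. This is a minimal NumPy illustration under assumed shapes and randomly initialized weights, not the authors' DPAMSA implementation; all function names and dimensions are hypothetical.

```python
import numpy as np

def positional_encoding(seq_len, d_model):
    """Sinusoidal positional encoding: sin on even dims, cos on odd dims."""
    pos = np.arange(seq_len)[:, None]
    i = np.arange(d_model)[None, :]
    angle = pos / np.power(10000.0, (2 * (i // 2)) / d_model)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angle[:, 0::2])
    pe[:, 1::2] = np.cos(angle[:, 1::2])
    return pe

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(x, Wq, Wk, Wv):
    """Single-head scaled dot-product self-attention over sequence positions."""
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    scores = q @ k.T / np.sqrt(k.shape[-1])
    return softmax(scores) @ v

def gap_position_logits(features, W1, b1, W2, b2):
    """Two-layer MLP scoring each column as a candidate gap-insertion point."""
    h = np.maximum(0.0, features @ W1 + b1)   # ReLU hidden layer
    return (h @ W2 + b2).squeeze(-1)          # one logit per position

rng = np.random.default_rng(0)
seq_len, d_model, d_hidden = 8, 16, 32
x = rng.normal(size=(seq_len, d_model))        # stand-in nucleotide embeddings
x = x + positional_encoding(seq_len, d_model)  # inject position information

Wq, Wk, Wv = (rng.normal(size=(d_model, d_model)) * 0.1 for _ in range(3))
feats = self_attention(x, Wq, Wk, Wv)

W1 = rng.normal(size=(d_model, d_hidden)) * 0.1
b1 = np.zeros(d_hidden)
W2 = rng.normal(size=(d_hidden, 1)) * 0.1
b2 = np.zeros(1)

logits = gap_position_logits(feats, W1, b1, W2, b2)
gap_pos = int(np.argmax(logits))  # greedy choice of gap-insertion column
print(logits.shape, gap_pos)
```

In the actual method the MLP would serve as the Q-network of the reinforcement-learning agent, with the chosen position driving the progressive column alignment described in the abstract; here it simply produces one score per column.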
Pages: 10
Related papers
50 records total
  • [31] Revolutionizing sentiment classification: A deep learning approach using self-attention based encoding-decoding transformers with feature fusion
    Tejashwini, S. G.
    Aradhana, D.
    ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE, 2023, 125
  • [32] DeepCC: Multi-Agent Deep Reinforcement Learning Congestion Control for Multi-Path TCP Based on Self-Attention
    He, Bo
    Wang, Jingyu
    Qi, Qi
    Sun, Haifeng
    Liao, Jianxin
    Du, Chunning
    Yang, Xiang
    Han, Zhu
    IEEE TRANSACTIONS ON NETWORK AND SERVICE MANAGEMENT, 2021, 18 (04): : 4770 - 4788
  • [33] Self-attention Adversarial Based Deep Subspace Clustering
    Yin M.
    Wu H.-Y.
    Xie S.-L.
    Yang Q.-Y.
    Zidonghua Xuebao/Acta Automatica Sinica, 2022, 48 (01): : 271 - 281
  • [34] Deep Learning-Based Identification of Maize Leaf Diseases Is Improved by an Attention Mechanism: Self-Attention
    Qian, Xiufeng
    Zhang, Chengqi
    Chen, Li
    Li, Ke
    FRONTIERS IN PLANT SCIENCE, 2022, 13
  • [35] Missing well-log reconstruction using a sequence self-attention deep-learning framework
    Lin, Lei
    Wei, Hao
    Wu, Tiantian
    Zhang, Pengyun
    Zhong, Zhi
    Li, Chenglong
    GEOPHYSICS, 2023, 88 (06) : D391 - D410
  • [36] SAGSleepNet: A deep learning model for sleep staging based on self-attention graph of polysomnography
    Jin, Zheng
    Jia, Kebin
    BIOMEDICAL SIGNAL PROCESSING AND CONTROL, 2023, 86
  • [37] BAMS: Binary Sequence-Augmented Spectrogram with Self-Attention Deep Learning for Human Activity Recognition
    Sricom, Natchaya
    Charakorn, Rujikorn
    Manoonpong, Poramate
    Limpiti, Tulaya
    2024 IEEE 20TH INTERNATIONAL CONFERENCE ON BODY SENSOR NETWORKS, BSN, 2024,
  • [38] Deep Clustering Efficient Learning Network for Motion Recognition Based on Self-Attention Mechanism
    Ru, Tielin
    Zhu, Ziheng
    APPLIED SCIENCES-BASEL, 2023, 13 (05):
  • [39] A fault diagnosis algorithm for analog circuits based on self-attention mechanism deep learning
    Yang D.
    Wei J.
    Lin X.
    Liu M.
    Lu S.
    Yi Qi Yi Biao Xue Bao/Chinese Journal of Scientific Instrument, 2023, 44 (03): : 128 - 136
  • [40] A Cross-Project Defect Prediction Model Based on Deep Learning With Self-Attention
    Wen, Wanzhi
    Zhang, Ruinian
    Wang, Chuyue
    Shen, Chenqiang
    Yu, Meng
    Zhang, Suchuan
    Gao, Xinxin
    IEEE ACCESS, 2022, 10 : 110385 - 110401