Enhancer-LSTMAtt: A Bi-LSTM and Attention-Based Deep Learning Method for Enhancer Recognition

Cited: 9
Authors
Huang, Guohua [1]
Luo, Wei [1]
Zhang, Guiyang [1]
Zheng, Peijie [1]
Yao, Yuhua [2]
Lyu, Jianyi [1]
Liu, Yuewu [3]
Wei, Dong-Qing [4,5]
Affiliations
[1] Shaoyang Univ, Sch Elect Engn, Shaoyang 422000, Peoples R China
[2] Hainan Normal Univ, Sch Math & Stat, Haikou 571158, Hainan, Peoples R China
[3] Hunan Agr Univ, Coll Informat & Intelligence, Changsha 410083, Peoples R China
[4] Shanghai Jiao Tong Univ, State Key Lab Microbial Metab, Shanghai 200240, Peoples R China
[5] Shanghai Jiao Tong Univ, Sch Life Sci & Biotechnol, Shanghai 200240, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
enhancer; promoter; deep learning; feed-forward attention; convolutional neural network; long short-term memory; residual neural network; CD-HIT; TRANSCRIPTIONAL ENHANCERS; IDENTIFYING ENHANCERS; PREDICTING ENHANCERS; CHROMATIN SIGNATURES; PROTEIN; MODEL; DISCRIMINATION; EVOLUTION; ELEMENTS;
DOI
10.3390/biom12070995
Chinese Library Classification
Q5 [Biochemistry]; Q7 [Molecular Biology];
Discipline Classification Codes
071010; 081704;
Abstract
Enhancers are short DNA segments that play a key role in biological processes such as accelerating the transcription of target genes. Because enhancers can reside anywhere in a genome sequence, they are difficult to identify precisely. We present Enhancer-LSTMAtt, a bi-directional long short-term memory (Bi-LSTM) and attention-based deep learning method for enhancer recognition. Enhancer-LSTMAtt is an end-to-end deep learning model consisting mainly of a deep residual neural network, a Bi-LSTM, and feed-forward attention. We extensively compared Enhancer-LSTMAtt with 19 state-of-the-art methods using 5-fold cross-validation, 10-fold cross-validation, and an independent test, and it achieved competitive performance, especially on the independent test. We implemented Enhancer-LSTMAtt as a user-friendly web application. Enhancer-LSTMAtt is applicable not only to recognizing enhancers but also to distinguishing strong enhancers from weak enhancers, and we believe it will become a promising tool for identifying enhancers.
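The feed-forward attention component named in the abstract can be illustrated with a minimal NumPy sketch of the standard formulation: score each Bi-LSTM hidden state with a small feed-forward layer, normalize the scores over time steps with a softmax, and return the attention-weighted sum as a fixed-length context vector. The function name, dimensions, and parameters below are illustrative assumptions, not taken from the paper's actual implementation.

```python
import numpy as np

def feed_forward_attention(H, w, b):
    """Feed-forward attention over a sequence of hidden states.

    H: (T, d) array of Bi-LSTM hidden states, one row per time step.
    w: (d,) weight vector of the scoring feed-forward layer.
    b: scalar bias of the scoring layer.
    Returns (context, alpha): the (d,) weighted-sum context vector and
    the (T,) attention weights, which sum to 1.
    """
    e = np.tanh(H @ w + b)          # (T,) unnormalized alignment scores
    a = np.exp(e - e.max())         # numerically stable softmax
    alpha = a / a.sum()             # (T,) attention weights
    return alpha @ H, alpha         # weighted sum collapses time axis

# Toy example: 5 time steps of 4-dimensional hidden states.
rng = np.random.default_rng(0)
H = rng.standard_normal((5, 4))
context, alpha = feed_forward_attention(H, rng.standard_normal(4), 0.0)
```

The weighted sum turns a variable-length sequence into a fixed-size vector, which is what lets the attention layer sit between the Bi-LSTM and a dense classification head.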
Pages: 18
Related Papers
50 records
  • [1] Attention-Based Bi-LSTM for Chinese Named Entity Recognition
    Zhang, Kai
    Ren, Weiping
    Zhang, Yangsen
    [J]. CHINESE LEXICAL SEMANTICS, CLSW 2018, 2018, 11173 : 643 - 652
  • [2] An Attention-based Bi-LSTM Method for Visual Object Classification via EEG
    Zheng, Xiao
    Chen, Wanzhong
    [J]. BIOMEDICAL SIGNAL PROCESSING AND CONTROL, 2021, 63
  • [3] Attention-Based Bi-LSTM Model for Arabic Depression Classification
    Almars, Abdulqader M.
    [J]. CMC-COMPUTERS MATERIALS & CONTINUA, 2022, 71 (02): : 3091 - 3106
  • [4] Attention-Based Bi-LSTM Network for Abusive Language Detection
    Nelatoori, Kiran Babu
    Kommanti, Hima Bindu
    [J]. IETE JOURNAL OF RESEARCH, 2023, 69 (11) : 7884 - 7892
  • [5] Attention-based Bi-LSTM Model for Anomalous HTTP Traffic Detection
    Yu, Yuqi
    Liu, Guannan
    Yan, Hanbing
    Li, Hong
    Guan, Hongchao
    [J]. 2018 15TH INTERNATIONAL CONFERENCE ON SERVICE SYSTEMS AND SERVICE MANAGEMENT (ICSSSM), 2018,
  • [6] ADH-Enhancer: an attention-based deep hybrid framework for enhancer identification and strength prediction
    Mehmood, Faiza
    Arshad, Shazia
    Shoaib, Muhammad
    [J]. BRIEFINGS IN BIOINFORMATICS, 2024, 25 (02)
  • [7] Attention-Based Bi-LSTM for Anomaly Detection on Time-Series Data
    Mishra, Sanket
    Kshirsagar, Varad
    Dwivedula, Rohit
    Hota, Chittaranjan
    [J]. ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING - ICANN 2021, PT I, 2021, 12891 : 129 - 140
  • [8] Attention-based Spatialized Word Embedding Bi-LSTM Model for Sentiment Analysis
    Zhu, Kun
    Samsudin, Nur Hana
    [J]. PERTANIKA JOURNAL OF SCIENCE AND TECHNOLOGY, 2024, 32 (01): : 79 - 98
  • [9] An improved Bi-LSTM method based on heterogeneous features fusion and attention mechanism for ECG recognition
    Song, Chaoyang
    Zhou, Zilong
    Yu, Yue
    Shi, Manman
    Zhang, Jingxiang
    [J]. COMPUTERS IN BIOLOGY AND MEDICINE, 2024, 169
  • [10] Recognition method of voltage sag causes based on Bi-LSTM
    Zheng, Zhicong
    Qi, Linhai
    Wang, Hong
    Zhu, Manting
    Chen, Qian
    [J]. IEEJ TRANSACTIONS ON ELECTRICAL AND ELECTRONIC ENGINEERING, 2020, 15 (03) : 418 - 425