COVID-19 virus mutation prediction with LSTM and attention mechanisms

Times Cited: 1
Authors
Burukanli, Mehmet [1 ,2 ]
Yumusak, Nejat [2 ]
Affiliations
[1] Bitlis Eren Univ, Dept Common Courses, Ahmet Eren Blvd, TR-13100 Bitlis, Turkiye
[2] Sakarya Univ, Fac Comp & Informat Sci, Dept Comp Engn, TR-54050 Serdivan, Sakarya, Turkiye
Source
COMPUTER JOURNAL | 2024, Vol. 67, No. 10
Keywords
COVID-19 mutation prediction; HyperMixer; attention mechanisms; MLP-Mixer; LSTM; SARS-CoV-2;
DOI
10.1093/comjnl/bxae058
CLC Number
TP3 [Computing Technology, Computer Technology];
Discipline Classification Code
0812;
Abstract
Coronavirus disease 2019 (COVID-19), caused by Severe Acute Respiratory Syndrome Coronavirus 2 (SARS-CoV-2), is an emerging and rapidly spreading type of coronavirus. One of the most important reasons for the rapid spread of COVID-19 is the frequent mutation of the virus, and one of the most effective ways to cope with these mutations is to predict them before they occur. In this study, we propose HyperAttCov, a robust HyperMixer and long short-term memory (LSTM) based model with attention mechanisms for COVID-19 virus mutation prediction. The proposed HyperAttCov model outperforms several state-of-the-art methods. Experimental results show that HyperAttCov reached 70.0% accuracy, 92.0% precision, and 46.5% MCC on the COVID-19 testing dataset, and 70.2% accuracy, 90.4% precision, and 46.2% MCC when averaged over 10 random trials. Moreover, when the proposed HyperAttCov model with 10 random trials is compared with a previous study in the literature, the average performance values on the testing dataset increase by 7.18% in accuracy, 37.39% in precision, and 49.51% in MCC. As a result, the proposed HyperAttCov model can successfully predict mutations occurring in the 2022 COVID-19 dataset.
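To make the described architecture concrete, the sketch below shows one plausible way to combine a HyperMixer-style token-mixing block, an LSTM, and self-attention for sequence classification. It is a minimal, hypothetical illustration only: the embedding scheme, layer sizes, pooling strategy, and output classes are assumptions and do not reproduce the authors' actual HyperAttCov implementation.

```python
# Hypothetical sketch of a HyperMixer + LSTM + attention classifier for
# integer-encoded genome fragments (mutation vs. no mutation).
# NOT the authors' implementation; all hyperparameters are assumptions.
import torch
import torch.nn as nn

class HyperMixerBlock(nn.Module):
    """Token mixing whose mixing weights are generated by a small hypernetwork."""
    def __init__(self, d_model: int, d_hidden: int):
        super().__init__()
        self.norm = nn.LayerNorm(d_model)
        # Hypernetwork: produces per-token rows of the token-mixing matrices W1, W2.
        self.hyper_w1 = nn.Linear(d_model, d_hidden)
        self.hyper_w2 = nn.Linear(d_model, d_hidden)
        self.act = nn.GELU()

    def forward(self, x):                       # x: (batch, tokens, d_model)
        h = self.norm(x)
        w1 = self.hyper_w1(h)                   # (batch, tokens, d_hidden)
        w2 = self.hyper_w2(h)                   # (batch, tokens, d_hidden)
        # Mix information across the token axis using the generated weights.
        mixed = torch.bmm(w2, self.act(torch.bmm(w1.transpose(1, 2), h)))
        return x + mixed                        # residual connection

class HyperAttCovSketch(nn.Module):
    def __init__(self, vocab_size=5, d_model=64, d_hidden=128, lstm_hidden=64, n_classes=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)   # A, C, G, T, padding (assumed encoding)
        self.mixer = HyperMixerBlock(d_model, d_hidden)
        self.lstm = nn.LSTM(d_model, lstm_hidden, batch_first=True)
        self.attn = nn.MultiheadAttention(lstm_hidden, num_heads=4, batch_first=True)
        self.head = nn.Linear(lstm_hidden, n_classes)

    def forward(self, tokens):                  # tokens: (batch, seq_len) integer-encoded bases
        x = self.embed(tokens)
        x = self.mixer(x)                       # HyperMixer token mixing
        x, _ = self.lstm(x)                     # sequential modeling
        x, _ = self.attn(x, x, x)               # self-attention over LSTM states
        return self.head(x.mean(dim=1))         # mean-pool, then classify mutation / no mutation

# Example: score a batch of 8 dummy sequences of length 100.
model = HyperAttCovSketch()
logits = model(torch.randint(0, 5, (8, 100)))
print(logits.shape)                             # torch.Size([8, 2])
```

The design choice illustrated here is that the HyperMixer block mixes information across sequence positions before the LSTM models order, and attention then reweights the LSTM states prior to pooling; the paper itself should be consulted for the exact layer configuration and training setup.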
Pages: 2934-2944
Page count: 11
Related Papers
50 records in total
  • [41] Molecular Mechanisms Of COVID-19
    Li, Fang
    ACTA CRYSTALLOGRAPHICA A-FOUNDATION AND ADVANCES, 2022, 78 : A324 - A324
  • [42] Mechanisms of immunothrombosis in COVID-19
    Portier, Irina
    Campbell, Robert A.
    Denorme, Frederik
    CURRENT OPINION IN HEMATOLOGY, 2021, 28 (06) : 445 - 453
  • [43] Mechanisms of Stroke in COVID-19
    Spence, J. David
    de Freitas, Gabriel R.
    Pettigrew, L. Creed
    Ay, Hakan
    Liebeskind, David S.
    Kase, Carlos S.
    Del Brutto, Oscar H.
    Hankey, Graeme J.
    Venketasubramanian, Narayanaswamy
    CEREBROVASCULAR DISEASES, 2020, 49 (04) : 451 - 458
  • [44] A Novel Matrix Profile-Guided Attention LSTM Model for Forecasting COVID-19 Cases in USA
    Liu, Qian
    Fung, Daryl L. X.
    Lac, Leann
    Hu, Pingzhao
    FRONTIERS IN PUBLIC HEALTH, 2021, 9
  • [45] Adaptive Multi-Factor Quantitative Analysis and Prediction Models: Vaccination, Virus Mutation and Social Isolation on COVID-19
    Pei, Yuanyuan
    Li, Juan
    Xu, Songhua
    Xu, Yi
    FRONTIERS IN MEDICINE, 2022, 9
  • [46] Natural Language Interface for Covid-19 Amharic Database Using LSTM Encoder Decoder Architecture with Attention
    Degu, Ephrem Tadesse
    Aga, Rosa Tsegaye
    2021 INTERNATIONAL CONFERENCE ON INFORMATION AND COMMUNICATION TECHNOLOGY FOR DEVELOPMENT FOR AFRICA (ICT4DA), 2021, : 95 - 100
  • [47] Prediction of the incubation period for COVID-19 and future virus disease outbreaks
    Gussow, Ayal B.
    Auslander, Noam
    Wolf, Yuri I.
    Koonin, Eugene V.
    BMC BIOLOGY, 2020, 18 (01)
  • [48] Prediction of the incubation period for COVID-19 and future virus disease outbreaks
    Ayal B. Gussow
    Noam Auslander
    Yuri I. Wolf
    Eugene V. Koonin
    BMC Biology, 18
  • [49] Prediction of COVID-19 Data Using Improved ARIMA-LSTM Hybrid Forecast Models
    Jin, Yong-Chao
    Cao, Qian
    Wang, Ke-Nan
    Zhou, Yuan
    Cao, Yan-Peng
    Wang, Xi-Yin
    IEEE ACCESS, 2023, 11 : 67956 - 67967
  • [50] Analysis and Prediction of COVID-19 by using Recurrent LSTM Neural Network Model in Machine Learning
    Dharani, N. P.
    Bojja, Polaiah
    INTERNATIONAL JOURNAL OF ADVANCED COMPUTER SCIENCE AND APPLICATIONS, 2022, 13 (05) : 171 - 178