Construction and improvement of English vocabulary learning model integrating spiking neural network and convolutional long short-term memory algorithm

Cited by: 2
Authors
Wang, Yunxia [1]
Affiliations
[1] Nanyang Med Coll, Nanyang, Henan, Peoples R China
Source
PLOS ONE | 2024, Vol. 19, Issue 3
Keywords
CNN-LSTM; CLASSIFICATION; CONVLSTM
DOI
10.1371/journal.pone.0299425
Chinese Library Classification (CLC) Numbers
O [Mathematical Sciences and Chemistry]; P [Astronomy and Earth Sciences]; Q [Biological Sciences]; N [Natural Sciences (General)]
Discipline Classification Codes
07; 0710; 09
Abstract
To help non-native English speakers master English vocabulary quickly and improve their reading, writing, listening, speaking, and communication skills, this study designs, constructs, and improves an English vocabulary learning model that integrates the Spiking Neural Network (SNN) and Convolutional Long Short-Term Memory (ConvLSTM) algorithms. Fusing SNN and ConvLSTM exploits the strengths of SNN in processing temporal information and of ConvLSTM in modeling sequential data, yielding a fusion model that performs well in English vocabulary learning. Information transfer and interaction modules are added to optimize feature learning and temporal information processing, improving the model's vocabulary learning ability across different text contents. The training set is an open dataset drawn from the WordNet and Oxford English Corpus corpora. The model is implemented as a computer program and can be applied in English learning applications, online vocabulary learning platforms, or language education software. The experiments use the open dataset to generate test sets with text volumes ranging from 100 to 4000. The performance indicators of the proposed fusion model are compared with those of five traditional models, and the model is applied to the latest vocabulary exercises. From the learners' perspective, ten indicators are considered: accuracy, loss, polysemy processing accuracy, training time, syntactic structure capturing accuracy, vocabulary coverage, F1-score, context understanding accuracy, word sense disambiguation accuracy, and word order relation processing accuracy. The experimental results show that the fusion model performs better across different text volumes. For text volumes of 100-400, the accuracy is 0.75-0.77, the loss is below 0.45, the F1-score is above 0.75, the training time is within 300 s, and the remaining indicators exceed 65%. For text volumes of 500-1000, the accuracy is 0.81-0.83, the loss is at most 0.40, the F1-score is at least 0.78, the training time is within 400 s, and the remaining indicators exceed 70%. For text volumes of 1500-3000, the accuracy is 0.82-0.84, the loss is below 0.28, the F1-score is at least 0.78, the training time is within 600 s, and the remaining indicators exceed 70%. The fusion model also adapts to various question types in practical applications. As evaluated by professional teachers, the average scores for multiple-choice, fill-in-the-blank, spelling, matching, exercise, and synonym questions are 85.72, 89.45, 80.31, 92.15, 87.62, and 78.94, respectively, much higher than those of the traditional models. These results show that as text volume increases, the performance of the fusion model gradually improves, with higher accuracy and lower loss. In practical applications, the proposed fusion model works well on English learning tasks and is especially beneficial for people unfamiliar with English vocabulary structure, grammar, and question types. This study aims to provide efficient and accurate natural language processing tools that help non-native English speakers understand and apply the language more easily, improving English vocabulary learning and comprehension.
Pages: 14
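The abstract describes fusing a spiking branch with a ConvLSTM branch but does not spell out the architecture. The following is a minimal, hypothetical sketch of such a fusion in PyTorch: a leaky integrate-and-fire (LIF) layer turns embedded token sequences into spike trains, which are then stepped through a ConvLSTM cell before classification. All names (LIFLayer, ConvLSTMCell, FusionModel), the handling of the "information transfer and interaction modules", and every hyperparameter are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn


class LIFLayer(nn.Module):
    """Leaky integrate-and-fire neurons: accumulate input, emit binary spikes."""
    def __init__(self, decay=0.9, threshold=1.0):
        super().__init__()
        self.decay, self.threshold = decay, threshold

    def forward(self, x):                       # x: (batch, time, features)
        mem = torch.zeros_like(x[:, 0])
        spikes = []
        for t in range(x.size(1)):
            mem = self.decay * mem + x[:, t]    # leaky membrane integration
            spk = (mem >= self.threshold).float()
            mem = mem - spk * self.threshold    # soft reset after a spike
            spikes.append(spk)
        return torch.stack(spikes, dim=1)       # spike train, same shape as x


class ConvLSTMCell(nn.Module):
    """Single ConvLSTM cell: LSTM gates computed with a 1-D convolution."""
    def __init__(self, in_ch, hid_ch, k=3):
        super().__init__()
        self.hid_ch = hid_ch
        self.gates = nn.Conv1d(in_ch + hid_ch, 4 * hid_ch, k, padding=k // 2)

    def forward(self, x, state):                # x: (batch, in_ch, width)
        h, c = state
        i, f, o, g = torch.chunk(self.gates(torch.cat([x, h], dim=1)), 4, dim=1)
        c = torch.sigmoid(f) * c + torch.sigmoid(i) * torch.tanh(g)
        h = torch.sigmoid(o) * torch.tanh(c)
        return h, c


class FusionModel(nn.Module):
    """Embed tokens, run the spiking branch over time, feed the spike train
    into the ConvLSTM branch, and classify from the final hidden state."""
    def __init__(self, vocab_size, emb=64, hid=128, num_classes=10):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb)
        self.snn = LIFLayer()
        self.convlstm = ConvLSTMCell(in_ch=emb, hid_ch=hid)
        self.head = nn.Linear(hid, num_classes)

    def forward(self, tokens):                  # tokens: (batch, seq_len)
        x = self.embed(tokens)                  # (batch, seq_len, emb)
        spikes = self.snn(x)                    # temporal spike features
        h = torch.zeros(tokens.size(0), self.convlstm.hid_ch, 1)
        c = torch.zeros_like(h)
        for t in range(spikes.size(1)):         # step the ConvLSTM over time
            h, c = self.convlstm(spikes[:, t].unsqueeze(-1), (h, c))
        return self.head(h.squeeze(-1))


model = FusionModel(vocab_size=5000)
logits = model(torch.randint(0, 5000, (2, 20)))  # two 20-token sequences
print(logits.shape)                              # torch.Size([2, 10])
```

The sketch is forward-only: the hard spiking threshold in LIFLayer is not differentiable, so a trainable version would typically replace it with a surrogate-gradient spike function, and the paper's interaction modules would sit between the two branches rather than the plain hand-off shown here.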