Research on Volleyball Video Intelligent Description Technology Combining the Long-Term and Short-Term Memory Network and Attention Mechanism

Cited by: 3
Authors
Gao, Yuhua [1 ]
Mo, Yong [2 ]
Zhang, Heng [3 ]
Huang, Ruiyin [1 ]
Chen, Zilong [1 ]
Affiliations
[1] Guangzhou Sport Univ, Guangzhou 510500, Guangdong, Peoples R China
[2] Guangdong Baiyun Univ, Guangzhou 510450, Guangdong, Peoples R China
[3] Yingshan Cty 1 Middle Sch, Yingshan 438700, Hubei, Peoples R China
Keywords
DOI
10.1155/2021/7088837
CLC Classification
Q [Biological Sciences];
Subject Classification
07; 0710; 09
Abstract
With the development of computer technology, video description, which combines key techniques from natural language processing and computer vision, has attracted increasing research attention. A central challenge in this field is how to describe high-speed, detail-rich sports videos objectively and efficiently. Existing video description methods often lack sufficient language-learning information, which leads to sentence errors and loss of visual information in the generated description text. To address these problems, a multihead model combining the long short-term memory (LSTM) network and an attention mechanism is proposed for the intelligent description of volleyball videos. By introducing the attention mechanism, the model attends to salient regions of the video when generating sentences. Comparative experiments against different models show that the attention-equipped model effectively mitigates the loss of visual information. Compared with the LSTM and base models, the proposed multihead model combining the LSTM network and attention mechanism achieves higher scores on all evaluation metrics and significantly improves the quality of the intelligent text description of volleyball videos.
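To make the architecture described in the abstract more concrete, the following is a minimal, single-head sketch of an LSTM caption decoder with soft attention over per-frame video features, written in PyTorch. All class names, dimensions, and the additive (Bahdanau-style) attention form are illustrative assumptions; this is not the paper's multihead model or training setup.

```python
# Minimal sketch (not the authors' implementation) of an LSTM caption decoder
# with additive attention over per-frame video features. All names and
# dimensions below are assumptions for illustration only.
import torch
import torch.nn as nn


class AttentionLSTMCaptioner(nn.Module):
    def __init__(self, feat_dim=2048, embed_dim=512, hidden_dim=512, vocab_size=10000):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # Additive (Bahdanau-style) attention over frame features
        self.att_feat = nn.Linear(feat_dim, hidden_dim)
        self.att_hid = nn.Linear(hidden_dim, hidden_dim)
        self.att_score = nn.Linear(hidden_dim, 1)
        self.lstm = nn.LSTMCell(embed_dim + feat_dim, hidden_dim)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, frame_feats, captions):
        # frame_feats: (batch, n_frames, feat_dim); captions: (batch, seq_len) token ids
        batch = frame_feats.size(0)
        h = frame_feats.new_zeros(batch, self.lstm.hidden_size)
        c = frame_feats.new_zeros(batch, self.lstm.hidden_size)
        emb = self.embed(captions)  # (batch, seq_len, embed_dim)
        logits = []
        for t in range(captions.size(1)):
            # Attention weights over frames, conditioned on the previous hidden state,
            # so each generated word can focus on salient regions/frames of the video.
            scores = self.att_score(torch.tanh(
                self.att_feat(frame_feats) + self.att_hid(h).unsqueeze(1)))  # (batch, n_frames, 1)
            alpha = torch.softmax(scores, dim=1)
            context = (alpha * frame_feats).sum(dim=1)  # (batch, feat_dim)
            h, c = self.lstm(torch.cat([emb[:, t], context], dim=-1), (h, c))
            logits.append(self.out(h))
        return torch.stack(logits, dim=1)  # (batch, seq_len, vocab_size)


# Toy usage with random frame features and token ids
model = AttentionLSTMCaptioner()
feats = torch.randn(2, 20, 2048)            # 2 clips, 20 sampled frames each
tokens = torch.randint(0, 10000, (2, 12))   # 2 caption prefixes of 12 tokens
print(model(feats, tokens).shape)           # torch.Size([2, 12, 10000])
```

A multihead variant of the same idea would run several such attention scorers in parallel and concatenate their context vectors before the LSTM step; the single-head form is kept here only for brevity.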
Pages: 11