English Machine Translation Model Based on an Improved Self-Attention Technology

Cited by: 1
Authors
Pan, Wenxia [1]
Affiliations
[1] Wuhan City Polytech, Wuhan 430060, Peoples R China
Keywords
SENTIMENT ANALYSIS; PREDICTION; MECHANISM; NETWORKS;
DOI
10.1155/2021/2601480
CLC Number
TP31 [Computer Software]
Discipline Classification Code
081202; 0835
Abstract
English machine translation is a natural language processing research direction with important scientific and practical value in the current artificial intelligence boom. The variability of language, the limited ability to express semantic information, and the lack of parallel corpus resources all limit the usefulness and popularity of English machine translation in practical applications. The self-attention mechanism has received a lot of attention in English machine translation tasks because of its highly parallelizable computation, which reduces the model's training time and allows it to capture the semantic relevance of all words in the context. Unlike recurrent neural networks, however, the self-attention mechanism by itself ignores the position and structure information between context words. To give the model access to positional information, the English machine translation model based on the self-attention mechanism uses sine and cosine position coding to represent the absolute position of each word. This encoding, however, reflects relative distance but does not convey directionality. A new English machine translation model is therefore proposed, based on a logarithmic position representation method combined with the self-attention mechanism; it retains both the distance and direction information between words and the efficiency of the self-attention mechanism. Experiments show that the nonstrict phrase extraction method can effectively extract phrase translation pairs from the n-best word alignment results and that the extraction constraint strategy can further improve translation quality. Compared with traditional phrase extraction methods based on a single alignment, nonstrict phrase extraction over n-best alignment results significantly improves translation quality.
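The record gives no formulas, so the following is a minimal NumPy sketch, for illustration only, of the two positional schemes the abstract contrasts: standard absolute sine/cosine position coding, and a self-attention layer whose attention logits receive a signed logarithmic relative-position bias that preserves both distance and direction. The function names, the sign(j-i)*log(1+|j-i|) form of the bias, and the scaling factor alpha are assumptions, not the paper's exact logarithmic position representation, which this record does not specify.

```python
import numpy as np

def sinusoidal_positions(seq_len, d_model):
    """Standard absolute sine/cosine position encoding (as in the original Transformer)."""
    pos = np.arange(seq_len)[:, None]                      # (seq_len, 1)
    i = np.arange(d_model)[None, :]                        # (1, d_model)
    angle = pos / np.power(10000.0, (2 * (i // 2)) / d_model)
    enc = np.zeros((seq_len, d_model))
    enc[:, 0::2] = np.sin(angle[:, 0::2])                  # even dimensions: sine
    enc[:, 1::2] = np.cos(angle[:, 1::2])                  # odd dimensions: cosine
    return enc

def log_relative_bias(seq_len):
    """Hypothetical signed-logarithmic relative-position bias:
    sign(j - i) * log(1 + |j - i|), which keeps distance and direction."""
    idx = np.arange(seq_len)
    rel = idx[None, :] - idx[:, None]                      # j - i for every pair
    return np.sign(rel) * np.log1p(np.abs(rel))

def self_attention_with_log_positions(x, w_q, w_k, w_v, alpha=1.0):
    """Scaled dot-product self-attention with the logarithmic relative-position
    bias added to the attention logits before the softmax."""
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    d_k = q.shape[-1]
    logits = q @ k.T / np.sqrt(d_k)
    logits = logits + alpha * log_relative_bias(x.shape[0])  # inject distance + direction
    weights = np.exp(logits - logits.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)           # row-wise softmax
    return weights @ v

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    seq_len, d_model = 6, 8
    # Toy "embeddings" plus absolute positional encoding.
    x = rng.normal(size=(seq_len, d_model)) + sinusoidal_positions(seq_len, d_model)
    w_q, w_k, w_v = (0.1 * rng.normal(size=(d_model, d_model)) for _ in range(3))
    out = self_attention_with_log_positions(x, w_q, w_k, w_v)
    print(out.shape)  # (6, 8)
```

In this sketch the absolute sinusoidal encoding is added to the inputs as in a standard Transformer, while the signed logarithmic bias on the logits is the part that carries direction; a real implementation would learn the weighting rather than fix alpha.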
Pages: 11