English Machine Translation Model Based on an Improved Self-Attention Technology

Cited: 1
Author
Pan, Wenxia [1 ]
Institution
[1] Wuhan City Polytechnic, Wuhan 430060, People's Republic of China
Keywords
Sentiment analysis; Prediction; Mechanism; Networks
DOI
10.1155/2021/2601480
Chinese Library Classification (CLC)
TP31 [Computer software]
Subject Classification Codes
081202; 0835
Abstract
English machine translation is a natural language processing research direction with important scientific and practical value in the current artificial intelligence boom. The variability of language, the limited ability to express semantic information, and the scarcity of parallel corpus resources all limit the usefulness and popularity of English machine translation in practical applications. The self-attention mechanism has received much attention in English machine translation tasks because its highly parallelizable computation shortens model training time and because it captures the semantic relevance of all words in the context. Unlike recurrent neural networks, however, the self-attention mechanism ignores the position and structure information between context words. To let the model exploit positional information, English machine translation models based on self-attention use sine and cosine position coding to represent the absolute position of each word. This method can reflect relative distance between words but provides no directionality. Therefore, a new English machine translation model is proposed, based on a logarithmic position representation method combined with the self-attention mechanism. This model retains both the distance and direction information between words and the efficiency of the self-attention mechanism. Experiments show that the nonstrict phrase extraction method can effectively extract phrase translation pairs from n-best word alignment results and that the extraction constraint strategy further improves translation quality. Compared with traditional phrase extraction methods based on a single alignment, nonstrict phrase extraction over n-best alignment results significantly improves translation quality.
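The positional-encoding contrast described in the abstract can be sketched concretely. The snippet below shows the standard sine/cosine absolute encoding, whose inner products depend only on the unsigned distance between positions, next to a signed logarithmic relative-position bias in the spirit of the proposed method. The function names and the exact bias formula (`sign(j - i) * log1p(|j - i|)`) are illustrative assumptions; the abstract does not give the paper's formula.

```python
import numpy as np

def sinusoidal_encoding(seq_len, d_model):
    """Absolute sine/cosine position encoding.

    Dot products of these vectors depend on the relative distance
    |i - j| but not on its sign, which is the directionality gap
    the abstract describes."""
    pos = np.arange(seq_len)[:, None]                  # (seq_len, 1)
    i = np.arange(d_model // 2)[None, :]               # (1, d_model/2)
    angles = pos / np.power(10000.0, 2 * i / d_model)  # (seq_len, d_model/2)
    enc = np.zeros((seq_len, d_model))
    enc[:, 0::2] = np.sin(angles)  # even dimensions: sine
    enc[:, 1::2] = np.cos(angles)  # odd dimensions: cosine
    return enc

def log_relative_bias(seq_len):
    """Signed logarithmic relative-position bias (illustrative only).

    sign(j - i) preserves direction; log1p(|j - i|) compresses
    distance, so far-apart word pairs receive similar,
    small-magnitude offsets that could be added to attention scores."""
    idx = np.arange(seq_len)
    rel = idx[None, :] - idx[:, None]  # signed offsets j - i
    return np.sign(rel) * np.log1p(np.abs(rel))

pe = sinusoidal_encoding(6, 8)
bias = log_relative_bias(4)
```

Note that the bias matrix is antisymmetric (`bias[i][j] == -bias[j][i]`), so a word attending forward and a word attending backward over the same distance receive offsets of equal magnitude but opposite sign, which is precisely the directional information the sinusoidal scheme discards.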
Pages: 11
Related Papers
50 in total
  • [41] The Case for Translation-Invariant Self-Attention in Transformer-Based Language Models
    Wennberg, Ulme
    Henter, Gustav Eje
    [J]. ACL-IJCNLP 2021: THE 59TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS AND THE 11TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING, VOL 2, 2021 : 130 - 140
  • [42] Lane Detection Method Based on Improved Multi-Head Self-Attention
    Ge, Zekun
    Tao, Fazhan
    Fu, Zhumu
    Song, Shuzhong
    [J]. Computer Engineering and Applications, 60 (02) : 264 - 271
  • [43] RETRACTED: Research on Intelligent English Translation Method Based on the Improved Attention Mechanism Model (Retracted Article)
    Wang, Rong
    [J]. SCIENTIFIC PROGRAMMING, 2021, 2021
  • [44] Session interest model for CTR prediction based on self-attention mechanism
    Wang, Qianqian
    Liu, Fang'ai
    Zhao, Xiaohui
    Tan, Qiaoqiao
    [J]. SCIENTIFIC REPORTS, 2022, 12 (01)
  • [45] Solar irradiance prediction based on self-attention recursive model network
    Kang, Ting
    Wang, Huaizhi
    Wu, Ting
    Peng, Jianchun
    Jiang, Hui
    [J]. Frontiers in Energy Research, 2022, 10
  • [46] Automatic translation of spoken English based on improved machine learning algorithms
    Kang, Jie
    [J]. JOURNAL OF AMBIENT INTELLIGENCE AND HUMANIZED COMPUTING, 2021,
  • [47] Automatic translation of spoken English based on improved machine learning algorithm
    Lin, Lin
    Liu, Jie
    Zhang, Xuebing
    Liang, Xiufang
    [J]. JOURNAL OF INTELLIGENT & FUZZY SYSTEMS, 2021, 40 (02) : 2385 - 2395
  • [48] An Improved Hierarchical Phrase Based Machine Translation Model
    Liu, Zhanyi
    Liu, Ting
    Li, Sheng
    [J]. 2011 AASRI CONFERENCE ON APPLIED INFORMATION TECHNOLOGY (AASRI-AIT 2011), VOL 1, 2011 : 239 - 242
  • [49] Cascade Prediction model based on Dynamic Graph Representation and Self-Attention
    Zhang, F.
    Wang, X.
    Wang, R.
    Tang, Q.
    Han, Y.
    [J]. Dianzi Keji Daxue Xuebao/Journal of the University of Electronic Science and Technology of China, 2022, 51 (01) : 83 - 90
  • [50] Deep Learning-Based Identification of Maize Leaf Diseases Is Improved by an Attention Mechanism: Self-Attention
    Qian, Xiufeng
    Zhang, Chengqi
    Chen, Li
    Li, Ke
    [J]. FRONTIERS IN PLANT SCIENCE, 2022, 13