English Machine Translation Model Based on an Improved Self-Attention Technology

Cited: 1
Authors: Pan, Wenxia [1]
Affiliation: [1] Wuhan City Polytech, Wuhan 430060, Peoples R China
Keywords: SENTIMENT ANALYSIS; PREDICTION; MECHANISM; NETWORKS
DOI: 10.1155/2021/2601480
CLC Classification: TP31 [Computer Software]
Discipline Codes: 081202; 0835
Abstract
English machine translation is a natural language processing research direction with important scientific and practical value in the current artificial intelligence boom. The variability of language, the limited ability to express semantic information, and the scarcity of parallel corpus resources all limit the usefulness and popularity of English machine translation in practical applications. The self-attention mechanism has received considerable attention in English machine translation tasks because its highly parallelizable computation reduces training time and because it can capture the semantic relevance of all words in the context. Unlike recurrent neural networks, however, the self-attention mechanism ignores the position and structural information between context words. To make positional information available to the model, English machine translation models based on the self-attention mechanism represent the absolute position of each word with sine and cosine position encodings. This encoding, however, reflects relative distance but does not convey direction. Accordingly, a new English machine translation model is proposed that combines a logarithmic position representation with the self-attention mechanism, retaining both the distance and direction information between words and the efficiency of self-attention. Experiments show that the nonstrict phrase extraction method can effectively extract phrase translation pairs from n-best word alignment results and that the extraction constraint strategy further improves translation quality. Compared with traditional phrase extraction based on a single alignment, nonstrict phrase extraction over n-best alignments significantly improves translation quality.
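The abstract contrasts absolute sine/cosine position encoding with the proposed logarithmic position representation. The Python sketch below shows the standard sinusoidal encoding and a hypothetical signed, log-scaled relative-position bias added to the attention scores; the bias is an illustrative stand-in for the paper's logarithmic representation (whose exact formula is not given in this record), and all function names are invented for the example.

    import numpy as np

    def sinusoidal_encoding(seq_len, d_model):
        # Standard absolute sine/cosine position encoding (Vaswani et al., 2017).
        # Even dimensions get sin, odd dimensions get cos; d_model assumed even.
        pos = np.arange(seq_len)[:, None]          # (seq_len, 1)
        dim = np.arange(0, d_model, 2)[None, :]    # (1, d_model/2)
        angles = pos / np.power(10000.0, dim / d_model)
        pe = np.zeros((seq_len, d_model))
        pe[:, 0::2] = np.sin(angles)
        pe[:, 1::2] = np.cos(angles)
        return pe

    def log_relative_bias(seq_len):
        # Hypothetical logarithmic relative-position bias: sign(j - i) keeps
        # direction, log1p(|j - i|) compresses distance. Illustrative only;
        # not the paper's exact formulation.
        idx = np.arange(seq_len)
        rel = idx[None, :] - idx[:, None]          # signed offsets j - i
        return np.sign(rel) * np.log1p(np.abs(rel))

    def self_attention(X, Wq, Wk, Wv, bias=None):
        # Scaled dot-product self-attention with an optional additive bias.
        Q, K, V = X @ Wq, X @ Wk, X @ Wv
        scores = Q @ K.T / np.sqrt(K.shape[-1])
        if bias is not None:
            scores = scores + bias
        w = np.exp(scores - scores.max(axis=-1, keepdims=True))
        w = w / w.sum(axis=-1, keepdims=True)      # row-wise softmax
        return w @ V

    # Toy usage: 6 tokens, model width 8.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(6, 8)) + sinusoidal_encoding(6, 8)
    Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
    out = self_attention(X, Wq, Wk, Wv, bias=log_relative_bias(6))

Because the bias matrix is antisymmetric, the model can distinguish a word k positions to the left from one k positions to the right, whereas dot products of pure sinusoidal encodings depend only on the unsigned distance |i - j|, consistent with the directionality gap the abstract describes.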
Pages: 11
Related Papers (50 total)
  • [1] Chinese-English Machine Translation Model Based On Transfer Learning And Self-attention
    Ma, Shu
    [J]. JOURNAL OF APPLIED SCIENCE AND ENGINEERING, 2024, 27 (08): 3011 - 3019
  • [2] Re-Transformer: A Self-Attention Based Model for Machine Translation
    Liu, Huey-Ing
    Chen, Wei-Lin
    [J]. AI IN COMPUTATIONAL LINGUISTICS, 2021, 189 : 3 - 10
  • [3] Self-Attention and Dynamic Convolution Hybrid Model for Neural Machine Translation
    Zhang, Zhebin
    Wu, Sai
    Chen, Gang
    Jiang, Dawei
    [J]. 11TH IEEE INTERNATIONAL CONFERENCE ON KNOWLEDGE GRAPH (ICKG 2020), 2020, : 352 - 359
  • [4] X-Transformer: A Machine Translation Model Enhanced by the Self-Attention Mechanism
    Liu, Huey-Ing
    Chen, Wei-Lin
    [J]. APPLIED SCIENCES-BASEL, 2022, 12 (09):
  • [5] Fixed Encoder Self-Attention Patterns in Transformer-Based Machine Translation
    Raganato, Alessandro
    Scherrer, Yves
    Tiedemann, Jörg
    [J]. FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, EMNLP 2020, 2020, : 556 - 568
  • [6] Enhancing Machine Translation with Dependency-Aware Self-Attention
    Bugliarello, Emanuele
    Okazaki, Naoaki
    [J]. 58TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2020), 2020, : 1618 - 1627
  • [7] Multi-Granularity Self-Attention for Neural Machine Translation
    Hao, Jie
    Wang, Xing
    Shi, Shuming
    Zhang, Jinfeng
    Tu, Zhaopeng
    [J]. 2019 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING AND THE 9TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING (EMNLP-IJCNLP 2019): PROCEEDINGS OF THE CONFERENCE, 2019, : 887 - 897
  • [8] A neural machine translation method based on split graph convolutional self-attention encoding
    Wan, Fei
    Li, Ping
    [J]. PEERJ COMPUTER SCIENCE, 2024, 10
  • [9] Why Self-Attention? A Targeted Evaluation of Neural Machine Translation Architectures
    Tang, Gongbo
    Müller, Mathias
    Rios, Annette
    Sennrich, Rico
    [J]. 2018 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP 2018), 2018, : 4263 - 4272
  • [10] Contextualized dynamic meta embeddings based on Gated CNNs and self-attention for Arabic machine translation
    Bensalah, Nouhaila
    Ayad, Habib
    Adib, Abdellah
    El Farouk, Abdelhamid Ibn
    [J]. INTERNATIONAL JOURNAL OF INTELLIGENT COMPUTING AND CYBERNETICS, 2024, 17 (03) : 605 - 631