Incorporating Word Reordering Knowledge into Attention-based Neural Machine Translation

Cited by: 17
Authors:
Zhang, Jinchao [1 ]
Wang, Mingxuan [1 ]
Liu, Qun [1 ,3 ]
Zhou, Jie [2 ]
Affiliations:
[1] Chinese Acad Sci, Inst Comp Technol, Key Lab Intelligent Informat Proc, Beijing, Peoples R China
[2] Baidu Inc, Baidu Res Inst Deep Learning, Beijing, Peoples R China
[3] Dublin City Univ, Sch Comp, ADAPT Ctr, Dublin, Ireland
Funding: Science Foundation Ireland
DOI: 10.18653/v1/P17-1140
Chinese Library Classification: TP39 [Computer applications]
Discipline codes: 081203; 0835
Abstract:
This paper proposes three distortion models that explicitly incorporate word reordering knowledge into attention-based Neural Machine Translation (NMT) to further improve translation performance. The proposed models enable the attention mechanism to attend to source words with respect to both the semantic requirement and a word reordering penalty. Experiments on Chinese-English translation show that the approaches improve word alignment quality and achieve significant gains in translation quality over a basic attention-based NMT system. Compared with previous work on identical corpora, our system achieves state-of-the-art translation quality.
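The abstract describes the idea only at a high level. As a rough illustration, and not the paper's actual distortion models, the sketch below shows one plausible way to fold a distance-based reordering penalty into attention weights so that alignment reflects both content relevance and reordering cost. The function attention_with_distortion, the penalty form, and the penalty_weight hyperparameter are assumptions introduced here for clarity.

# Minimal sketch (illustrative only): blend content-based attention scores
# with a hypothetical distance-based distortion penalty.
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_with_distortion(query, source_states, prev_attn_pos, penalty_weight=1.0):
    # query         : (d,) decoder state at the current step
    # source_states : (n, d) encoder annotations of the source sentence
    # prev_attn_pos : index of the source word most attended at the previous step
    # penalty_weight: hypothetical hyperparameter trading off content vs. reordering
    n = source_states.shape[0]
    # Content-based (semantic) scores: simple dot-product attention here.
    content_scores = source_states @ query
    # Hypothetical distortion penalty: discourage large jumps away from the
    # position just after the previously attended word (a monotone-ish prior).
    positions = np.arange(n)
    distortion_penalty = -np.abs(positions - (prev_attn_pos + 1))
    scores = content_scores + penalty_weight * distortion_penalty
    return softmax(scores)

# Toy usage: 5 source words with 8-dimensional annotations.
rng = np.random.default_rng(0)
H = rng.normal(size=(5, 8))
q = rng.normal(size=8)
weights = attention_with_distortion(q, H, prev_attn_pos=2)
print(weights, weights.sum())

In this sketch the penalty is a fixed linear distance cost; the paper's distortion models instead learn how the reordering knowledge is incorporated, so the snippet should be read only as a schematic of where such a term enters the attention computation.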
Pages: 1524-1534 (11 pages)
Related papers (50 in total):
  • [1] Incorporating Statistical Machine Translation Word Knowledge Into Neural Machine Translation. Wang, Xing; Tu, Zhaopeng; Zhang, Min. IEEE/ACM Transactions on Audio, Speech, and Language Processing, 2018, 26(12): 2255-2266.
  • [2] Recursive Annotations for Attention-Based Neural Machine Translation. Ye, Shaolin; Guo, Wu. 2017 International Conference on Asian Language Processing (IALP), 2017: 164-167.
  • [3] Neural Machine Translation Models with Attention-Based Dropout Layer. Israr, Huma; Khan, Safdar Abbas; Tahir, Muhammad Ali; Shahzad, Muhammad Khuram; Ahmad, Muneer; Zain, Jasni Mohamad. CMC-Computers, Materials & Continua, 2023, 75(2): 2981-3009.
  • [4] An Effective Coverage Approach for Attention-based Neural Machine Translation. Hoang-Quan Nguyen; Thuan-Minh Nguyen; Huy-Hien Vu; Van-Vinh Nguyen; Phuong-Thai Nguyen; Thi-Nga-My Dao; Kieu-Hue Tran; Khac-Quy Dinh. Proceedings of the 2019 6th National Foundation for Science and Technology Development (NAFOSTED) Conference on Information and Computer Science (NICS), 2019: 240-245.
  • [5] Word Reordering as a Preprocessor for Machine Translation Systems. Devendrakumar, R. N.; Praveena, A. Proceedings of the 2017 International Conference on Inventive Systems and Control (ICISC 2017), 2017: 833-836.
  • [6] Neural Machine Translation with Reordering Embeddings. Chen, Kehai; Wang, Rui; Utiyama, Masao; Sumita, Eiichiro. 57th Annual Meeting of the Association for Computational Linguistics (ACL 2019), 2019: 1787-1799.
  • [7] A Novel Word Reordering Method for Statistical Machine Translation. Zang, Shuo; Zhao, Hai; Wu, Chunyang; Wang, Rui. 2015 12th International Conference on Fuzzy Systems and Knowledge Discovery (FSKD), 2015: 843-848.
  • [8] Towards the implementation of an Attention-based Neural Machine Translation with artificial pronunciation for Nahuatl as a mobile application. Bello Garcia, Sergio Khalil; Sanchez Lucero, Eduardo; Pedroza Mendez, Blanca Estela; Hernandez Hernandez, Jose Crispin; Bonilla Huerta, Edmundo; Ramirez Cruz, Jose Federico. 2020 8th Edition of the International Conference in Software Engineering Research and Innovation (CONISOFT 2020), 2020: 235-244.
  • [9] Incorporating Syntactic Knowledge in Neural Quality Estimation for Machine Translation. Ye, Na; Wang, Yuanyuan; Cai, Dongfeng. Machine Translation, CCMT 2019, 2019, 1104: 23-34.
  • [10] A Dependency-Based Neural Reordering Model for Statistical Machine Translation. Hadiwinoto, Christian; Ng, Hwee Tou. Thirty-First AAAI Conference on Artificial Intelligence, 2017: 109-115.