An Effective Coverage Approach for Attention-based Neural Machine Translation

Cited by: 0
Authors
Hoang-Quan Nguyen [1 ]
Thuan-Minh Nguyen [1 ]
Huy-Hien Vu [1 ]
Van-Vinh Nguyen [1 ]
Phuong-Thai Nguyen [1 ]
Thi-Nga-My Dao [2 ]
Kieu-Hue Tran [2 ]
Khac-Quy Dinh [1 ]
Affiliations
[1] Univ Engn & Technol, VNU Hanoi, Dept Comp Sci, Hanoi, Vietnam
[2] Univ Languages & Int Studies, VNU Hanoi, Fac Japanese Language & Culture, Hanoi, Vietnam
DOI
10.1109/nics48868.2019.9023793
CLC number
TP301 [theory, methods];
Subject classification code
081202 ;
Abstract
Neural Machine Translation has recently become the state-of-the-art approach in Machine Translation. One of its more advanced techniques, the attention model, tends not to use alignments from past translation steps and selects the context word purely from the computed attention score. Unfortunately, this sometimes leads to repetition and omission of important words in translations. To solve this problem, we propose a simple approach using coverage techniques that can be used in conjunction with a diverse range of attention models. Our experiments show that the improved technique increases translation quality on both the English-Vietnamese and Japanese-Vietnamese language pairs.
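The abstract describes penalizing attention with coverage information so the decoder stops revisiting source words it has already translated. A minimal sketch of that idea, assuming a simple additive penalty `lam` on accumulated attention (the function name, penalty form, and toy scores are illustrative, not the paper's exact formulation):

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def coverage_attention(step_scores, lam=1.0):
    """Decode with a coverage penalty: each source position's accumulated
    past attention weight is subtracted (scaled by lam) from its raw score,
    discouraging repeated attention to already-covered source words.

    step_scores: one list of raw attention scores per target step,
                 each of length src_len.
    Returns the per-step attention distributions.
    """
    src_len = len(step_scores[0])
    coverage = [0.0] * src_len          # accumulated attention per source word
    history = []
    for raw in step_scores:             # raw scores for one target step
        adjusted = [r - lam * c for r, c in zip(raw, coverage)]
        weights = softmax(adjusted)
        coverage = [c + w for c, w in zip(coverage, weights)]
        history.append(weights)
    return history

# Toy example: the first source word dominates the raw scores at both steps;
# with coverage, the second step shifts some attention away from it.
att = coverage_attention([[5.0, 0.0, 0.0], [5.0, 0.0, 0.0]], lam=2.0)
```

With `lam=0.0` this reduces to plain softmax attention; larger values trade fluency for stronger repetition suppression, which is the knob the coverage literature tunes.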
Pages: 240-245
Page count: 6