Modeling Coverage for Non-Autoregressive Neural Machine Translation

Cited by: 1
Authors
Shan, Yong
Feng, Yang [1 ]
Shao, Chenze
Affiliations
[1] Chinese Acad Sci ICT CAS, Inst Comp Technol, Key Lab Intelligent Informat Proc, Beijing, Peoples R China
DOI
10.1109/IJCNN52387.2021.9533529
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Non-Autoregressive Neural Machine Translation (NAT) achieves significant inference speedup by generating all tokens simultaneously. Despite its high efficiency, NAT typically suffers from two kinds of translation errors: over-translation (e.g., repeated tokens) and under-translation (e.g., missing translations), which ultimately limit translation quality. In this paper, we argue that these issues can be addressed through coverage modeling, which has proven useful in autoregressive decoding. We propose Coverage-NAT, which models coverage information directly through a token-level coverage iterative refinement mechanism and a sentence-level coverage agreement: the former reminds the model whether a source token has been translated, while the latter improves the semantic consistency between the translation and the source. Experimental results on the WMT14 En<->De and WMT16 En<->Ro translation tasks show that our method alleviates both kinds of errors and achieves strong improvements over the baseline system.
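The abstract gives no implementation details, but the coverage idea it describes can be made concrete. Below is a minimal sketch, not the authors' code: it assumes coverage is the column-wise sum of decoder-to-encoder attention, as in classic autoregressive coverage models, and that the sentence-level agreement is a penalty pushing each source token's coverage toward 1 (translated exactly once). All function names and the quadratic form of the loss are illustrative assumptions.

```python
# Minimal sketch (hypothetical, not the authors' implementation) of
# attention-based coverage for NAT.
import torch

def token_coverage(attn: torch.Tensor) -> torch.Tensor:
    """attn: (batch, tgt_len, src_len) attention weights; each row sums to 1.
    Returns per-source-token coverage of shape (batch, src_len): the total
    attention mass each source token received across all target positions.
    Coverage well above 1 hints at over-translation, well below 1 at
    under-translation."""
    return attn.sum(dim=1)

def coverage_agreement_loss(attn: torch.Tensor, src_mask: torch.Tensor) -> torch.Tensor:
    """src_mask: (batch, src_len), 1.0 for real tokens, 0.0 for padding.
    A hypothetical sentence-level agreement term penalizing deviation of
    each source token's coverage from 1."""
    cov = token_coverage(attn)
    penalty = (cov - 1.0).pow(2) * src_mask  # ignore padded positions
    return penalty.sum() / src_mask.sum()

# Toy usage: batch of 2, 5 target positions, 7 source positions, no padding.
attn = torch.softmax(torch.randn(2, 5, 7), dim=-1)
loss = coverage_agreement_loss(attn, torch.ones(2, 7))
```

In an iterative-refinement setting, a term like this could be recomputed after each refinement pass so that later passes correct source tokens whose coverage is still far from 1; the exact mechanism used in the paper may differ.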
Pages: 8
Related Papers (showing 10 of 50)
  • [1] A Survey of Non-Autoregressive Neural Machine Translation
    Li, Feng
    Chen, Jingxian
    Zhang, Xuejun
    [J]. ELECTRONICS, 2023, 12 (13)
  • [2] Glancing Transformer for Non-Autoregressive Neural Machine Translation
    Qian, Lihua
    Zhou, Hao
    Bao, Yu
    Wang, Mingxuan
    Qiu, Lin
    Zhang, Weinan
    Yu, Yong
    Li, Lei
    [J]. 59TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS AND THE 11TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING, VOL 1 (ACL-IJCNLP 2021), 2021, : 1993 - 2003
  • [3] Learning to Rewrite for Non-Autoregressive Neural Machine Translation
    Geng, Xinwei
    Feng, Xiaocheng
    Qin, Bing
    [J]. 2021 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP 2021), 2021, : 3297 - 3308
  • [4] Imitation Learning for Non-Autoregressive Neural Machine Translation
    Wei, Bingzhen
    Wang, Mingxuan
    Zhou, Hao
    Lin, Junyang
    Sun, Xu
    [J]. 57TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2019), 2019, : 1304 - 1312
  • [5] Uncertainty-aware non-autoregressive neural machine translation
    Liu, Chuanming
    Yu, Jingqi
    [J]. COMPUTER SPEECH AND LANGUAGE, 2023, 78
  • [6] Non-autoregressive neural machine translation with auxiliary representation fusion
    Du, Quan
    Feng, Kai
    Xu, Chen
    Xiao, Tong
    Zhu, Jingbo
    [J]. JOURNAL OF INTELLIGENT & FUZZY SYSTEMS, 2021, 41 (06) : 7229 - 7239
  • [7] Improving Non-autoregressive Neural Machine Translation with Monolingual Data
    Zhou, Jiawei
    Keung, Phillip
    [J]. 58TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2020), 2020, : 1893 - 1898
  • [8] Non-Autoregressive Neural Machine Translation with Enhanced Decoder Input
    Guo, Junliang
    Tan, Xu
    He, Di
    Qin, Tao
    Xu, Linli
    Liu, Tie-Yan
    [J]. THIRTY-THIRD AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE / THIRTY-FIRST INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE / NINTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2019, : 3723 - 3730
  • [9] A Survey on Non-Autoregressive Generation for Neural Machine Translation and Beyond
    Xiao, Yisheng
    Wu, Lijun
    Guo, Junliang
    Li, Juntao
    Zhang, Min
    Qin, Tao
    Liu, Tie-Yan
[J]. IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2023, 45 (10) : 11407 - 11427
  • [10] Fully Non-autoregressive Neural Machine Translation: Tricks of the Trade
    Gu, Jiatao
    Kong, Xiang
    [J]. FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, ACL-IJCNLP 2021, 2021, : 120 - 133