Guiding Non-Autoregressive Neural Machine Translation Decoding with Reordering Information

Cited by: 0
Authors
Ran, Qiu [1]
Lin, Yankai [1]
Li, Peng [1]
Zhou, Jie [1]
Affiliations
[1] Pattern Recognition Center, WeChat AI, Tencent Inc., Shenzhen, People's Republic of China
Keywords: none listed
DOI: not available
Chinese Library Classification (CLC): TP18 [Theory of Artificial Intelligence]
Discipline codes: 081104; 0812; 0835; 1405
Abstract
Non-autoregressive neural machine translation (NAT) generates all target words in parallel and has achieved promising inference acceleration. However, existing NAT models still lag behind autoregressive neural machine translation models in translation quality because of the multimodality problem: the target words may come from multiple feasible translations. To address this problem, we propose ReorderNAT, a novel NAT framework that explicitly models reordering information to guide NAT decoding. Specifically, ReorderNAT uses deterministic and non-deterministic decoding strategies that leverage the reordering information as a proxy for the final translation, encouraging the decoder to choose words belonging to the same translation. Experimental results on several widely used datasets show that the proposed model outperforms most existing NAT models and even achieves translation quality comparable to autoregressive translation models, with a significant speedup.
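To make the two-step idea in the abstract concrete, the sketch below shows a minimal reordering-guided NAT decoder in PyTorch: a reordering module first arranges source-side representations into target word order (a "pseudo-translation"), and a non-autoregressive decoder then emits every target word in one parallel pass. This is a conceptual illustration only, not the authors' ReorderNAT implementation; the class name ReorderGuidedNAT, the module shapes, and the argmax-based deterministic guiding step are all assumptions made for the sketch.

```python
import torch
import torch.nn as nn


class ReorderGuidedNAT(nn.Module):
    """Illustrative reordering-guided NAT model (names and shapes are assumptions)."""

    def __init__(self, src_vocab, tgt_vocab, d_model=256, nhead=4,
                 nlayers=2, max_tgt_len=512):
        super().__init__()
        self.src_embed = nn.Embedding(src_vocab, d_model)
        self.tgt_pos = nn.Embedding(max_tgt_len, d_model)
        enc_layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(enc_layer, nlayers)
        # Reordering module: each target position queries the source states to
        # decide which source word should occupy that slot (a soft permutation).
        self.reorder = nn.Linear(d_model, d_model)
        dec_layer = nn.TransformerDecoderLayer(d_model, nhead, batch_first=True)
        self.decoder = nn.TransformerDecoder(dec_layer, nlayers)
        self.tgt_out = nn.Linear(d_model, tgt_vocab)

    @torch.no_grad()
    def translate(self, src_ids, tgt_len):
        # src_ids: (batch, src_len) source token ids; tgt_len: target length.
        enc = self.encoder(self.src_embed(src_ids))                    # (B, S, D)
        pos = torch.arange(tgt_len, device=src_ids.device)
        queries = self.tgt_pos(pos).unsqueeze(0) + enc.mean(1, keepdim=True)
        # Deterministic guiding (illustrative): for each target slot, pick the
        # single most likely source position and copy its encoder state,
        # yielding a "pseudo-translation" already in target word order.
        scores = self.reorder(queries) @ enc.transpose(1, 2)           # (B, T, S)
        idx = scores.argmax(-1, keepdim=True).expand(-1, -1, enc.size(-1))
        pseudo = torch.gather(enc, 1, idx)                             # (B, T, D)
        # One parallel (non-autoregressive) pass: all target words at once.
        dec = self.decoder(pseudo, enc)                                # (B, T, D)
        return self.tgt_out(dec).argmax(-1)                            # (B, T)


if __name__ == "__main__":
    model = ReorderGuidedNAT(src_vocab=1000, tgt_vocab=1000)
    out = model.translate(torch.randint(0, 1000, (2, 7)), tgt_len=9)
    print(out.shape)  # torch.Size([2, 9])
```

The single argmax copy stands in for the paper's deterministic guiding strategy; a non-deterministic variant would instead sample or marginalize over the reordering distribution rather than committing to one permutation.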
Pages: 13727-13735 (9 pages)