Rephrasing the Reference for Non-autoregressive Machine Translation

Cited by: 0
Authors
Shao, Chenze [1 ,2 ]
Zhang, Jinchao [3 ]
Zhou, Jie [3 ]
Feng, Yang [1 ,2 ,3 ]
Affiliations
[1] Chinese Acad Sci, Inst Comp Technol, Key Lab Intelligent Informat Proc, Beijing, Peoples R China
[2] Univ Chinese Acad Sci, Beijing, Peoples R China
[3] Tencent Inc, Pattern Recognit Ctr, WeChat AI, Shenzhen, Peoples R China
Keywords
DOI
Not available
CLC classification
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Non-autoregressive neural machine translation (NAT) models suffer from the multi-modality problem: a source sentence may have multiple valid translations, so the reference sentence can be an inappropriate training target when the NAT output is closer to another translation. In response, we introduce a rephraser that provides a better training target for NAT by rephrasing the reference sentence according to the NAT output. Since NAT is trained on the rephraser output rather than the reference sentence, the rephraser output should fit the NAT output well without deviating too far from the reference; both requirements can be quantified as reward functions and optimized with reinforcement learning. Experiments on major WMT benchmarks and NAT baselines show that our approach consistently improves the translation quality of NAT. In particular, our best variant achieves performance comparable to the autoregressive Transformer while being 14.7 times more efficient in inference.
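The abstract only states the training signal at a high level: the rephrased target is rewarded both for agreeing with the NAT output and for staying close to the reference. As an illustration only — the paper's actual reward functions are not reproduced here — the following is a minimal sketch of such a combined reward, using simple n-gram precision as a stand-in for both terms; all names (`ngram_overlap`, `rephraser_reward`, `alpha`) are hypothetical:

```python
from collections import Counter


def ngram_overlap(candidate: str, other: str, n: int = 2) -> float:
    """Fraction of the candidate's n-grams that also occur in `other`
    (clipped counts, like modified n-gram precision in BLEU)."""
    cand = candidate.split()
    ref = other.split()
    cand_ngrams = Counter(tuple(cand[i:i + n]) for i in range(len(cand) - n + 1))
    ref_ngrams = Counter(tuple(ref[i:i + n]) for i in range(len(ref) - n + 1))
    total = sum(cand_ngrams.values())
    if total == 0:
        return 0.0
    matched = sum(min(c, ref_ngrams[g]) for g, c in cand_ngrams.items())
    return matched / total


def rephraser_reward(rephrased: str, nat_output: str, reference: str,
                     alpha: float = 0.5) -> float:
    """Hypothetical combined reward: a convex mix of (a) fit between the
    rephrased target and the NAT output and (b) faithfulness of the
    rephrased target to the original reference."""
    fit = ngram_overlap(rephrased, nat_output)
    faithfulness = ngram_overlap(rephrased, reference)
    return alpha * fit + (1 - alpha) * faithfulness
```

In a reinforcement-learning setup such as the one the abstract describes, a scalar reward of this shape would be maximized over the rephraser's sampled outputs (e.g., via policy gradient); the weighting `alpha` trades off matching the NAT output against preserving the reference.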
Pages: 13538–13546
Page count: 9
Related papers (50 in total)
  • [41] Ran, Qiu; Lin, Yankai; Li, Peng; Zhou, Jie. Guiding Non-Autoregressive Neural Machine Translation Decoding with Reordering Information. Proceedings of the Thirty-Fifth AAAI Conference on Artificial Intelligence, 2021, 35: 13727–13735.
  • [42] Kong, Xiang; Zhang, Zhisong; Hovy, Eduard. Incorporating a Local Translation Mechanism into Non-autoregressive Translation. arXiv, 2020.
  • [43] Kong, Xiang; Zhang, Zhisong; Hovy, Eduard. Incorporating a Local Translation Mechanism into Non-autoregressive Translation. Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), 2020: 1067–1073.
  • [44] Ma, Zhengrui; Zhang, Shaolei; Guo, Shoutao; Shao, Chenze; Zhang, Min; Feng, Yang. Non-autoregressive Streaming Transformer for Simultaneous Translation. Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing (EMNLP 2023), 2023: 5177–5190.
  • [45] Liu, Ye; Wan, Yao; Zhang, Jian-Guo; Zhao, Wenting; Yu, Philip S. Enriching Non-Autoregressive Transformer with Syntactic and Semantic Structures for Neural Machine Translation. Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics (EACL 2021), 2021: 1235–1244.
  • [46] Liu, Jinglin; Ren, Yi; Tan, Xu; Zhang, Chen; Qin, Tao; Zhao, Zhou; Liu, Tie-Yan. Task-Level Curriculum Learning for Non-Autoregressive Neural Machine Translation. Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence, 2020: 3861–3867.
  • [47] Norouzi, Sajad; Hosseinzadeh, Rasa; Perez, Felipe; Volkovs, Maksims. DiMS: Distilling Multiple Steps of Iterative Non-Autoregressive Transformers for Machine Translation. Findings of the Association for Computational Linguistics (ACL 2023), 2023: 8538–8553.
  • [48] Shao, Chenze; Zhang, Jinchao; Feng, Yang; Meng, Fandong; Zhou, Jie. Minimizing the Bag-of-Ngrams Difference for Non-Autoregressive Neural Machine Translation. Proceedings of the Thirty-Fourth AAAI Conference on Artificial Intelligence, 2020, 34: 198–205.
  • [49] Hao, Yongchang; He, Shilin; Jiao, Wenxiang; Tu, Zhaopeng; Lyu, Michael R.; Wang, Xing. Multi-Task Learning with Shared Encoder for Non-Autoregressive Machine Translation. Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (NAACL-HLT 2021), 2021: 3989–3996.
  • [50] Zhu, Minghao; Wang, Junli; Yan, Chungang. Non-Autoregressive Neural Machine Translation with Consistency Regularization Optimized Variational Framework. Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (NAACL 2022), 2022: 607–617.