Order-Agnostic Cross Entropy for Non-Autoregressive Machine Translation

Cited by: 0
Authors
Du, Cunxiao [1 ]
Tu, Zhaopeng [2 ]
Jiang, Jing [1 ]
Affiliations
[1] Singapore Management University, School of Computing and Information Systems, Singapore
[2] Tencent AI Lab, Shenzhen, People's Republic of China
Funding
National Research Foundation, Singapore
Keywords
DOI: not available
CLC Classification
TP18 [Theory of Artificial Intelligence]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
We propose a new training objective named order-agnostic cross entropy (OAXE) for fully non-autoregressive translation (NAT) models. OAXE improves the standard cross-entropy loss to ameliorate the effect of word reordering, which is a common source of the critical multimodality problem in NAT. Concretely, OAXE removes the penalty for word order errors and computes the cross-entropy loss based on the best possible alignment between model predictions and target tokens. Since the log loss is very sensitive to invalid references, we leverage cross-entropy initialization and loss truncation to ensure the model focuses on a good part of the search space. Extensive experiments on major WMT benchmarks show that OAXE substantially improves translation performance, setting a new state of the art for fully NAT models. Further analyses show that OAXE alleviates the multimodality problem by reducing token repetitions and increasing prediction confidence. Our code, data, and trained models are available at https://github.com/tencent-ailab/ICML21_OAXE.
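The alignment step described in the abstract can be sketched compactly: build an N x N cost matrix of negative log-probabilities and solve a minimum-cost bipartite matching between output positions and reference tokens. Below is a minimal illustration, assuming a PyTorch model and SciPy's Hungarian solver; the function name oaxe_loss and the toy shapes are ours for illustration, not the authors' released implementation.

import torch
from scipy.optimize import linear_sum_assignment

def oaxe_loss(log_probs: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
    """Cross entropy under the best one-to-one alignment of output
    positions to target tokens (order-agnostic cross entropy, sketched).

    log_probs: (N, V) log-probabilities over the vocabulary for N positions.
    target:    (N,)   target token ids; NAT fixes the output length to N.
    """
    # cost[i, j] = -log P(target[j] | position i): the price of assigning
    # the j-th reference token to the i-th output position.
    cost = -log_probs[:, target]                                # (N, N)
    # The Hungarian algorithm finds the minimum-cost assignment, i.e. the
    # ordering of the reference that the model scores highest.
    row, col = linear_sum_assignment(cost.detach().cpu().numpy())
    matched = cost[torch.as_tensor(row), torch.as_tensor(col)]
    # The abstract also mentions loss truncation (dropping the worst-aligned
    # examples so invalid references do not dominate); omitted here for brevity.
    return matched.mean()

# Toy usage: 5 output positions, vocabulary of 100 tokens.
logits = torch.randn(5, 100, requires_grad=True)
target = torch.tensor([7, 3, 42, 3, 99])
loss = oaxe_loss(torch.log_softmax(logits, dim=-1), target)
loss.backward()

Because the cost decomposes over (position, token) pairs, the best of the N! candidate orderings is found in polynomial time rather than by enumeration, and gradients flow only through the matched entries.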
Pages: 11
Related Papers
50 items in total
  • [1] Aligned Cross Entropy for Non-Autoregressive Machine Translation
    Ghazvininejad, Marjan
    Karpukhin, Vladimir
    Zettlemoyer, Luke
    Levy, Omer
    [J]. INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 119, 2020, 119
  • [2] Integrating Translation Memories into Non-Autoregressive Machine Translation
    Xu, Jitao
    Crego, Josep
    Yvon, Francois
    [J]. 17TH CONFERENCE OF THE EUROPEAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, EACL 2023, 2023, : 1326 - 1338
  • [3] Enhanced encoder for non-autoregressive machine translation
    Wang, Shuheng
    Shi, Shumin
    Huang, Heyan
    [J]. MACHINE TRANSLATION, 2021, 35 (04) : 595 - 609
  • [4] Acyclic Transformer for Non-Autoregressive Machine Translation
    Huang, Fei
    Zhou, Hao
    Liu, Yang
    Li, Hang
    Huang, Minlie
    [J]. INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 162, 2022,
  • [5] Non-Autoregressive Machine Translation with Auxiliary Regularization
    Wang, Yiren
    Tian, Fei
    He, Di
    Qin, Tao
    Zhai, ChengXiang
    Liu, Tie-Yan
    [J]. THIRTY-THIRD AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE / THIRTY-FIRST INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE / NINTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2019, : 5377 - 5384
  • [6] A Survey of Non-Autoregressive Neural Machine Translation
    Li, Feng
    Chen, Jingxian
    Zhang, Xuejun
    [J]. ELECTRONICS, 2023, 12 (13)
  • [7] Non-Autoregressive Machine Translation with Latent Alignments
    Saharia, Chitwan
    Chan, William
    Saxena, Saurabh
    Norouzi, Mohammad
    [J]. PROCEEDINGS OF THE 2020 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP), 2020, : 1098 - 1108
  • [8] Modeling Coverage for Non-Autoregressive Neural Machine Translation
    Shan, Yong
    Feng, Yang
    Shao, Chenze
    [J]. 2021 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2021,
  • [9] Incorporating history and future into non-autoregressive machine translation
    Wang, Shuheng
    Huang, Heyan
    Shi, Shumin
    [J]. COMPUTER SPEECH AND LANGUAGE, 2022, 77