Non-Autoregressive Translation by Learning Target Categorical Codes

Cited by: 0
Authors
Bao, Yu [1 ,2 ]
Huang, Shujian [1 ,2 ]
Xiao, Tong [3 ]
Wang, Dongqi [1 ,2 ]
Dai, Xinyu [1 ,2 ]
Chen, Jiajun [1 ,2 ]
Affiliations
[1] Nanjing Univ, Natl Key Lab Novel Software Technol, Nanjing, Peoples R China
[2] Collaborat Innovat Ctr Novel Software Technol & I, Suzhou, Peoples R China
[3] NiuTrans Co Ltd, Shenyang, Peoples R China
Funding
National Science Foundation (US); National Key Research and Development Program of China
Keywords
(none listed)
DOI
Not available
CLC Number
TP18 [Theory of Artificial Intelligence]
Discipline Codes
081104; 0812; 0835; 1405
摘要
The non-autoregressive Transformer is a promising text generation model. However, current non-autoregressive models still fall behind their autoregressive counterparts in translation quality. We attribute this accuracy gap to the lack of dependency modeling among the decoder inputs. In this paper, we propose CNAT, which introduces implicitly learned categorical codes as latent variables into non-autoregressive decoding. The interaction among these categorical codes remedies the missing dependencies and improves the model capacity. Experimental results show that our model achieves comparable or better performance on machine translation tasks than several strong baselines.
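The abstract does not spell out how the categorical codes are obtained; a standard way to learn such discrete latent codes is vector quantization (as in VQ-VAE-style models), where each decoder-input representation is assigned to its nearest codebook entry. The sketch below is only an illustration of that assignment step, with hypothetical sizes (`K` codes, `d`-dimensional states, target length `T`) and a random, untrained codebook; it is not the CNAT implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: K categorical codes, d-dimensional hidden states,
# T target positions. In a real model the codebook would be learned.
K, d, T = 8, 16, 5
codebook = rng.normal(size=(K, d))   # code embeddings (one row per code)
hidden = rng.normal(size=(T, d))     # decoder-input representations

def quantize(h, codebook):
    """Assign each hidden state to its nearest categorical code (squared L2)."""
    # d2: (T, K) matrix of squared distances from each state to each code
    d2 = ((h[:, None, :] - codebook[None, :, :]) ** 2).sum(-1)
    codes = d2.argmin(axis=1)        # discrete latent code per target position
    return codes, codebook[codes]    # code indices and their embeddings

codes, quantized = quantize(hidden, codebook)
print(codes.shape, quantized.shape)  # (5,) (5, 16)
```

In training, the quantized embeddings would replace (or augment) the decoder inputs, so that modeling interactions among the `K` discrete codes stands in for the token-level dependencies that non-autoregressive decoding otherwise ignores.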
Pages: 5749-5759
Page count: 11
Related Papers (10 of 50 shown)
  • [1] Learning to Rewrite for Non-Autoregressive Neural Machine Translation
    Geng, Xinwei
    Feng, Xiaocheng
    Qin, Bing
    [J]. 2021 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP 2021), 2021, : 3297-3308
  • [2] Imitation Learning for Non-Autoregressive Neural Machine Translation
    Wei, Bingzhen
    Wang, Mingxuan
    Zhou, Hao
    Lin, Junyang
    Sun, Xu
    [J]. 57TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2019), 2019, : 1304-1312
  • [3] Incorporating a local translation mechanism into non-autoregressive translation
    Kong, Xiang
    Zhang, Zhisong
    Hovy, Eduard
    [J]. arXiv, 2020,
  • [4] Integrating Translation Memories into Non-Autoregressive Machine Translation
    Xu, Jitao
    Crego, Josep
    Yvon, Francois
    [J]. 17TH CONFERENCE OF THE EUROPEAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, EACL 2023, 2023, : 1326-1338
  • [5] On the Learning of Non-Autoregressive Transformers
    Huang, Fei
    Tao, Tianhua
    Zhou, Hao
    Li, Lei
    Huang, Minlie
    [J]. INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 162, 2022,
  • [6] Incorporating a Local Translation Mechanism into Non-autoregressive Translation
    Kong, Xiang
    Zhang, Zhisong
    Hovy, Eduard
    [J]. PROCEEDINGS OF THE 2020 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP), 2020, : 1067-1073
  • [7] Enhanced encoder for non-autoregressive machine translation
    Wang, Shuheng
    Shi, Shumin
    Huang, Heyan
    [J]. MACHINE TRANSLATION, 2021, 35 (04) : 595-609
  • [8] Acyclic Transformer for Non-Autoregressive Machine Translation
    Huang, Fei
    Zhou, Hao
    Liu, Yang
    Li, Hang
    Huang, Minlie
    [J]. INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 162, 2022,
  • [9] Non-Autoregressive Machine Translation with Auxiliary Regularization
    Wang, Yiren
    Tian, Fei
    He, Di
    Qin, Tao
    Zhai, ChengXiang
    Liu, Tie-Yan
    [J]. THIRTY-THIRD AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE / THIRTY-FIRST INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE / NINTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2019, : 5377-5384
  • [10] Non-Autoregressive Machine Translation with Latent Alignments
    Saharia, Chitwan
    Chan, William
    Saxena, Saurabh
    Norouzi, Mohammad
    [J]. PROCEEDINGS OF THE 2020 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP), 2020, : 1098-1108