Modeling Fluency and Faithfulness for Diverse Neural Machine Translation

Cited by: 0
Authors
Feng, Yang [1 ,2 ]
Xie, Wanying [1 ,3 ]
Gu, Shuhao [1 ,2 ]
Shao, Chenze [1 ,2 ]
Zhang, Wen [4 ]
Yang, Zhengxin [1 ,2 ]
Yu, Dong [3 ]
Affiliations
[1] Chinese Acad Sci ICT CAS, Key Lab Intelligent Informat Proc, Inst Comp Technol, Guangzhou, Peoples R China
[2] Univ Chinese Acad Sci, Beijing, Peoples R China
[3] Beijing Language & Culture Univ, Beijing, Peoples R China
[4] Smart Platform Prod Dept Tencent Inc, Beijing, Peoples R China
Funding
National Key R&D Program of China; National Natural Science Foundation of China;
Keywords
DOI
Not available
Chinese Library Classification
TP18 [Theory of Artificial Intelligence];
Discipline Codes
081104; 0812; 0835; 1405;
摘要
Neural machine translation models are usually trained with the teacher forcing strategy, which requires that the predicted sequence match the ground truth word by word and forces the probability of each prediction to approach a 0-1 distribution. However, this strategy assigns the entire probability mass to the ground truth word and ignores all other words in the target vocabulary, even when the ground truth word cannot dominate the distribution. To address this problem, we propose introducing an evaluation module that guides the distribution of each prediction. The evaluation module assesses each prediction from the perspectives of fluency and faithfulness, encouraging the model to generate words that connect fluently with the past and future translation and, at the same time, form a translation equivalent in meaning to the source. Experiments on multiple translation tasks show that our method achieves significant improvements over strong baselines.
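The contrast the abstract draws, between a hard one-hot training target and an evaluation-guided target that leaves probability mass on plausible alternatives, can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the `eval_scores` vector stands in for the output of the fluency/faithfulness evaluation module, and `alpha` is an assumed mixing weight, neither of which comes from the paper.

```python
import numpy as np

def softmax(logits):
    """Numerically stable softmax over a vocabulary-sized logit vector."""
    z = logits - logits.max()
    e = np.exp(z)
    return e / e.sum()

def teacher_forcing_loss(logits, target_idx):
    """Standard teacher forcing: cross-entropy against a one-hot (0-1)
    target, which pushes the entire probability mass onto the
    ground-truth word."""
    probs = softmax(logits)
    return -np.log(probs[target_idx])

def evaluation_guided_loss(logits, target_idx, eval_scores, alpha=0.5):
    """Hypothetical sketch: interpolate the one-hot target with a
    distribution derived from per-word evaluation scores (a stand-in for
    the paper's fluency/faithfulness module), so that well-evaluated
    alternatives retain some probability mass."""
    probs = softmax(logits)
    soft = eval_scores / eval_scores.sum()   # normalize scores to a distribution
    one_hot = np.zeros_like(probs)
    one_hot[target_idx] = 1.0
    target = (1.0 - alpha) * one_hot + alpha * soft
    return -(target * np.log(probs)).sum()   # cross-entropy vs. the mixed target
```

With `alpha=0.0` the guided loss reduces to plain teacher forcing; larger `alpha` shifts more of the target mass toward words the evaluation scores favor.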
Pages: 59-66
Page count: 8
Related Papers
50 items
  • [11] Modeling Future Cost for Neural Machine Translation
    Duan, Chaoqun
    Chen, Kehai
    Wang, Rui
    Utiyama, Masao
    Sumita, Eiichiro
    Zhu, Conghui
    Zhao, Tiejun
    IEEE-ACM TRANSACTIONS ON AUDIO SPEECH AND LANGUAGE PROCESSING, 2021, 29 : 770 - 781
  • [12] Towards More Diverse Input Representation for Neural Machine Translation
    Chen, Kehai
    Wang, Rui
    Utiyama, Masao
    Sumita, Eiichiro
    Zhao, Tiejun
    Yang, Muyun
    Zhao, Hai
    IEEE-ACM TRANSACTIONS ON AUDIO SPEECH AND LANGUAGE PROCESSING, 2020, 28 : 1586 - 1597
  • [13] Diverse Machine Translation with Translation Memory
    Zhang, Yi
    Zhao, Jing
    Sun, Shiliang
    2022 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2022,
  • [14] Faithfulness in Translation
    Ma Jianhe
Shanghai Journal of Translators, 1994, (03) : 5 - 8
  • [16] MODELING HOMOPHONE NOISE FOR ROBUST NEURAL MACHINE TRANSLATION
    Qin, Wenjie
    Li, Xiang
    Sun, Yuhui
    Xiong, Deyi
    Cui, Jianwei
    Wang, Bin
    2021 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP 2021), 2021, : 7533 - 7537
  • [17] Towards Modeling the Style of Translators in Neural Machine Translation
    Wang, Yue
    Hoang, Cuong
    Federico, Marcello
    2021 CONFERENCE OF THE NORTH AMERICAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS: HUMAN LANGUAGE TECHNOLOGIES (NAACL-HLT 2021), 2021, : 1193 - 1199
  • [18] Explicitly Modeling Word Translations in Neural Machine Translation
    Han, Dong
    Li, Junhui
    Li, Yachao
    Zhang, Min
    Zhou, Guodong
    ACM TRANSACTIONS ON ASIAN AND LOW-RESOURCE LANGUAGE INFORMATION PROCESSING, 2020, 19 (01)
  • [19] A Diverse Data Augmentation Strategy for Low-Resource Neural Machine Translation
    Li, Yu
    Li, Xiao
    Yang, Yating
    Dong, Rui
    INFORMATION, 2020, 11 (05)
  • [20] Using Neural Machine Translation for Generating Diverse Challenging Exercises for Language Learners
    Gomez, Frank Palma
    Panda, Subhadarshi
    Flor, Michael
    Rozovskaya, Alla
    PROCEEDINGS OF THE 61ST ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, ACL 2023, VOL 1, 2023, : 6115 - 6129