Similarity-aware neural machine translation: reducing human translator efforts by leveraging high-potential sentences with translation memory

Cited by: 7
Authors
Zhang, Tianfu [1 ]
Huang, Heyan [1 ]
Feng, Chong [1 ]
Wei, Xiaochi [2 ]
Affiliations
[1] Beijing Inst Technol, Beijing, Peoples R China
[2] Baidu Inc, Beijing, Peoples R China
Source
NEURAL COMPUTING & APPLICATIONS | 2020, Vol. 32, Issue 23
Funding
National Natural Science Foundation of China
Keywords
Neural machine translation; Translation memory; High-potential sentences; Human translator efforts;
DOI
10.1007/s00521-020-04939-y
CLC number
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
In computer-aided translation, reducing the time human translators spend reviewing and post-editing machine translations is valuable. However, existing studies mainly aim to improve overall translation quality, which reduces only post-editing time. In this work, we first identify test sentences that are highly similar to the training set (high-potential sentences) to reduce reviewing time, and then focus on substantially improving their translation quality to reduce post-editing time. To this end, we propose two novel translation memory methods that characterize similarity between sentences along syntactic and template dimensions, respectively. Building on these, we propose similarity-aware neural machine translation (similarity-NMT), which consists of two independent modules: (1) an Identification Module, which identifies high-potential sentences in the test set according to multi-dimensional similarity information; and (2) a Translation Module, which integrates the multi-dimensional similarity information of parallel training sentence pairs into an attention-based NMT model via posterior regularization. Experiments on two Chinese-English domains validate the effectiveness and generality of the proposed method in reducing human translator effort.
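For illustration, below is a minimal Python sketch of the identification idea summarized in the abstract: flag test sentences whose best match in the training set exceeds a similarity threshold. The character-level similarity measure, the function names, and the 0.7 threshold are illustrative assumptions and not the paper's method; the paper uses richer syntactic- and template-level translation memory measures.

```python
# Minimal sketch (not the paper's implementation) of an Identification
# Module: route a test sentence as "high-potential" when it is highly
# similar to at least one training sentence.
from difflib import SequenceMatcher

def surface_similarity(a: str, b: str) -> float:
    # Stand-in for one similarity "dimension": matching-block ratio over
    # the two character sequences, in [0, 1].
    return SequenceMatcher(None, a, b).ratio()

def identify_high_potential(test_sents, train_sents, threshold=0.7):
    # Return (sentence, best_score) pairs for test sentences whose best
    # match against the training set reaches the (assumed) threshold.
    flagged = []
    for sent in test_sents:
        best = max(surface_similarity(sent, t) for t in train_sents)
        if best >= threshold:
            flagged.append((sent, best))
    return flagged

# Example: the second test sentence closely matches a training sentence,
# so it would be flagged as high-potential (less reviewing needed).
train = ["the cat sat on the mat", "open the window please"]
test = ["close the door please", "open the window now"]
print(identify_high_potential(test, train))
```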
Pages: 17623-17635
Number of pages: 13
Related papers (2)
  • [1] Word Position Aware Translation Memory for Neural Machine Translation
    He, Qiuxiang
    Huang, Guoping
    Liu, Lemao
    Li, Li
    NATURAL LANGUAGE PROCESSING AND CHINESE COMPUTING (NLPCC 2019), PT I, 2019, 11838 : 367 - 379
  • [2] Towards Formality-Aware Neural Machine Translation by Leveraging Context Information
    Kim, Dohee
    Baek, Yujin
    Yang, Soyoung
    Choo, Jaegul
    FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS - EMNLP 2023, 2023, : 7384 - 7392