Robust Machine Translation with Domain Sensitive Pseudo-Sources: Baidu-OSU WMT19 MT Robustness Shared Task System Report

Cited by: 0
Authors
Zheng, Renjie [2 ]
Liu, Hairong [1 ]
Ma, Mingbo [1 ]
Zheng, Baigong [1 ]
Huang, Liang [1 ,2 ]
Affiliations
[1] Baidu Research, Sunnyvale, CA, USA
[2] Oregon State University, School of EECS, Corvallis, OR 97331, USA
Keywords
DOI
Not available
CLC Number
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
This paper describes the machine translation system developed jointly by Baidu Research and Oregon State University for the WMT 2019 Machine Translation Robustness Shared Task. Translating social media text is very challenging, since its style differs substantially from standard parallel corpora (e.g., news) and it contains various types of noise. To make matters worse, the amount of social media parallel data is extremely limited. In this paper, we use a domain-sensitive training method that leverages a large amount of parallel data from popular domains together with a small amount of parallel data from social media. Furthermore, we generate a parallel dataset with pseudo noisy source sentences that are back-translated from monolingual data using a model trained in a similarly domain-sensitive way. We achieve more than 10 BLEU improvement in both En-Fr and Fr-En translation compared with the baseline methods.
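The abstract combines two ideas: mixing large out-of-domain parallel data with scarce in-domain social-media data in a domain-sensitive way, and augmenting the in-domain side with pseudo noisy source sentences obtained by back-translating monolingual target-side text. The following is a minimal, hypothetical Python sketch of one common way to realize such a setup via domain tags on the source side; the names (DOMAIN_TAGS, back_translate, build_training_set) and the toy back-translator are illustrative assumptions, not the authors' actual implementation.

```python
# Hypothetical sketch: build a domain-tagged training set that mixes a large
# out-of-domain corpus with a small social-media corpus, plus pseudo-source
# pairs obtained by back-translating monolingual target-side text.
# All names here are illustrative, not the authors' actual code.

from typing import Callable, Iterable, List, Tuple

# A sentence pair is (source, target); the tag marks the domain of origin.
Pair = Tuple[str, str]

DOMAIN_TAGS = {"news": "<news>", "social": "<social>"}


def tag_domain(pairs: Iterable[Pair], domain: str) -> List[Pair]:
    """Prepend a domain token to each source sentence (domain-sensitive training)."""
    tag = DOMAIN_TAGS[domain]
    return [(f"{tag} {src}", tgt) for src, tgt in pairs]


def pseudo_source_pairs(
    monolingual_targets: Iterable[str],
    back_translate: Callable[[str], str],
) -> List[Pair]:
    """Pair each clean target-side sentence with a back-translated pseudo source."""
    return [(back_translate(tgt), tgt) for tgt in monolingual_targets]


def build_training_set(
    news_pairs: Iterable[Pair],
    social_pairs: Iterable[Pair],
    monolingual_targets: Iterable[str],
    back_translate: Callable[[str], str],
) -> List[Pair]:
    """Mix large out-of-domain data, small in-domain data, and pseudo-source data."""
    data = tag_domain(news_pairs, "news")
    data += tag_domain(social_pairs, "social")
    # Pseudo sources imitate noisy social-media text, so tag them as in-domain.
    data += tag_domain(pseudo_source_pairs(monolingual_targets, back_translate), "social")
    return data


if __name__ == "__main__":
    # Toy stand-in for back-translation: a real system would use a
    # target-to-source NMT model trained in the same domain-sensitive way.
    fake_bt = lambda tgt: tgt.lower() + " !!"
    corpus = build_training_set(
        news_pairs=[("The markets rose today.", "Les marchés ont augmenté aujourd'hui.")],
        social_pairs=[("lol that movie was gr8", "mdr ce film était génial")],
        monolingual_targets=["C'est trop bien :)"],
        back_translate=fake_bt,
    )
    for src, tgt in corpus:
        print(src, "=>", tgt)
```

Tagging the source side lets a single model condition on the domain at training and inference time, so the large out-of-domain corpus improves fluency without washing out the noisy social-media style the shared task targets.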
Pages: 559-564
Number of pages: 6