Cross-Lingual Pre-Training Based Transfer for Zero-Shot Neural Machine Translation

Cited: 0
Authors
Ji, Baijun [2 ]
Zhang, Zhirui [3 ]
Duan, Xiangyu [1 ,2 ]
Zhang, Min [1 ,2 ]
Chen, Boxing [3 ]
Luo, Weihua [3 ]
Affiliations
[1] Soochow Univ, Inst Artificial Intelligence, Suzhou, Peoples R China
[2] Soochow Univ, Sch Comp Sci & Technol, Suzhou, Peoples R China
[3] Alibaba DAMO Acad, Hangzhou, Peoples R China
Funding
National Key R&D Program of China; National Natural Science Foundation of China
Keywords
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Transfer learning between different language pairs has shown its effectiveness for Neural Machine Translation (NMT) in low-resource scenarios. However, existing transfer methods involving a common target language are far from successful in the extreme scenario of zero-shot translation, due to the language space mismatch problem between the transferor (the parent model) and the transferee (the child model) on the source side. To address this challenge, we propose an effective transfer learning approach based on cross-lingual pre-training. Our key idea is to make all source languages share the same feature space and thus enable a smooth transition for zero-shot translation. To this end, we introduce one monolingual pre-training method and two bilingual pre-training methods to obtain a universal encoder for different languages. Once the universal encoder is constructed, the parent model built on such an encoder is trained with large-scale annotated data and then directly applied in the zero-shot translation scenario. Experiments on two public datasets show that our approach significantly outperforms a strong pivot-based baseline and various multilingual NMT approaches.
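The core transfer mechanism described in the abstract can be illustrated with a deliberately simplified toy sketch (not the paper's implementation): if words from two source languages are encoded into one shared feature space, a parent model trained on only one source language can decode the other zero-shot. The word lists, feature names, and dictionary-based "encoder" below are hypothetical stand-ins for the paper's pre-trained universal encoder.

```python
# Toy sketch of the shared-feature-space idea behind cross-lingual transfer.
# A hypothetical "universal encoder": aligned words from both source
# languages map to the same feature id, simulating a shared feature space.
UNIVERSAL_ENCODER = {
    # Spanish (parent source)      French (child source, unseen in training)
    "gato": "FEAT_CAT",   "chat": "FEAT_CAT",
    "perro": "FEAT_DOG",  "chien": "FEAT_DOG",
    "casa": "FEAT_HOUSE", "maison": "FEAT_HOUSE",
}

def train_parent(parallel_data):
    """Learn feature -> English mappings from Spanish-English pairs only."""
    model = {}
    for src_word, tgt_word in parallel_data:
        model[UNIVERSAL_ENCODER[src_word]] = tgt_word
    return model

def translate(model, sentence):
    """Encode any source word into the shared space, then decode."""
    return [model[UNIVERSAL_ENCODER[w]] for w in sentence]

# The parent model is trained on Spanish-English pairs only.
parent = train_parent([("gato", "cat"), ("perro", "dog"), ("casa", "house")])

# Zero-shot: French was never seen during parent training, but its words
# land in the same feature space, so the parent decoder still applies.
print(translate(parent, ["chat", "maison"]))  # ['cat', 'house']
```

In the actual approach, the shared space comes from monolingual or bilingual pre-training of a real encoder rather than a word-alignment dictionary, but the transfer logic is the same: the child's source language rides on the parent's decoder because both sources occupy one representation space.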
Pages: 115-122
Page count: 8