Multi-Task Neural Model for Agglutinative Language Translation

Cited: 0
|
Authors
Pan, Yirong [1 ,2 ,3 ]
Li, Xiao [1 ,2 ,3 ]
Yang, Yating [1 ,2 ,3 ]
Dong, Rui [1 ,2 ,3 ]
Affiliations
[1] Chinese Acad Sci, Xinjiang Tech Inst Phys & Chem, Beijing, Peoples R China
[2] Univ Chinese Acad Sci, Beijing, Peoples R China
[3] Xinjiang Lab Minor Speech & Language Informat Pro, Urumqi, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
DOI
Not available
CLC number
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Neural machine translation (NMT) has recently achieved impressive performance by using large-scale parallel corpora. However, it struggles in the low-resource, morphologically rich scenarios of agglutinative language translation. Inspired by the finding that monolingual data can greatly improve NMT performance, we propose a multi-task neural model that jointly learns to perform bi-directional translation and agglutinative language stemming. Our approach trains a single model with a shared encoder and decoder, leaving the standard NMT architecture unchanged; instead, a token is added before each source-side sentence to specify the desired target output for each of the two tasks. Experimental results on Turkish-English and Uyghur-Chinese show that our proposed approach can significantly improve translation performance on agglutinative languages by using a small amount of monolingual data.
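The abstract's core mechanism, prepending a task token to each source sentence so one shared encoder-decoder serves several tasks, can be sketched as follows. This is a minimal illustration of the data-preparation step only; the token names, example sentences, and helper functions are assumptions for illustration, not taken from the paper.

```python
# Sketch of the task-token scheme described in the abstract: a single
# shared-encoder/decoder NMT model is trained on mixed examples, and a
# token prepended to each source sentence selects the desired task.
# Token names and example strings below are hypothetical.

TASK_TOKENS = {
    "translate": "<2en>",   # e.g. Turkish -> English translation
    "back":      "<2tr>",   # e.g. English -> Turkish translation
    "stem":      "<stem>",  # agglutinative-language stemming
}

def tag_source(task: str, sentence: str) -> str:
    """Prepend the task token so one model can serve multiple tasks."""
    return f"{TASK_TOKENS[task]} {sentence}"

def build_training_pairs(parallel, monolingual_stemmed):
    """Mix bi-directional translation pairs with stemming pairs
    derived from monolingual data into one training set."""
    examples = []
    for src, tgt in parallel:
        examples.append((tag_source("translate", src), tgt))
        examples.append((tag_source("back", tgt), src))
    for word, stem in monolingual_stemmed:
        examples.append((tag_source("stem", word), stem))
    return examples

pairs = build_training_pairs(
    parallel=[("evlerimizde", "in our houses")],
    monolingual_stemmed=[("evlerimizde", "ev")],
)
```

Because the task is encoded entirely in the input stream, the model architecture and training loop need no modification; only the training data changes.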
Pages: 103-110
Number of pages: 8
Related papers
50 records
  • [1] Multi-Source Neural Model for Machine Translation of Agglutinative Language
    Pan, Yirong
    Li, Xiao
    Yang, Yating
    Dong, Rui
    [J]. FUTURE INTERNET, 2020, 12 (06):
  • [2] Multi-Task Learning for Multiple Language Translation
    Dong, Daxiang
    Wu, Hua
    He, Wei
    Yu, Dianhai
    Wang, Haifeng
    [J]. PROCEEDINGS OF THE 53RD ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS AND THE 7TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING, VOL 1, 2015, : 1723 - 1732
  • [3] Multi-task Learning for Multilingual Neural Machine Translation
    Wang, Yiren
    Zhai, ChengXiang
    Awadalla, Hany Hassan
    [J]. PROCEEDINGS OF THE 2020 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP), 2020, : 1022 - 1034
  • [4] Scheduled Multi-task Learning for Neural Chat Translation
    Liang, Yunlong
    Meng, Fandong
    Xu, Jinan
    Chen, Yufeng
    Zhou, Jie
    [J]. PROCEEDINGS OF THE 60TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2022), VOL 1: (LONG PAPERS), 2022, : 4375 - 4388
  • [5] Training Flexible Depth Model by Multi-Task Learning for Neural Machine Translation
    Wang, Qiang
    Xiao, Tong
    Zhu, Jingbo
    [J]. FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, EMNLP 2020, 2020, : 4307 - 4312
  • [6] Improving Robustness of Neural Machine Translation with Multi-task Learning
    Zhou, Shuyan
    Zeng, Xiangkai
    Zhou, Yingqi
    Anastasopoulos, Antonios
    Neubig, Graham
    [J]. FOURTH CONFERENCE ON MACHINE TRANSLATION (WMT 2019), 2019, : 565 - 571
  • [7] Multi-Task Deep Neural Networks for Natural Language Understanding
    Liu, Xiaodong
    He, Pengcheng
    Chen, Weizhu
    Gao, Jianfeng
    [J]. 57TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2019), 2019, : 4487 - 4496
  • [8] Exploring the Advantages of Corpus in Neural Machine Translation of Agglutinative Language
    Ji, Yatu
    Hou, Hongxu
    Wu, Nier
    Chen, Junjie
    [J]. ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING - ICANN 2019: TEXT AND TIME SERIES, PT IV, 2019, 11730 : 326 - 336
  • [9] A Multi-Task Architecture on Relevance-based Neural Query Translation
    Sarwar, Sheikh Muhammad
    Bonab, Hamed
    Allan, James
    [J]. 57TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2019), 2019, : 6339 - 6344
  • [10] Neural Machine Translation Based on Multi-task Learning of Discourse Structure
    Kang, Xiao-Mian
    Zong, Cheng-Qing
    [J]. Ruan Jian Xue Bao/Journal of Software, 2022, 33 (10): : 3806 - 3818