Linguistically Driven Multi-Task Pre-Training for Low-Resource Neural Machine Translation

Cited by: 6
Authors
Mao, Zhuoyuan [1 ]
Chu, Chenhui [1 ]
Kurohashi, Sadao [1 ]
Affiliations
[1] Kyoto Univ, Grad Sch Informat, Kyoto, Japan
Keywords
Low-resource neural machine translation; pre-training; linguistically-driven
DOI
10.1145/3491065
CLC number
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
In the present study, we propose novel sequence-to-sequence pre-training objectives for low-resource neural machine translation (NMT): Japanese-specific sequence to sequence (JASS) for language pairs involving Japanese as the source or target language, and English-specific sequence to sequence (ENSS) for language pairs involving English. JASS focuses on masking and reordering Japanese linguistic units known as bunsetsu, whereas ENSS is based on phrase-structure masking and reordering tasks. Experiments on the ASPEC Japanese-English and Japanese-Chinese, Wikipedia Japanese-Chinese, and News English-Korean corpora demonstrate that JASS and ENSS outperform MASS and other existing language-agnostic pre-training methods by up to +2.9 BLEU points for the Japanese-English tasks, up to +7.0 BLEU points for the Japanese-Chinese tasks, and up to +1.3 BLEU points for the English-Korean tasks. Empirical analysis, which focuses on the relationship between the individual parts of JASS and ENSS, reveals the complementary nature of their subtasks. Adequacy evaluation using LASER, human evaluation, and case studies shows that our proposed methods significantly outperform pre-training methods without injected linguistic knowledge and have a larger positive impact on adequacy than on fluency.
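The abstract names the JASS subtasks (bunsetsu masking and bunsetsu reordering) without implementation detail. As an illustrative sketch only, the following Python shows how such pre-training pairs might be constructed, assuming sentences have already been segmented into bunsetsu upstream (in practice with a Japanese analyzer such as Juman++/KNP); the function names, the MASK token, and the mask_ratio parameter are hypothetical, not the authors' implementation.

import random

MASK = "<mask>"

def bunsetsu_mass(bunsetsu, mask_ratio=0.5):
    # MASS-style objective at bunsetsu granularity: mask a contiguous
    # run of bunsetsu and train a seq2seq model to reconstruct it.
    n = len(bunsetsu)
    span = max(1, int(n * mask_ratio))
    start = random.randint(0, n - span)
    source = bunsetsu[:start] + [MASK] * span + bunsetsu[start + span:]
    target = bunsetsu[start:start + span]
    return " ".join(source), " ".join(target)

def bunsetsu_reorder(bunsetsu):
    # Reordering objective: shuffle the bunsetsu sequence and train the
    # model to restore the original order.
    shuffled = bunsetsu[:]
    random.shuffle(shuffled)
    return " ".join(shuffled), " ".join(bunsetsu)

# Toy usage with a pre-segmented sentence (segmentation assumed):
sentence = ["私は", "昨日", "図書館で", "本を", "読んだ"]
print(bunsetsu_mass(sentence))
print(bunsetsu_reorder(sentence))

The same construction would carry over to ENSS by replacing bunsetsu with phrases taken from an English phrase-structure parse.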
Pages: 29
Related Papers
50 items in total
  • [21] Investigating the Pre-Training Bias in Low-Resource Abstractive Summarization
    Chernyshev, Daniil
    Dobrov, Boris
    IEEE ACCESS, 2024, 12: 47219-47230
  • [22] Pre-training on High-Resource Speech Recognition Improves Low-Resource Speech-to-Text Translation
    Bansal, Sameer
    Kamper, Herman
    Livescu, Karen
    Lopez, Adam
    Goldwater, Sharon
    2019 CONFERENCE OF THE NORTH AMERICAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS: HUMAN LANGUAGE TECHNOLOGIES (NAACL HLT 2019), VOL. 1, 2019: 58-68
  • [23] Low-Resource Neural Machine Translation with Neural Episodic Control
    Wu, Nier
    Hou, Hongxu
    Sun, Shuo
    Zheng, Wei
    2021 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2021
  • [24] Multi-task Learning for Multilingual Neural Machine Translation
    Wang, Yiren
    Zhai, ChengXiang
    Awadalla, Hany Hassan
    PROCEEDINGS OF THE 2020 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP), 2020: 1022-1034
  • [25] Multi-granularity Knowledge Sharing in Low-resource Neural Machine Translation
    Mi, Chenggang
    Xie, Shaoliang
    Fan, Yi
    ACM TRANSACTIONS ON ASIAN AND LOW-RESOURCE LANGUAGE INFORMATION PROCESSING, 2024, 23 (02)
  • [26] Low-resource Neural Machine Translation: Methods and Trends
    Shi, Shumin
    Wu, Xing
    Su, Rihai
    Huang, Heyan
    ACM TRANSACTIONS ON ASIAN AND LOW-RESOURCE LANGUAGE INFORMATION PROCESSING, 2022, 21 (05)
  • [27] Low-resource neural machine translation with multi-strategy prototype generation
    Yu Z.-Q.
    Yu Z.-T.
    Huang Y.-X.
    Guo J.-J.
    Xian Y.-T.
    Ruan Jian Xue Bao/Journal of Software, 2023, 34 (11): 5113-5125
  • [28] Recent advances of low-resource neural machine translation
    Haque, Rejwanul
    Liu, Chao-Hong
    Way, Andy
    MACHINE TRANSLATION, 2021, 35 (04): 451-474
  • [29] Neural Machine Translation for Low-resource Languages: A Survey
    Ranathunga, Surangika
    Lee, En-Shiun Annie
    Skenduli, Marjana Prifti
    Shekhar, Ravi
    Alam, Mehreen
    Kaur, Rishemjit
    ACM COMPUTING SURVEYS, 2023, 55 (11)
  • [30] Data Augmentation for Low-Resource Neural Machine Translation
    Fadaee, Marzieh
    Bisazza, Arianna
    Monz, Christof
    PROCEEDINGS OF THE 55TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2017), VOL 2, 2017: 567-573