XLIT: A Method to Bridge Task Discrepancy in Machine Translation Pre-training

Cited by: 0
Authors
Pham, Khang [1,2]
Nguyen, Long [1,2]
Dinh, Dien [1,2]
Affiliations
[1] Faculty of Information Technology, University of Science, Ho Chi Minh City, Viet Nam
[2] Vietnam National University, Ho Chi Minh City, Viet Nam
DOI: 10.1145/3689630
Related Papers (50 in total)
  • [21] Multilingual Translation from Denoising Pre-Training. Tang, Yuqing; Tran, Chau; Li, Xian; Chen, Peng-Jen; Goyal, Naman; Chaudhary, Vishrav; Gu, Jiatao; Fan, Angela. FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, ACL-IJCNLP 2021, 2021: 3450-3466.
  • [22] A Comparison between Pre-training and Large-scale Back-translation for Neural Machine Translation. Huang, Dandan; Wang, Kun; Zhang, Yue. FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, ACL-IJCNLP 2021, 2021: 1718-1732.
  • [23] JASS: Japanese-specific Sequence to Sequence Pre-training for Neural Machine Translation. Mao, Zhuoyuan; Cromieres, Fabien; Dabre, Raj; Song, Haiyue; Kurohashi, Sadao. PROCEEDINGS OF THE 12TH INTERNATIONAL CONFERENCE ON LANGUAGE RESOURCES AND EVALUATION (LREC 2020), 2020: 3683-3691.
  • [24] Multilingual Pre-training Model-Assisted Contrastive Learning Neural Machine Translation. Sun, Shuo; Hou, Hong-xu; Yang, Zong-heng; Wang, Yi-song. 2023 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, IJCNN, 2023.
  • [25] Low-Resource Neural Machine Translation Using XLNet Pre-training Model. Wu, Nier; Hou, Hongxu; Guo, Ziyue; Zheng, Wei. ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING, ICANN 2021, PT V, 2021, 12895: 503-514.
  • [26] Exploring the Role of Monolingual Data in Cross-Attention Pre-training for Neural Machine Translation. Pham, Khang; Nguyen, Long; Dinh, Dien. COMPUTATIONAL COLLECTIVE INTELLIGENCE, ICCCI 2023, 2023, 14162: 179-190.
  • [27] Character-Aware Low-Resource Neural Machine Translation with Weight Sharing and Pre-training. Cao, Yichao; Li, Miao; Feng, Tao; Wang, Rujing. CHINESE COMPUTATIONAL LINGUISTICS, CCL 2019, 2019, 11856: 321-333.
  • [28] Exploring the diversity and invariance in yourself for visual pre-training task. Wei, Longhui; Xie, Lingxi; Zhou, Wengang; Li, Houqiang; Tian, Qi. PATTERN RECOGNITION, 2023, 139.
  • [29] Curriculum Pre-training for End-to-End Speech Translation. Wang, Chengyi; Wu, Yu; Liu, Shujie; Zhou, Ming; Yang, Zhenglu. 58TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2020), 2020: 3728-3738.
  • [30] Product-oriented Machine Translation with Cross-modal Cross-lingual Pre-training. Song, Yuqing; Chen, Shizhe; Jin, Qin; Luo, Wei; Xie, Jun; Huang, Fei. PROCEEDINGS OF THE 29TH ACM INTERNATIONAL CONFERENCE ON MULTIMEDIA, MM 2021, 2021: 2843-2852.