AmericasNLI: Machine translation and natural language inference systems for Indigenous languages of the Americas

Cited by: 4
Authors
Kann, Katharina [1 ]
Ebrahimi, Abteen [1 ]
Mager, Manuel [2 ]
Oncevay, Arturo [3 ]
Ortega, John E. [4 ]
Rios, Annette [5 ]
Fan, Angela [6 ]
Gutierrez-Vasques, Ximena [7 ]
Chiruzzo, Luis [8 ]
Gimenez-Lugo, Gustavo A. [9 ]
Ramos, Ricardo [10 ]
Ruiz, Ivan Vladimir Meza [11 ]
Mager, Elisabeth [12 ]
Chaudhary, Vishrav [13 ]
Neubig, Graham [14 ]
Palmer, Alexis [15 ]
Coto-Solano, Rolando [16 ]
Vu, Ngoc Thang [2 ]
Affiliations
[1] Univ Colorado, Dept Comp Sci, Boulder, CO 80309 USA
[2] Univ Stuttgart, Inst Nat Language Proc, Stuttgart, Germany
[3] Univ Edinburgh, Sch Informat, Edinburgh, Midlothian, Scotland
[4] NYU, Courant Inst Math Sci, New York, NY USA
[5] Univ Zurich, Inst Comp Linguist, Zurich, Switzerland
[6] Facebook AI Res, Menlo Pk, CA USA
[7] Univ Zurich, URPP Language & Space, Zurich, Switzerland
[8] Univ Republica, Inst Computat, Montevideo, Uruguay
[9] Univ Tecnol Fed Parana, Dept Informat, Curitiba, Parana, Brazil
[10] Univ Tecnol Tlaxcala, Huamantla, Mexico
[11] Univ Nacl Autonoma Mexico, Dept Comp Sci, Mexico City, DF, Mexico
[12] Univ Nacl Autonoma Mexico, Fac Estudios Super Acatlan, Mexico City, DF, Mexico
[13] Microsoft Turing Res, Redmond, WA USA
[14] Carnegie Mellon Univ, Language Technol Inst, Pittsburgh, PA USA
[15] Univ Colorado, Dept Linguist, Boulder, CO USA
[16] Dartmouth Coll, Dept Linguist, Hanover, NH USA
Keywords
natural language processing; multilingual NLP; low-resource languages; natural language inference; machine translation; pretrained models; model adaptation
DOI
10.3389/frai.2022.995667
CLC classification number
TP18 [Theory of artificial intelligence]
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Little attention has been paid to the development of human language technology for truly low-resource languages, i.e., languages with limited amounts of digitally available text data, such as Indigenous languages. However, pretrained multilingual models have been shown to perform cross-lingual transfer in a zero-shot setting even for low-resource languages that are unseen during pretraining. Yet, prior work evaluating performance on unseen languages has largely been limited to shallow token-level tasks, and it remains unclear whether zero-shot learning of deeper semantic tasks is possible for unseen languages. To explore this question, we present AmericasNLI, a natural language inference dataset covering 10 Indigenous languages of the Americas. We conduct experiments with pretrained models, exploring zero-shot learning in combination with model adaptation. Furthermore, as AmericasNLI is a multiway parallel dataset, we use it to benchmark the performance of different machine translation models for those languages. Finally, using a standard transformer model, we explore translation-based approaches for natural language inference. We find that the zero-shot performance of pretrained models without adaptation is poor for all languages in AmericasNLI, but model adaptation via continued pretraining yields improvements. All machine translation models are rather weak; surprisingly, however, translation-based approaches to natural language inference outperform all other models on that task.
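The translation-based approach the abstract describes (translate the premise/hypothesis pair into a high-resource language, then run an NLI classifier trained on that language) can be sketched as a two-stage pipeline. The `translate` and `classify_nli` functions below are deterministic toy stand-ins, not the MT or NLI models evaluated in the paper, and the lexicon entries are invented; the sketch only illustrates the data flow of the "translate-test" setup.

```python
# Minimal sketch of the "translate-test" approach to NLI:
#   1) translate premise and hypothesis into a high-resource language,
#   2) classify the translated pair with an NLI model for that language.
# Both components are toy stand-ins, NOT the models from the paper.

LABELS = ("entailment", "neutral", "contradiction")

def translate(text: str, lexicon: dict) -> str:
    """Toy word-by-word 'MT system': look each token up in a lexicon,
    passing unknown tokens through unchanged."""
    return " ".join(lexicon.get(tok, tok) for tok in text.lower().split())

def classify_nli(premise: str, hypothesis: str) -> str:
    """Toy 'NLI classifier': lexical-overlap heuristic with a negation check."""
    p = set(premise.lower().split())
    h = set(hypothesis.lower().split())
    if ("not" in p) != ("not" in h):   # negation on one side only
        return "contradiction"
    if h <= p:                         # hypothesis fully covered by premise
        return "entailment"
    return "neutral"

def translate_test_nli(premise: str, hypothesis: str, lexicon: dict) -> str:
    """Pipeline: source-language pair -> translated pair -> NLI label."""
    return classify_nli(translate(premise, lexicon),
                        translate(hypothesis, lexicon))

# Invented source-language tokens, for illustration only.
LEXICON = {"foo": "the", "bar": "dog", "baz": "runs"}

print(translate_test_nli("foo bar baz", "foo bar", LEXICON))  # entailment
```

In the paper's actual setting, the first stage would be a trained machine translation model into a high-resource language and the second a pretrained multilingual transformer fine-tuned on NLI data in that language; the stubs here only mirror the interface between the two stages.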
Pages: 17