Cross-lingual transfer of knowledge in distributional language models: Experiments in Hungarian

Cited by: 1
Authors
Novak, Attila [1 ]
Novak, Borbala [1 ]
Affiliations
[1] Pazmany Peter Catholic Univ, Fac Informat Technol & Bion, Budapest, Hungary
Source
ACTA LINGUISTICA ACADEMICA, 2022, Vol. 69, No. 4
Keywords
distributional vs. generative models of language; zero-shot cross-lingual knowledge transfer; multilingual contextual neural language models; meaning representation parsing; named entity recognition
DOI
10.1556/2062.2022.00580
Chinese Library Classification
H0 [Linguistics]
Subject Classification Codes
030303; 0501; 050102
Abstract
In this paper, we argue that the convincing performance of recent deep-neural-model-based NLP applications demonstrates that the distributionalist approach to language description has proven more successful than the earlier subtle rule-based models created by the generative school. The now ubiquitous neural models handle ambiguity naturally and achieve human-like linguistic performance, even though most of their training consists only of noisy raw linguistic data without any multimodal grounding or external supervision; this refutes Chomsky's argument that no generic neural architecture can arrive at the linguistic performance exhibited by humans given the limited input available to children. In addition, we demonstrate in experiments with Hungarian as the target language that the shared internal representations in multilingually trained versions of these models enable them to transfer specific linguistic skills, including structured annotation skills, from one language to another remarkably efficiently.
Pages: 405-449
Page count: 45
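
To make the experimental setup the abstract refers to concrete, the sketch below shows one common way such a zero-shot cross-lingual transfer experiment can be run: a multilingual encoder is fine-tuned on English named-entity annotation only and then evaluated directly on Hungarian. The model (xlm-roberta-base), the dataset (WikiANN), and all hyperparameters here are illustrative assumptions, not the configuration used in the paper.

```python
# Minimal sketch of zero-shot cross-lingual NER transfer: fine-tune a
# multilingual encoder on English data only, evaluate on Hungarian with
# no Hungarian training data. Model/dataset choices are assumptions.
from datasets import load_dataset
from transformers import (AutoModelForTokenClassification, AutoTokenizer,
                          DataCollatorForTokenClassification, Trainer,
                          TrainingArguments)

tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-base")
en = load_dataset("wikiann", "en")   # English: used for fine-tuning
hu = load_dataset("wikiann", "hu")   # Hungarian: used only for evaluation
labels = en["train"].features["ner_tags"].feature.names

def tokenize_and_align(batch):
    # Align word-level NER tags with subword tokens: label the first
    # subword of each word, mask the rest and special tokens with -100.
    enc = tokenizer(batch["tokens"], truncation=True, is_split_into_words=True)
    enc["labels"] = []
    for i, tags in enumerate(batch["ner_tags"]):
        prev, aligned = None, []
        for wid in enc.word_ids(batch_index=i):
            aligned.append(-100 if wid is None or wid == prev else tags[wid])
            prev = wid
        enc["labels"].append(aligned)
    return enc

en_tok = en.map(tokenize_and_align, batched=True,
                remove_columns=en["train"].column_names)
hu_tok = hu.map(tokenize_and_align, batched=True,
                remove_columns=hu["test"].column_names)

model = AutoModelForTokenClassification.from_pretrained(
    "xlm-roberta-base", num_labels=len(labels))

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="xlmr-en-ner", num_train_epochs=3,
                           per_device_train_batch_size=16),
    train_dataset=en_tok["train"],
    data_collator=DataCollatorForTokenClassification(tokenizer),
)
trainer.train()                          # sees English annotations only
print(trainer.evaluate(hu_tok["test"]))  # zero-shot Hungarian evaluation
                                         # (add compute_metrics for entity F1)
```

In this setup, any Hungarian performance above a trivial baseline comes entirely from representations shared across languages during multilingual pretraining, which is the transfer effect the abstract describes.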
Related papers
(50 items total)
  • [41] Isotropic Representation Can Improve Zero-Shot Cross-Lingual Transfer on Multilingual Language Models
    Ji, Yixin
    Wang, Jikai
    Li, Juntao
    Ye, Hai
    Zhang, Min
    FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (EMNLP 2023), 2023, : 8104 - 8118
  • [42] Model Selection for Cross-Lingual Transfer
    Chen, Yang
    Ritter, Alan
    2021 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP 2021), 2021, : 5675 - 5687
  • [43] Transfer language selection for zero-shot cross-lingual abusive language detection
    Eronen, Juuso
    Ptaszynski, Michal
    Masui, Fumito
    Arata, Masaki
    Leliwa, Gniewosz
    Wroczynski, Michal
    INFORMATION PROCESSING & MANAGEMENT, 2022, 59 (04)
  • [44] Models and Datasets for Cross-Lingual Summarisation
    Perez-Beltrachini, Laura
    Lapata, Mirella
    2021 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP 2021), 2021, : 9408 - 9423
  • [45] Reassessing the value of resources for cross-lingual transfer of POS tagging models
    Pecheux, Nicolas
    Wisniewski, Guillaume
    Yvon, Francois
    LANGUAGE RESOURCES AND EVALUATION, 2017, 51 (04) : 927 - 960
  • [46] Cross-Lingual Transfer in Zero-Shot Cross-Language Entity Linking
    Schumacher, Elliot
    Mayfield, James
    Dredze, Mark
    FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, ACL-IJCNLP 2021, 2021, : 583 - 595
  • [48] Distributional Correspondence Indexing for Cross-Lingual and Cross-Domain Sentiment Classification
    Fernandez, Alejandro Moreo
    Esuli, Andrea
    Sebastiani, Fabrizio
    JOURNAL OF ARTIFICIAL INTELLIGENCE RESEARCH, 2016, 55 : 131 - 163
  • [49] Language Anisotropic Cross-Lingual Model Editing
    Xu, Yang
    Hou, Yutai
    Che, Wanxiang
    Zhang, Min
    FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, ACL 2023, 2023, : 5554 - 5569
  • [50] Enhancing Cross-lingual Natural Language Inference by Prompt-learning from Cross-lingual Templates
    Qi, Kunxun
    Wan, Hai
    Du, Jianfeng
    Chen, Haolan
    PROCEEDINGS OF THE 60TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2022), VOL 1: (LONG PAPERS), 2022, : 1910 - 1923