Comparing Dependency-based Compositional Models with Contextualized Word Embeddings

Cited by: 2
Authors:
Gamallo, Pablo [1 ]
de Prada Corral, Manuel [1 ]
Garcia, Marcos [1 ]
Affiliations:
[1] Univ Santiago de Compostela, Ctr Singular Invest Tecnoloxias Intelixentes CiTI, Galiza, Spain
Keywords:
Compositional Distributional Models; Contextualized Word Embeddings; Transformers; Compositionality; Dependency-based Parsing
DOI:
10.5220/0010391812581265
Chinese Library Classification: TP18 [Artificial Intelligence Theory]
Discipline classification codes: 081104; 0812; 0835; 1405
Abstract:
In this article, we compare two strategies for contextualizing the meaning of words in a sentence: distributional models that apply syntax-based methods following the Principle of Compositionality, and Transformer technology such as BERT-like models. As the former methods require controlled syntactic structures, the two approaches are compared on datasets of syntactically fixed sentences, namely subject-predicate and subject-predicate-object expressions. The results show that syntax-based compositional approaches working with syntactic dependencies are competitive with neural Transformer models, and could have greater potential if trained and developed with the same resources.
Pages: 1258-1265
Number of pages: 8
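
The abstract above contrasts dependency-based compositional vectors with contextualized embeddings from BERT-like models. As an illustration only, the following Python sketch shows how a contextualized vector for the verb of a subject-predicate expression could be extracted from a BERT-like model and compared across contexts; the model name, example sentences, and helper function are assumptions of this sketch, not the authors' code.

    # Hedged sketch: extract a contextualized verb vector from a BERT-like model
    # for short subject-predicate expressions, then compare two contexts.
    import torch
    from transformers import AutoTokenizer, AutoModel

    tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")  # assumed model choice
    model = AutoModel.from_pretrained("bert-base-uncased")
    model.eval()

    def verb_vector(sentence: str, verb: str) -> torch.Tensor:
        # Assumes the verb is lowercase and maps to a single WordPiece token.
        enc = tokenizer(sentence, return_tensors="pt")
        with torch.no_grad():
            hidden = model(**enc).last_hidden_state[0]  # (seq_len, hidden_dim)
        position = enc["input_ids"][0].tolist().index(tokenizer.convert_tokens_to_ids(verb))
        return hidden[position]

    # The same verb in two subject-predicate contexts: the cosine similarity
    # indicates how strongly the subject shifts the verb's contextual meaning.
    v1 = verb_vector("the horse runs", "runs")
    v2 = verb_vector("the program runs", "runs")
    print(torch.nn.functional.cosine_similarity(v1, v2, dim=0).item())

Per-occurrence similarities of this kind are the sort of measurement against which syntax-based compositional vectors can be compared on syntactically fixed subject-predicate and subject-predicate-object datasets.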