Predicting Terms in IS-A Relations with Pre-trained Transformers

Cited by: 0
Authors
Nikishina, Irina [1 ]
Chernomorchenko, Polina [2 ]
Demidova, Anastasiia [3 ]
Panchenko, Alexander [3 ,4 ]
Biemann, Chris [1 ]
Affiliations
[1] Univ Hamburg, Hamburg, Germany
[2] HSE Univ, Moscow, Russia
[3] Skolkovo Inst Sci & Technol, Moscow, Russia
[4] AIRI, Moscow, Russia
Keywords
DOI
None available
CLC number
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104; 0812; 0835; 1405;
Abstract
In this paper, we explore the ability of generative transformers to predict objects in IS-A (hypo-hypernym) relations. We solve the task in both directions of the relation: we learn to predict hypernyms given an input word, and hyponyms given an input concept and its neighbourhood in the taxonomy. To the best of our knowledge, this is the first paper to provide a comprehensive analysis of transformer-based models for the task of hypernymy extraction. Apart from the standard fine-tuning of various generative models, we experiment with different input formats and prefixes, zero- and few-shot learning strategies, and generation parameters. Results show that higher performance on both subtasks can be achieved by generative transformers with no additional data (such as definitions or lemma names). Given a little training and proper prompts, such models perform remarkably well at the task in comparison to specialized rule-based and statistical methods as well as encoder-based transformer models.
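The prompt-based setup described in the abstract can be sketched as follows. Note that the prefix wording, the few-shot example pairs, and the helper name `build_hypernym_prompt` are illustrative assumptions for exposition, not the paper's actual templates; the resulting string would be fed to a generative transformer, which is expected to continue it with a hypernym.

```python
# Minimal sketch of few-shot prompt construction for hypernym prediction
# with a generative transformer. The prefix wording and the example
# hypo-hypernym pairs below are illustrative assumptions, not the
# templates used in the paper.

FEW_SHOT_PAIRS = [
    ("dog", "animal"),
    ("oak", "tree"),
    ("hammer", "tool"),
]

def build_hypernym_prompt(word: str, pairs=FEW_SHOT_PAIRS) -> str:
    """Format few-shot demonstrations, then leave the query unfinished
    so the generative model completes it with a candidate hypernym."""
    lines = [f"{hypo} is a kind of {hyper}." for hypo, hyper in pairs]
    lines.append(f"{word} is a kind of")
    return "\n".join(lines)

print(build_hypernym_prompt("apple"))
```

The same pattern reverses for the hyponym direction by swapping the roles of the two words in each demonstration line.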
Pages: 134-148
Page count: 15