Pretrain-KGE: Learning Knowledge Representation from Pretrained Language Models

Cited by: 0
Authors
Zhang, Zhiyuan [1]
Liu, Xiaoqian [1,2]
Zhang, Yi [1]
Su, Qi [1,2]
Sun, Xu [1]
He, Bin [3]
Affiliations
[1] Peking Univ, Sch EECS, MOE Key Lab Computat Linguist, Beijing, Peoples R China
[2] Peking Univ, Sch Foreign Languages, Beijing, Peoples R China
[3] Huawei Noah's Ark Lab, Shenzhen, Peoples R China
Keywords
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Conventional knowledge graph embedding (KGE) often suffers from limited knowledge representation, leading to performance degradation, especially on low-resource problems. To remedy this, we propose to enrich knowledge representation by leveraging the world knowledge captured in pretrained language models. Specifically, we present a universal training framework named Pretrain-KGE consisting of three phases: a semantic-based fine-tuning phase, a knowledge extracting phase, and a KGE training phase. Extensive experiments show that the proposed Pretrain-KGE improves results over conventional KGE models, especially on the low-resource problem.
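As a rough illustration of the three-phase pipeline described in the abstract, the minimal Python sketch below (2) encodes entity and relation descriptions with a pretrained language model and extracts the resulting semantic vectors, then (3) uses them to initialize a TransE-style KGE model; the semantic-based fine-tuning phase (1) is omitted for brevity. The encoder choice bert-base-uncased, the 200-dimensional projection, the toy triple, and all function names are illustrative assumptions, not the authors' released implementation.

```python
# Hypothetical sketch of a Pretrain-KGE-style pipeline (illustrative only).
import torch
import torch.nn as nn
from transformers import AutoTokenizer, AutoModel

MODEL_NAME = "bert-base-uncased"  # assumed encoder; the paper's choice may differ
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
encoder = AutoModel.from_pretrained(MODEL_NAME)

def encode_text(texts):
    """Knowledge extracting phase: obtain semantic vectors for entity/relation
    descriptions from the (optionally fine-tuned) pretrained language model."""
    batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        out = encoder(**batch).last_hidden_state[:, 0]  # [CLS] vectors
    return out  # shape: (n, hidden_size)

class TransE(nn.Module):
    """KGE training phase: a conventional KGE model whose embeddings are
    initialized from the extracted semantic vectors instead of randomly."""
    def __init__(self, ent_vecs, rel_vecs, dim=200):
        super().__init__()
        proj = nn.Linear(ent_vecs.size(1), dim, bias=False)  # assumed projection
        self.ent = nn.Parameter(proj(ent_vecs).detach())
        self.rel = nn.Parameter(proj(rel_vecs).detach())

    def score(self, h, r, t):
        # TransE scoring: smaller ||h + r - t|| means a more plausible triple
        return (self.ent[h] + self.rel[r] - self.ent[t]).norm(p=1, dim=-1)

# Toy usage with hypothetical entity/relation descriptions.
ent_vecs = encode_text(["Barack Obama, 44th U.S. president", "Honolulu, a city in Hawaii"])
rel_vecs = encode_text(["place of birth"])
kge = TransE(ent_vecs, rel_vecs)
loss = kge.score(torch.tensor([0]), torch.tensor([0]), torch.tensor([1])).mean()
loss.backward()  # in practice, driven by a margin-based ranking loss over corrupted triples
```

In a full implementation, the KGE phase would continue training these initialized embeddings with the usual negative sampling and ranking objective; only the initialization differs from a conventional KGE setup.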
Pages: 259-266
Number of pages: 8