Pretrain-KGE: Learning Knowledge Representation from Pretrained Language Models

Cited by: 0
Authors
Zhang, Zhiyuan [1 ]
Liu, Xiaoqian [1 ,2 ]
Zhang, Yi [1 ]
Su, Qi [1 ,2 ]
Sun, Xu [1 ]
He, Bin [3 ]
Affiliations
[1] Peking Univ, Sch EECS, MOE Key Lab Computat Linguist, Beijing, Peoples R China
[2] Peking Univ, Sch Foreign Languages, Beijing, Peoples R China
[3] Huawei Noahs Ark Lab, Shenzhen, Peoples R China
Keywords
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Theory of Artificial Intelligence]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Conventional knowledge graph embedding (KGE) often suffers from limited knowledge representation, leading to performance degradation, especially under low-resource conditions. To remedy this, we propose to enrich knowledge representation by leveraging world knowledge from pretrained language models. Specifically, we present a universal training framework named Pretrain-KGE consisting of three phases: a semantic-based fine-tuning phase, a knowledge extracting phase, and a KGE training phase. Extensive experiments show that the proposed Pretrain-KGE improves results over KGE models, especially in low-resource settings.
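The abstract outlines a three-phase pipeline but gives no implementation details in this record. The following is a minimal sketch of how such a pipeline could be wired up, assuming BERT as the pretrained encoder, TransE as the KGE model, and a margin-based ranking loss; these choices and all function names (encode, transe_score, finetune_step, extract) are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch of a three-phase Pretrain-KGE-style pipeline (assumptions only:
# BERT encoder, TransE scoring, margin ranking loss; names are illustrative).
import torch
import torch.nn as nn
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
encoder = AutoModel.from_pretrained("bert-base-uncased")

def encode(texts):
    """Encode a batch of entity/relation descriptions via [CLS] pooling."""
    batch = tokenizer(texts, return_tensors="pt", padding=True, truncation=True)
    return encoder(**batch).last_hidden_state[:, 0]            # (batch, hidden)

def transe_score(h, r, t):
    """TransE plausibility: smaller ||h + r - t|| means a more plausible triple."""
    return (h + r - t).norm(p=2, dim=-1)

# --- Phase 1: semantic-based fine-tuning -------------------------------------
# Fine-tune the encoder so description embeddings respect a KGE-style objective.
margin_loss = nn.MarginRankingLoss(margin=1.0)
optimizer = torch.optim.Adam(encoder.parameters(), lr=2e-5)

def finetune_step(pos_texts, neg_texts):
    """pos_texts/neg_texts: ([head descs], [relation descs], [tail descs])."""
    pos = transe_score(*(encode(x) for x in pos_texts))
    neg = transe_score(*(encode(x) for x in neg_texts))
    loss = margin_loss(neg, pos, torch.ones_like(pos))         # pos distance < neg
    optimizer.zero_grad(); loss.backward(); optimizer.step()
    return loss.item()

# --- Phase 2: knowledge extracting --------------------------------------------
# Freeze the fine-tuned encoder and cache a vector per entity/relation description.
@torch.no_grad()
def extract(descriptions):
    return encode(descriptions).detach()

# --- Phase 3: KGE training -----------------------------------------------------
# Initialize a conventional KGE model from the extracted vectors, then train as usual.
class TransE(nn.Module):
    def __init__(self, ent_init, rel_init):
        super().__init__()
        self.ent = nn.Embedding.from_pretrained(ent_init, freeze=False)
        self.rel = nn.Embedding.from_pretrained(rel_init, freeze=False)

    def forward(self, h_idx, r_idx, t_idx):
        return transe_score(self.ent(h_idx), self.rel(r_idx), self.ent(t_idx))
```

In this sketch, phase 2 simply caches the fine-tuned description embeddings, and phase 3 uses them to initialize the entity and relation tables of a conventional KGE model, which is then trained in the usual way.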
Pages: 259-266
Number of pages: 8
Related papers
50 records in total
  • [11] X-FACTR: Multilingual Factual Knowledge Retrieval from Pretrained Language Models
    Jiang, Zhengbao
    Anastasopoulos, Antonios
    Araki, Jun
    Ding, Haibo
    Neubig, Graham
    PROCEEDINGS OF THE 2020 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP), 2020, : 5943 - 5959
  • [12] Commonsense Knowledge Mining from Pretrained Models
    Feldman, Joshua
    Davison, Joe
    Rush, Alexander M.
    2019 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING AND THE 9TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING (EMNLP-IJCNLP 2019): PROCEEDINGS OF THE CONFERENCE, 2019, : 1173 - 1178
  • [13] Learning protein language contrastive models with multi-knowledge representation
    Xu, Wenjun
    Xia, Yingchun
    Sun, Bifan
    Zhao, Zihao
    Tang, Lianggui
    Zhou, Obo
    Wang, Qingyong
    Gu, Lichuan
    FUTURE GENERATION COMPUTER SYSTEMS-THE INTERNATIONAL JOURNAL OF ESCIENCE, 2025, 164
  • [14] Robust Transfer Learning with Pretrained Language Models through Adapters
    Han, Wenjuan
    Pang, Bo
    Wu, Yingnian
    ACL-IJCNLP 2021: THE 59TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS AND THE 11TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING, VOL 2, 2021, : 854 - 861
  • [15] Parameter-efficient online knowledge distillation for pretrained language models
    Wang, Yukun
    Wang, Jin
    Zhang, Xuejie
    EXPERT SYSTEMS WITH APPLICATIONS, 2025, 265
  • [16] Enhancing pretrained language models with structured commonsense knowledge for textual inference
    Du, Li
    Ding, Xiao
    Xiong, Kai
    Liu, Ting
    Qin, Bing
    KNOWLEDGE-BASED SYSTEMS, 2022, 254
  • [17] On the Importance of Effectively Adapting Pretrained Language Models for Active Learning
    Margatina, Katerina
    Barrault, Loic
    Aletras, Nikolaos
    PROCEEDINGS OF THE 60TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2022): (SHORT PAPERS), VOL 2, 2022, : 825 - 836
  • [19] CnGeoPLM: Contextual knowledge selection and embedding with pretrained language representation model for the geoscience domain
    Ma, Kai
    Zheng, Shuai
    Tian, Miao
    Qiu, Qinjun
    Tan, Yongjian
    Hu, Xinxin
    Li, HaiYan
    Xie, Zhong
    EARTH SCIENCE INFORMATICS, 2023, 16 (04) : 3629 - 3646
  • [20] A review of graph neural networks and pretrained language models for knowledge graph reasoning
    Ma, Jiangtao
    Liu, Bo
    Li, Kunlin
    Li, Chenliang
    Zhang, Fan
    Luo, Xiangyang
    Qiao, Yaqiong
    NEUROCOMPUTING, 2024, 609