Pretrain-KGE: Learning Knowledge Representation from Pretrained Language Models

Cited by: 0
Authors
Zhang, Zhiyuan [1]
Liu, Xiaoqian [1,2]
Zhang, Yi [1]
Su, Qi [1,2]
Sun, Xu [1]
He, Bin [3]
Affiliations
[1] Peking University, School of EECS, MOE Key Laboratory of Computational Linguistics, Beijing, China
[2] Peking University, School of Foreign Languages, Beijing, China
[3] Huawei Noah's Ark Lab, Shenzhen, China
Keywords
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Conventional knowledge graph embedding (KGE) often suffers from limited knowledge representation, leading to performance degradation, especially in low-resource settings. To remedy this, we propose to enrich knowledge representation by leveraging the world knowledge captured in pretrained language models. Specifically, we present a universal training framework named Pretrain-KGE consisting of three phases: a semantic-based fine-tuning phase, a knowledge-extracting phase, and a KGE training phase. Extensive experiments show that Pretrain-KGE improves results over conventional KGE models, especially on the low-resource problem.
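The abstract only names the three phases; the sketch below makes the pipeline concrete. It is a minimal, illustrative PyTorch reading, not the authors' code: the toy DescriptionEncoder stands in for a fine-tuned pretrained LM (e.g., BERT) over entity/relation descriptions, TransE is assumed as the downstream KGE model, and all function names and hyperparameters here are our own assumptions.

```python
# Minimal sketch of the three-phase Pretrain-KGE pipeline (illustrative only).
import torch
import torch.nn as nn

EMB_DIM = 128
VOCAB_SIZE = 1000  # toy vocabulary for the sketch


class DescriptionEncoder(nn.Module):
    """Stand-in for a pretrained LM: maps description token ids to a
    single vector via mean pooling plus a linear projection."""
    def __init__(self, vocab_size=VOCAB_SIZE, dim=EMB_DIM):
        super().__init__()
        self.tok = nn.Embedding(vocab_size, dim)
        self.proj = nn.Linear(dim, dim)

    def forward(self, token_ids):  # (batch, seq_len) -> (batch, dim)
        return self.proj(self.tok(token_ids).mean(dim=1))


def transe_score(h, r, t):
    # TransE plausibility: ||h + r - t||_1, lower is better.
    return (h + r - t).norm(p=1, dim=-1)


# Phase 1: semantic-based fine-tuning.
# Fine-tune the encoder with a margin loss so that encoded descriptions of
# true triples score better than corrupted (negative-tail) triples.
def finetune_step(encoder, batch, optimizer, margin=1.0):
    h, r, t, t_neg = (encoder(x) for x in batch)  # each: (batch, dim)
    loss = torch.relu(margin + transe_score(h, r, t)
                      - transe_score(h, r, t_neg)).mean()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()


# Phase 2: knowledge extracting.
# Freeze the fine-tuned encoder and dump one vector per entity/relation.
@torch.no_grad()
def extract_embeddings(encoder, desc_token_ids):
    return encoder(desc_token_ids)  # (n_items, dim)


# Phase 3: KGE training.
# Initialize a conventional KGE model (TransE here) with the extracted
# vectors and train it as usual; the language model is no longer needed.
class TransE(nn.Module):
    def __init__(self, ent_init, rel_init):
        super().__init__()
        self.ent = nn.Embedding.from_pretrained(ent_init, freeze=False)
        self.rel = nn.Embedding.from_pretrained(rel_init, freeze=False)

    def forward(self, h_idx, r_idx, t_idx):
        return transe_score(self.ent(h_idx), self.rel(r_idx), self.ent(t_idx))
```

The key design point the framework relies on is that phase 3 is an ordinary KGE run; the pretrained model only supplies the initialization, so any scoring function (TransE, RotatE, etc.) could be swapped in.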
Pages: 259-266
Page count: 8
Related Papers
50 items in total
  • [1] Knowledge Graphs and Pretrained Language Models Enhanced Representation Learning for Conversational Recommender Systems. Qiu, Zhangchi; Tao, Ye; Pan, Shirui; Liew, Alan Wee-Chung. IEEE Transactions on Neural Networks and Learning Systems, 2024: 1-15.
  • [2] Multilingual Knowledge Graph Completion from Pretrained Language Models with Knowledge Constraints. Song, Ran; He, Shizhu; Gao, Shengxiang; Cai, Li; Liu, Kang; Yu, Zhengtao; Zhao, Jun. Findings of the Association for Computational Linguistics: ACL 2023, 2023: 7709-7721.
  • [3] Ensemble pretrained language models to extract biomedical knowledge from literature. Li, Zhao; Wei, Qiang; Huang, Liang-Chin; Li, Jianfu; Hu, Yan; Chuang, Yao-Shun; He, Jianping; Das, Avisha; Keloth, Vipina Kuttichi; Yang, Yuntao; Diala, Chiamaka S.; Roberts, Kirk E.; Tao, Cui; Jiang, Xiaoqian; Zheng, W. Jim; Xu, Hua. Journal of the American Medical Informatics Association, 2024, 31(9): 1904-1911.
  • [4] Eliciting Knowledge from Pretrained Language Models for Prototypical Prompt Verbalizer. Wei, Yinyi; Mo, Tong; Jiang, Yongtao; Li, Weiping; Zhao, Wen. Artificial Neural Networks and Machine Learning - ICANN 2022, Part II, 2022, 13530: 222-233.
  • [5] ReGen: Reinforcement Learning for Text and Knowledge Base Generation using Pretrained Language Models. Dognin, Pierre L.; Padhi, Inkit; Melnyk, Igor; Das, Payel. 2021 Conference on Empirical Methods in Natural Language Processing (EMNLP 2021), 2021: 1084-1099.
  • [6] An Embarrassingly Simple Approach for Transfer Learning from Pretrained Language Models. Chronopoulou, Alexandra; Baziotis, Christos; Potamianos, Alexandros. 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (NAACL-HLT 2019), Vol. 1, 2019: 2089-2095.
  • [7] BertNet: Harvesting Knowledge Graphs with Arbitrary Relations from Pretrained Language Models. Hao, Shibo; Tan, Bowen; Tang, Kaiwen; Ni, Bin; Shao, Xiyan; Zhang, Hengzhe; Xing, Eric P.; Hu, Zhiting. Findings of the Association for Computational Linguistics: ACL 2023, 2023: 5000-5015.
  • [8] Practical Takes on Federated Learning with Pretrained Language Models. Agarwal, Ankur; Rezagholizadeh, Mehdi; Parthasarathi, Prasanna. 17th Conference of the European Chapter of the Association for Computational Linguistics (EACL 2023), 2023: 454-471.
  • [9] Multilingual LAMA: Investigating Knowledge in Multilingual Pretrained Language Models. Kassner, Nora; Dufter, Philipp; Schütze, Hinrich. 16th Conference of the European Chapter of the Association for Computational Linguistics (EACL 2021), 2021: 3250-3258.
  • [10] Constructing Taxonomies from Pretrained Language Models. Chen, Catherine; Lin, Kevin; Klein, Dan. 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (NAACL-HLT 2021), 2021: 4687-4700.