Pretrain-KGE: Learning Knowledge Representation from Pretrained Language Models

Cited: 0
Authors
Zhang, Zhiyuan [1 ]
Liu, Xiaoqian [1 ,2 ]
Zhang, Yi [1 ]
Su, Qi [1 ,2 ]
Sun, Xu [1 ]
He, Bin [3 ]
Affiliations
[1] Peking Univ, Sch EECS, MOE Key Lab Computat Linguist, Beijing, Peoples R China
[2] Peking Univ, Sch Foreign Languages, Beijing, Peoples R China
[3] Huawei Noahs Ark Lab, Shenzhen, Peoples R China
Keywords
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial intelligence theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Conventional knowledge graph embedding (KGE) often suffers from limited knowledge representation, leading to degraded performance, especially in low-resource settings. To remedy this, we propose to enrich knowledge representation by leveraging the world knowledge captured in pretrained language models. Specifically, we present a universal training framework named Pretrain-KGE consisting of three phases: a semantic-based fine-tuning phase, a knowledge extracting phase, and a KGE training phase. Extensive experiments show that the proposed Pretrain-KGE improves results over conventional KGE models, especially on the low-resource problem.
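The abstract only names the three phases, so the following is a minimal sketch of the general idea rather than the authors' implementation: a pretrained language model encodes entity and relation descriptions (knowledge extracting), and the resulting vectors initialize a KGE model that is then trained on triples (KGE training). The choice of `bert-base-uncased`, the TransE-style scorer, the toy entity/relation texts, and the `encode` helper are illustrative assumptions.

```python
# Sketch of the Pretrain-KGE idea (illustrative, not the paper's code):
# encode descriptions with a pretrained LM, initialize a TransE-style model
# with those vectors, then train with a margin ranking loss.
import torch
import torch.nn as nn
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")  # assumed encoder
encoder = AutoModel.from_pretrained("bert-base-uncased")

def encode(texts):
    """Knowledge extracting phase: mean-pool LM token states into one vector per text."""
    batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        hidden = encoder(**batch).last_hidden_state            # (B, T, 768)
    mask = batch["attention_mask"].unsqueeze(-1).float()
    return (hidden * mask).sum(1) / mask.sum(1)                 # (B, 768)

# Toy knowledge graph: descriptions plus (head, relation, tail) id triples.
entity_texts = ["Beijing, capital city of China", "China, country in East Asia"]
relation_texts = ["capital of"]
triples = torch.tensor([[0, 0, 1]])                             # Beijing --capital of--> China

class TransE(nn.Module):
    """KGE training phase: embeddings initialized from the pretrained-LM vectors."""
    def __init__(self, ent_init, rel_init):
        super().__init__()
        self.ent = nn.Embedding.from_pretrained(ent_init, freeze=False)
        self.rel = nn.Embedding.from_pretrained(rel_init, freeze=False)

    def score(self, h, r, t):
        # TransE scoring: smaller ||h + r - t|| means a more plausible triple.
        return (self.ent(h) + self.rel(r) - self.ent(t)).norm(p=1, dim=-1)

model = TransE(encode(entity_texts), encode(relation_texts))
optim = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MarginRankingLoss(margin=1.0)

for _ in range(10):
    h, r, t = triples[:, 0], triples[:, 1], triples[:, 2]
    t_neg = torch.randint(len(entity_texts), t.shape)           # corrupt tails for negatives
    pos, neg = model.score(h, r, t), model.score(h, r, t_neg)
    loss = loss_fn(neg, pos, torch.ones_like(pos))              # push pos distance below neg
    optim.zero_grad()
    loss.backward()
    optim.step()
```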
Pages: 259-266
Number of pages: 8