MEGA: Meta-Graph Augmented Pre-Training Model for Knowledge Graph Completion

Cited by: 0
Authors
Wang, Yashen [1 ,2 ]
Ouyang, Xiaoye [1 ]
Guo, Dayu [3 ]
Zhu, Xiaoling [3 ]
Affiliations
[1] China Acad Elect & Informat Technol CETC, Natl Engn Lab Risk Percept & Prevent RRP, Beijing 100041, Peoples R China
[2] Informat Sci Acad CETC, Key Lab Cognit & Intelligence Technol CIT, Beijing 100041, Peoples R China
[3] CETC Acad Elect & Informat Technol Grp Co Ltd, Beijing 100041, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Knowledge graph completion; meta-graph; pre-training model; multi-task learning; semantic enhancement; CONCEPTUALIZATION;
DOI
10.1145/3617379
CLC Number
TP [Automation Technology, Computer Technology];
Discipline Code
0812;
Abstract
Nowadays, a large number of Knowledge Graph Completion (KGC) methods have been proposed using embedding-based approaches to overcome the incompleteness problem faced by knowledge graphs (KGs). One important recent innovation in the Natural Language Processing (NLP) domain is the use of deep neural models that make the most of pre-training, culminating in BERT, the most popular example of this line of approaches today. Recently, a series of new KGC methods that introduce a pre-trained language model, such as KG-BERT, have been developed and achieve compelling performance. However, previous pre-training-based KGC methods usually train the model with a simple training task and utilize only one-hop relational signals in the KG, so they cannot model high-order semantic contexts or multi-hop complex relatedness. To overcome this problem, this article presents a novel pre-training framework for the KGC task that consists of both a one-hop relation-level task (low-order) and a multi-hop meta-graph-level task (high-order). Hence, the proposed method can capture not only the elaborate sub-graph structure but also the subtle semantic information of the given KG. Empirical results on widely used real-world datasets demonstrate the effectiveness of the proposed method.
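The abstract gives no implementation details, but the two-level multi-task objective it describes (a low-order one-hop relation-level task trained jointly with a high-order multi-hop meta-graph-level task over a shared encoder) can be sketched as follows. This is a minimal illustrative sketch in PyTorch: all names (TwoLevelKGCPretrainer, multitask_loss), the small Transformer encoder, the input layout, and the loss weighting alpha are assumptions for illustration, not the paper's actual architecture.

import torch
import torch.nn as nn

class TwoLevelKGCPretrainer(nn.Module):
    """Hypothetical sketch: one shared encoder, two task heads, one for the
    one-hop (low-order) triple task and one for the multi-hop (high-order)
    meta-graph path task."""

    def __init__(self, num_entities, num_relations, dim=128, max_len=9):
        super().__init__()
        self.ent = nn.Embedding(num_entities, dim)
        self.rel = nn.Embedding(num_relations, dim)
        self.pos = nn.Embedding(max_len, dim)
        layer = nn.TransformerEncoderLayer(d_model=dim, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.triple_head = nn.Linear(dim, 1)  # low-order task head
        self.meta_head = nn.Linear(dim, 1)    # high-order task head

    def encode(self, tokens):
        # tokens: (batch, seq_len, dim) interleaved entity/relation embeddings
        pos = self.pos(torch.arange(tokens.size(1), device=tokens.device))
        return self.encoder(tokens + pos).mean(dim=1)  # mean-pooled summary

    def triple_logit(self, h, r, t):
        # One-hop relation-level task: plausibility of a (h, r, t) triple.
        toks = torch.stack([self.ent(h), self.rel(r), self.ent(t)], dim=1)
        return self.triple_head(self.encode(toks)).squeeze(-1)

    def metagraph_logit(self, ents, rels):
        # Multi-hop meta-graph-level task: plausibility of an interleaved
        # path e0 -r0-> e1 -r1-> e2 sampled from a meta-graph instance.
        # ents: (batch, hops + 1), rels: (batch, hops) of integer ids.
        toks = []
        for i in range(rels.size(1)):
            toks += [self.ent(ents[:, i]), self.rel(rels[:, i])]
        toks.append(self.ent(ents[:, -1]))
        return self.meta_head(self.encode(torch.stack(toks, dim=1))).squeeze(-1)

def multitask_loss(model, triples, paths, y_triple, y_path, alpha=0.5):
    # Joint pre-training objective: weighted sum of both levels' losses.
    bce = nn.BCEWithLogitsLoss()
    low = bce(model.triple_logit(*triples), y_triple)
    high = bce(model.metagraph_logit(*paths), y_path)
    return alpha * low + (1.0 - alpha) * high

In such a setup, negative (corrupted) triples and paths would supply the zero labels; in the actual model the shared encoder would presumably be a pre-trained BERT-style network rather than the small Transformer used here for brevity.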
Pages: 24
Related Papers
50 records in total
  • [1] Zhu, Huashi; Xu, Dexuan; Huang, Yu; Jin, Zhi; Ding, Weiping; Tong, Jiahui; Chong, Guoshuang. Graph Structure Enhanced Pre-Training Language Model for Knowledge Graph Completion. IEEE Transactions on Emerging Topics in Computational Intelligence, 2024, 8(4): 2697-2708.
  • [2] Shang, Junyuan; Ma, Tengfei; Xiao, Cao; Sun, Jimeng. Pre-training of Graph Augmented Transformers for Medication Recommendation. Proceedings of the Twenty-Eighth International Joint Conference on Artificial Intelligence, 2019: 5953-5959.
  • [3] Yuan, Xiaowei; Liu, Kang; Wang, Yequan. Contrastive Language-knowledge Graph Pre-training. ACM Transactions on Asian and Low-Resource Language Information Processing, 2024, 23(4).
  • [4] Yu, Donghan; Zhu, Chenguang; Yang, Yiming; Zeng, Michael. JAKET: Joint Pre-training of Knowledge Graph and Language Understanding. Thirty-Sixth AAAI Conference on Artificial Intelligence (AAAI 2022), 2022: 11630-11638.
  • [5] Zhu, Menglin; Qiu, Liqing; Zhou, Jingcheng. Meta-path structured graph pre-training for improving knowledge tracing in intelligent tutoring. Expert Systems with Applications, 2024, 254.
  • [6] He, Peng; Zhou, Gang; Yao, Yao; Wang, Zhe; Yang, Hao. A type-augmented knowledge graph embedding framework for knowledge graph completion. Scientific Reports, 2023, 13(1).
  • [7] Ye, Ganqiang; Zhang, Wen; Bi, Zhen; Wong, Chi Man; Chen, Hui; Chen, Huajun. Improving Knowledge Graph Representation Learning by Structure Contextual Pre-training. Proceedings of the 10th International Joint Conference on Knowledge Graphs (IJCKG 2021), 2021: 151-155.
  • [8] Agarwal, Oshin; Ge, Heming; Shakeri, Siamak; Al-Rfou, Rami. Knowledge Graph Based Synthetic Corpus Generation for Knowledge-Enhanced Language Model Pre-training. 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (NAACL-HLT 2021), 2021: 3554-3565.
  • [9] Ouyang, Xinwang; Chen, Hongmei; Yang, Peizhong; Wang, Lizhen; Xiao, Qing. Attributed Heterogeneous Graph Embedding with Meta-graph Attention. Web and Big Data (APWeb-WAIM 2024), Part III, 2024, 14963: 129-144.