MEGA: Meta-Graph Augmented Pre-Training Model for Knowledge Graph Completion

Cited by: 0
Authors
Wang, Yashen [1 ,2 ]
Ouyang, Xiaoye [1 ]
Guo, Dayu [3 ]
Zhu, Xiaoling [3 ]
Affiliations
[1] China Acad Elect & Informat Technol CETC, Natl Engn Lab Risk Percept & Prevent RRP, Beijing 100041, Peoples R China
[2] Informat Sci Acad CETC, Key Lab Cognit & Intelligence Technol CIT, Beijing 100041, Peoples R China
[3] CETC Acad Elect & Informat Technol Grp Co Ltd, Beijing 100041, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Knowledge graph completion; meta-graph; pre-training model; multi-task learning; semantic enhancement; CONCEPTUALIZATION;
DOI
10.1145/3617379
Chinese Library Classification
TP [automation and computer technology];
Discipline Code
0812;
Abstract
A large number of Knowledge Graph Completion (KGC) methods have been proposed using embedding-based approaches to overcome the incompleteness of knowledge graphs (KGs). One important recent innovation in the Natural Language Processing (NLP) domain is the use of deep neural models that make the most of pre-training, culminating in BERT, the most popular example of this line of approaches today. Recently, a series of KGC methods that introduce a pre-trained language model, such as KG-BERT, have been developed and have achieved compelling performance. However, previous pre-training-based KGC methods usually train the model with a simple training task and use only one-hop relational signals in the KG, so they cannot model high-order semantic contexts or multi-hop complex relatedness. To overcome this problem, this article presents a novel pre-training framework for the KGC task that consists of both a one-hop relation-level task (low-order) and a multi-hop meta-graph-level task (high-order). Hence, the proposed method can capture not only the elaborate sub-graph structure but also the subtle semantic information of the given KG. Empirical results show the efficiency of the proposed method on widely used real-world datasets.
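The abstract's multi-task idea (a low-order one-hop relation-level loss combined with a high-order multi-hop meta-graph-level loss) can be sketched as a toy joint objective. This is a minimal illustrative sketch only: the TransE-style scoring, the margin losses, and all entity/relation names are assumptions for illustration, not MEGA's actual architecture, which builds on a BERT-style pre-trained model.

```python
import math
import random

# Toy joint pre-training objective: one-hop (relation-level) task plus
# multi-hop (meta-graph-level) task, as described in the abstract.
# The scoring function and all names below are illustrative assumptions.

random.seed(0)
DIM = 8
entities = ["paris", "france", "europe"]
relations = ["capital_of", "located_in"]
emb = {name: [random.uniform(-1, 1) for _ in range(DIM)]
       for name in entities + relations}

def score_triple(h, r, t):
    """TransE-style plausibility: -||h + r - t|| (higher is more plausible)."""
    return -math.sqrt(sum((emb[h][i] + emb[r][i] - emb[t][i]) ** 2
                          for i in range(DIM)))

def one_hop_loss(triple, corrupt_tail, margin=1.0):
    """Low-order task: rank a true triple above a tail-corrupted one."""
    h, r, t = triple
    return max(0.0, margin - score_triple(h, r, t)
               + score_triple(h, r, corrupt_tail))

def meta_graph_loss(head, rels, target, corrupt_target, margin=1.0):
    """High-order task: compose relations along a multi-hop path and
    rank the true endpoint above a corrupted one."""
    def path_score(tail):
        return -math.sqrt(sum(
            (emb[head][i] + sum(emb[r][i] for r in rels) - emb[tail][i]) ** 2
            for i in range(DIM)))
    return max(0.0, margin - path_score(target) + path_score(corrupt_target))

# Joint multi-task objective over both granularities.
loss = (one_hop_loss(("paris", "capital_of", "france"), "europe")
        + meta_graph_loss("paris", ["capital_of", "located_in"],
                          "europe", "paris"))
print(f"joint pre-training loss: {loss:.3f}")
```

In training, such a combined loss would be minimized over many sampled triples and meta-graph paths so that the embeddings capture both one-hop and multi-hop structure.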
Pages: 24
Related Papers
(50 in total)
  • [21] Graph Pattern Entity Ranking Model for Knowledge Graph Completion
    Ebisu, Takuma
    Ichise, Ryutaro
    [J]. 2019 CONFERENCE OF THE NORTH AMERICAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS: HUMAN LANGUAGE TECHNOLOGIES (NAACL HLT 2019), VOL. 1, 2019, : 988 - 997
  • [22] Pre-training and diagnosing knowledge base completion models
    Kocijan, Vid
    Jang, Myeongjun
    Lukasiewicz, Thomas
    [J]. ARTIFICIAL INTELLIGENCE, 2024, 329
  • [23] Pre-training on Large-Scale Heterogeneous Graph
    Jiang, Xunqiang
    Jia, Tianrui
    Fang, Yuan
    Shi, Chuan
    Lin, Zhe
    Wang, Hui
    [J]. KDD '21: PROCEEDINGS OF THE 27TH ACM SIGKDD CONFERENCE ON KNOWLEDGE DISCOVERY & DATA MINING, 2021, : 756 - 766
  • [24] Graph Neural Pre-training for Recommendation with Side Information
    Liu, Siwei
    Meng, Zaiqiao
    Macdonald, Craig
    Ounis, Iadh
    [J]. ACM TRANSACTIONS ON INFORMATION SYSTEMS, 2023, 41 (03)
  • [25] KPGT: Knowledge-Guided Pre-training of Graph Transformer for Molecular Property Prediction
    Li, Han
    Zhao, Dan
    Zeng, Jianyang
    [J]. PROCEEDINGS OF THE 28TH ACM SIGKDD CONFERENCE ON KNOWLEDGE DISCOVERY AND DATA MINING, KDD 2022, 2022, : 857 - 867
  • [26] Graph Contrastive Multi-view Learning: A Pre-training Framework for Graph Classification
    Adjeisah, Michael
    Zhu, Xinzhong
    Xu, Huiying
    Ayall, Tewodros Alemu
[J]. KNOWLEDGE-BASED SYSTEMS, 2024, 299
  • [27] Pre-trained Language Model with Prompts for Temporal Knowledge Graph Completion
    Xu, Wenjie
    Liu, Ben
    Peng, Miao
    Jia, Xu
    Peng, Min
[J]. arXiv, 2023
  • [28] An Adaptive Graph Pre-training Framework for Localized Collaborative Filtering
    Wang, Yiqi
    Li, Chaozhuo
    Liu, Zheng
    Li, Mingzheng
    Tang, Jiliang
    Xie, Xing
    Chen, Lei
    Yu, Philip S.
    [J]. ACM TRANSACTIONS ON INFORMATION SYSTEMS, 2023, 41 (02)
  • [29] Unsupervised pre-training of graph transformers on patient population graphs
    Pellegrini, Chantal
    Navab, Nassir
    Kazi, Anees
    [J]. MEDICAL IMAGE ANALYSIS, 2023, 89
  • [30] Dynamic Scene Graph Generation via Anticipatory Pre-training
    Li, Yiming
    Yang, Xiaoshan
    Xu, Changsheng
    [J]. 2022 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2022, : 13864 - 13873