Relational Graph Neural Network with Hierarchical Attention for Knowledge Graph Completion

Cited by: 0
Authors
Zhang, Zhao [1 ,2 ]
Zhuang, Fuzhen [1 ,2 ]
Zhu, Hengshu [3 ]
Shi, Zhiping [4 ]
Xiong, Hui [3 ,5 ]
He, Qing [1 ,2 ]
Affiliations
[1] Chinese Acad Sci, Inst Comp Technol, Key Lab Intelligent Informat Proc, CAS, Beijing 100190, Peoples R China
[2] Univ Chinese Acad Sci, Beijing 100049, Peoples R China
[3] Baidu Inc, Baidu Talent Intelligence Ctr, Beijing, Peoples R China
[4] Capital Normal Univ, Beijing 100048, Peoples R China
[5] Baidu Inc, Business Intelligence Lab, Beijing, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
DOI
Not available
CLC Number
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
The rapid proliferation of knowledge graphs (KGs) has changed the paradigm for various AI-related applications. Despite their large sizes, modern KGs are far from complete and comprehensive. This has motivated research on knowledge graph completion (KGC), which aims to infer missing values in incomplete knowledge triples. However, most existing KGC models treat the triples in KGs independently, without leveraging the inherent and valuable information from the local neighborhood surrounding an entity. To this end, we propose a Relational Graph neural network with Hierarchical ATtention (RGHAT) for the KGC task. The proposed model is equipped with a two-level attention mechanism: (i) the first level is relation-level attention, inspired by the intuition that different relations have different weights for indicating an entity; (ii) the second level is entity-level attention, which enables our model to highlight the importance of different neighboring entities under the same relation. The hierarchical attention mechanism makes our model more effective at utilizing the neighborhood information of an entity. Finally, we extensively validate the superiority of RGHAT against various state-of-the-art baselines.
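The two-level aggregation described in the abstract can be illustrated with a toy sketch. This is an assumption-laden simplification, not the paper's formulation: scores are plain dot products over random toy embeddings, whereas RGHAT uses learned transformations and trained attention parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    # numerically stable softmax over a 1-D score vector
    e = np.exp(x - x.max())
    return e / e.sum()

dim = 4
entity_emb = rng.normal(size=dim)  # embedding of the central entity

# Toy neighborhood: relation "r1" links two neighbors, "r2" links one.
# (Hypothetical relation/entity names for illustration only.)
neighbors = {
    "r1": [rng.normal(size=dim), rng.normal(size=dim)],
    "r2": [rng.normal(size=dim)],
}
relation_emb = {"r1": rng.normal(size=dim), "r2": rng.normal(size=dim)}

# Entity-level attention: within each relation, weight neighboring entities
# by a (simplified) dot-product score against the central entity.
rel_messages = {}
for r, ents in neighbors.items():
    weights = softmax(np.array([entity_emb @ e for e in ents]))
    rel_messages[r] = sum(w * e for w, e in zip(weights, ents))

# Relation-level attention: weight each relation's aggregated message,
# reflecting that different relations indicate the entity to different degrees.
rels = list(rel_messages)
rel_weights = softmax(np.array([entity_emb @ relation_emb[r] for r in rels]))
aggregated = sum(w * rel_messages[r] for w, r in zip(rel_weights, rels))

print(aggregated.shape)  # (4,)
```

The hierarchy matters because a flat attention over all neighbors would let a relation with many neighbors dominate purely by count; normalizing within each relation first, then across relations, keeps the two signals separate.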
Pages: 9612 - 9619
Page count: 8
Related Papers
50 records in total
  • [31] A contrastive knowledge graph embedding model with hierarchical attention and dynamic completion
    Shang, Bin
    Zhao, Yinliang
    Liu, Jun
    Liu, Yifan
    Wang, Chenxin
    [J]. NEURAL COMPUTING & APPLICATIONS, 2023, 35 (20): 15005 - 15018
  • [33] A structure distinguishable graph attention network for knowledge base completion
    Zhou, Xue
    Hui, Bei
    Zhang, Lizong
    Ji, Kexi
    [J]. NEURAL COMPUTING & APPLICATIONS, 2021, 33 (23): 16005 - 16017
  • [34] Disentangled Hierarchical Attention Graph Neural Network for Recommendation
    He, Weijie
    Ouyang, Yuanxin
    Peng, Keqin
    Rong, Wenge
    Xiong, Zhang
    [J]. ADVANCED INTELLIGENT COMPUTING TECHNOLOGY AND APPLICATIONS, PT I, ICIC 2024, 2024, 14875 : 415 - 426
  • [36] Commonsense Knowledge Base Completion with Relational Graph Attention Network and Pre-trained Language Model
    Ju, Jinghao
    Yang, Deqing
    Liu, Jingping
    [J]. PROCEEDINGS OF THE 31ST ACM INTERNATIONAL CONFERENCE ON INFORMATION AND KNOWLEDGE MANAGEMENT, CIKM 2022, 2022, : 4104 - 4108
  • [37] Relational Message Passing for Knowledge Graph Completion
    Wang, Hongwei
    Ren, Hongyu
    Leskovec, Jure
    [J]. KDD '21: PROCEEDINGS OF THE 27TH ACM SIGKDD CONFERENCE ON KNOWLEDGE DISCOVERY & DATA MINING, 2021, : 1697 - 1707
  • [38] Graph Attention Mechanism with Cardinality Preservation for Knowledge Graph Completion
    Ding, Cong
    Wei, Xiao
    Chen, Yongqi
    Zhao, Rui
    [J]. KNOWLEDGE SCIENCE, ENGINEERING AND MANAGEMENT, PT I, 2021, 12815 : 479 - 490
  • [40] Semantic- and relation-based graph neural network for knowledge graph completion
    Li, Xinlu
    Tian, Yujie
    Ji, Shengwei
    [J]. APPLIED INTELLIGENCE, 2024, 54 (08) : 6085 - 6107