Knowledge Graph Completion via Complete Attention between Knowledge Graph and Entity Descriptions

Cited by: 2
Authors
Zhao, Minjun [1 ]
Zhao, Yawei [1 ]
Xu, Bing [2 ]
Affiliations
[1] Univ Chinese Acad Sci, Big Data Anal Technol Lab, Beijing, Peoples R China
[2] Knowlegene Data Technol Co Ltd, AI Lab Beijing, Beijing, Peoples R China
Keywords
Complete Attention; Link Prediction; Entity Descriptions; Deep Learning;
DOI
10.1145/3331453.3362056
CLC Classification
TP39 [Computer Applications]
Discipline Codes
081203; 0835
Abstract
Knowledge graph representation learning aims to encode both entities and relations into a continuous low-dimensional vector space. Previous methods usually assign an entity the same representation in every triple in which it appears. Since the same entity can carry different semantics in different triples, this paper proposes a method that learns triple-specific entity representations from entity description information under a complete attention (CATT) mechanism over the knowledge graph. In this way, an entity obtains a different representation, matching its semantics, in each triple. To encode entity descriptions, we employ three deep learning encoders: CNN, Bi-LSTM, and Transformer. Experimental results show that the proposed method significantly improves performance on both entity prediction and relation prediction.
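The paper's exact architecture is not reproduced here, but the core idea in the abstract — attending over an entity's description tokens with a triple-dependent query, so the same entity gets a different representation per triple — can be sketched minimally. All names and dimensions below are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(x - x.max())
    return e / e.sum()

def triple_specific_entity(triple_vec, desc_token_vecs):
    """Attend over description token embeddings with the triple
    context as query; returns a weighted sum of token vectors,
    so different triples yield different entity representations."""
    scores = desc_token_vecs @ triple_vec      # (T,) dot-product scores
    weights = softmax(scores)                  # attention distribution
    return weights @ desc_token_vecs           # (d,) context-aware entity vector

# Toy example: 5 description tokens, embedding dim 4.
rng = np.random.default_rng(0)
triple_vec = rng.normal(size=4)                # e.g. combined head/relation context
desc = rng.normal(size=(5, 4))                 # description token embeddings
rep = triple_specific_entity(triple_vec, desc)
```

Changing `triple_vec` (i.e. the triple the entity appears in) shifts the attention weights and thus the resulting entity vector, which is the effect the abstract describes; a real model would learn these embeddings and use a CNN, Bi-LSTM, or Transformer to produce `desc`.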
Pages: 6
Related Papers
50 records in total
  • [1] Augmenting Embedding Projection With Entity Descriptions for Knowledge Graph Completion
    Chen, Junfan
    Xu, Jie
    Bo, Manhui
    Tang, Hongwu
    [J]. IEEE ACCESS, 2021, 9 : 159955 - 159964
  • [2] Knowledge Graph Completion Based on Entity Descriptions in Hyperbolic Space
    Zhang, Xiaoming
    Tian, Dongjie
    Wang, Huiyong
    [J]. APPLIED SCIENCES-BASEL, 2023, 13 (01):
  • [3] Graph Pattern Entity Ranking Model for Knowledge Graph Completion
    Ebisu, Takuma
    Ichise, Ryutaro
    [J]. 2019 CONFERENCE OF THE NORTH AMERICAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS: HUMAN LANGUAGE TECHNOLOGIES (NAACL HLT 2019), VOL. 1, 2019, : 988 - 997
  • [4] A Contextualized Entity Representation for Knowledge Graph Completion
    Pu, Fei
    Yang, Bailin
    Ying, Jianchao
    You, Lizhou
    Xu, Chenou
    [J]. KNOWLEDGE SCIENCE, ENGINEERING AND MANAGEMENT (KSEM 2020), PT I, 2020, 12274 : 77 - 85
  • [5] Hierarchical Perceptual Graph Attention Network for Knowledge Graph Completion
    Han, Wenhao
    Liu, Xuemei
    Zhang, Jianhao
    Li, Hairui
    [J]. ELECTRONICS, 2024, 13 (04)
  • [6] Graph Attention Mechanism with Cardinality Preservation for Knowledge Graph Completion
    Ding, Cong
    Wei, Xiao
    Chen, Yongqi
    Zhao, Rui
    [J]. KNOWLEDGE SCIENCE, ENGINEERING AND MANAGEMENT, PT I, 2021, 12815 : 479 - 490
  • [7] Knowledge graph completion based on graph contrastive attention network
    Liu, Danyang
    Fang, Quan
    Zhang, Xiaowei
    Hu, Jun
    Qian, Shengsheng
    Xu, Changsheng
    [J]. Beijing Hangkong Hangtian Daxue Xuebao/Journal of Beijing University of Aeronautics and Astronautics, 2022, 48 (08): : 1428 - 1435
  • [8] Hyperbolic hierarchical graph attention network for knowledge graph completion
    Xu, Hao
    Chen, Shudong
    Qi, Donglin
    Tong, Da
    Yu, Yong
    Chen, Shuai
    [J]. HIGH TECHNOLOGY LETTERS, 2024, 30 (03) - 279
  • [9] Learning Entity Type Embeddings for Knowledge Graph Completion
    Moon, Changsung
    Jones, Paul
    Samatova, Nagiza F.
    [J]. CIKM'17: PROCEEDINGS OF THE 2017 ACM CONFERENCE ON INFORMATION AND KNOWLEDGE MANAGEMENT, 2017, : 2215 - 2218
  • [10] Learning Entity and Relation Embeddings for Knowledge Graph Completion
    Lin, Yankai
    Liu, Zhiyuan
    Sun, Maosong
    Liu, Yang
    Zhu, Xuan
    [J]. PROCEEDINGS OF THE TWENTY-NINTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2015, : 2181 - 2187