Learning Context-based Embeddings for Knowledge Graph Completion

Cited by: 0
Authors
Fei Pu
Zhongwei Zhang
Yan Feng
Bailin Yang
Affiliation
[1] School of Computer and Information Engineering, Zhejiang Gongshang University
DOI: Not available
Abstract
Purpose: Due to the inherent incompleteness of knowledge graphs (KGs), the task of predicting missing links between entities becomes important. Many previous approaches are static, which poses a notable problem: all meanings of a polysemous entity share one embedding vector. This study aims to propose a polysemous embedding approach, named KG embedding under relational contexts (ContE for short), for missing link prediction.

Design/methodology/approach: ContE models and infers different relationship patterns by considering the context of the relationship, which is implicit in the local neighborhood of the relationship. The forward and backward impacts of the relationship in ContE are mapped to two different embedding vectors, which represent the contextual information of the relationship. Then, according to the position of the entity, the entity's polysemous representation is obtained by adding its static embedding vector to the corresponding context vector of the relationship.

Findings: ContE is fully expressive; that is, given any ground truth over the triples, there are embedding assignments to entities and relations that can precisely separate the true triples from the false ones. ContE is capable of modeling four connectivity patterns: symmetry, antisymmetry, inversion and composition.

Research limitations: In practice, ContE needs a grid search to find the best hyperparameters, which is a time-consuming task. Sometimes, it requires longer entity vectors than some other models to achieve better performance.

Practical implications: ContE is a bilinear model, which is quite simple and could be applied to large-scale KGs. By considering the contexts of relations, ContE can distinguish the exact meaning of an entity in different triples, so that when performing compositional reasoning, it is capable of inferring the connectivity patterns of relations and achieves good performance on link prediction tasks.

Originality/value: ContE considers the contexts of entities in terms of their positions in triples and the relationships they link to. It decomposes a relation vector into two vectors, namely, a forward impact vector and a backward impact vector, in order to capture the relational contexts. ContE has the same low computational complexity as TransE. Therefore, it provides a new approach for contextualized knowledge graph embedding.
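The construction described in the abstract (a static entity embedding shifted by the relation's forward or backward context vector, then scored with a simple bilinear product) can be illustrated with a minimal sketch. The exact scoring function, dimensionality and training objective are not specified here, so the DistMult-style elementwise product, the variable names and all sizes below are assumptions for illustration only, not the authors' exact formulation.

```python
# Illustrative sketch of a ContE-style contextualized scoring function.
# NOTE: only the "static embedding + forward/backward relation context"
# construction follows the abstract; the bilinear (DistMult-like) score,
# all names and all sizes are placeholder assumptions.
import numpy as np

rng = np.random.default_rng(0)
dim, n_entities, n_relations = 50, 1000, 20

E   = rng.normal(scale=0.1, size=(n_entities, dim))   # static entity embeddings
R   = rng.normal(scale=0.1, size=(n_relations, dim))  # relation embeddings
R_f = rng.normal(scale=0.1, size=(n_relations, dim))  # forward impact (head-side context)
R_b = rng.normal(scale=0.1, size=(n_relations, dim))  # backward impact (tail-side context)

def score(h: int, r: int, t: int) -> float:
    """Score a triple (h, r, t); higher should mean more plausible after training."""
    h_ctx = E[h] + R_f[r]   # head entity contextualized by relation r
    t_ctx = E[t] + R_b[r]   # tail entity contextualized by relation r
    return float(np.sum(h_ctx * R[r] * t_ctx))   # assumed bilinear score

print(score(h=0, r=3, t=42))
```

Because the entity's effective vector depends on both the relation and its position (head or tail), the same entity can take different representations in different triples, which is the polysemy property the abstract emphasizes.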
Pages: 84-106 (23 pages)