Learning Context-based Embeddings for Knowledge Graph Completion

Cited by: 0
Authors
Fei Pu
Zhongwei Zhang
Yan Feng
Bailin Yang
Institution
[1] School of Computer and Information Engineering, Zhejiang Gongshang University
Keywords
DOI
Not available
CLC Number
Subject Classification Code
Abstract
Purpose: Due to the incomplete nature of knowledge graphs (KGs), the task of predicting missing links between entities becomes important. Many previous approaches are static, which poses a notable problem: all meanings of a polysemous entity share a single embedding vector. This study proposes a polysemous embedding approach, named KG embedding under relational contexts (ContE for short), for missing link prediction.
Design/methodology/approach: ContE models and infers different relation patterns by considering the context of a relation, which is implicit in the relation's local neighborhood. The forward and backward impacts of the relation are mapped to two different embedding vectors that represent the relation's contextual information. Then, according to the entity's position in the triple, its polysemous representation is obtained by adding its static embedding vector to the corresponding context vector of the relation.
Findings: ContE is fully expressive; that is, given any ground truth over the triples, there is an assignment of embeddings to entities and relations that precisely separates the true triples from the false ones. ContE is capable of modeling four connectivity patterns: symmetry, antisymmetry, inversion and composition.
Research limitations: In practice, ContE requires a grid search to find the hyperparameters that give its best performance, which is time-consuming. It also sometimes needs longer entity vectors than other models to perform well.
Practical implications: ContE is a bilinear model, simple enough to be applied to large-scale KGs. By considering the contexts of relations, ContE can distinguish the exact meaning of an entity in different triples, so that when performing compositional reasoning it is able to infer the connectivity patterns of relations, and it achieves good performance on link prediction tasks.
Originality/value: ContE considers the contexts of entities in terms of their positions in triples and the relations they are linked to. It decomposes a relation vector into two vectors, namely a forward impact vector and a backward impact vector, in order to capture the relational contexts. ContE has the same low computational complexity as TransE. Therefore, it provides a new approach to contextualized knowledge graph embedding.
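As a rough illustration of the mechanism described in the abstract, the sketch below contextualizes head and tail entities with a relation's forward and backward impact vectors and scores a triple with an elementwise (DistMult-style) bilinear product. This is a minimal sketch only, not the authors' implementation: the array names (E, R, R_fwd, R_bwd), the exact scoring form, and which context vector attaches to which entity position are assumptions based on the abstract's wording.

```python
import numpy as np

# Hypothetical sizes for illustration only.
dim, n_ent, n_rel = 100, 5000, 200
rng = np.random.default_rng(0)

E     = rng.normal(scale=0.1, size=(n_ent, dim))  # static entity embeddings
R     = rng.normal(scale=0.1, size=(n_rel, dim))  # relation embeddings
R_fwd = rng.normal(scale=0.1, size=(n_rel, dim))  # forward-impact context vectors
R_bwd = rng.normal(scale=0.1, size=(n_rel, dim))  # backward-impact context vectors

def score(h: int, r: int, t: int) -> float:
    """Plausibility score of the triple (h, r, t) under this sketch."""
    # Contextualize each entity according to its position in the triple:
    # the static entity vector is added to one of the relation's context
    # vectors (which context vector goes to which position is an assumption).
    h_ctx = E[h] + R_fwd[r]
    t_ctx = E[t] + R_bwd[r]
    # Elementwise bilinear interaction: O(d) per triple, the same order
    # of complexity as a TransE-style score.
    return float(np.sum(h_ctx * R[r] * t_ctx))

# Example: score one triple (indices are arbitrary).
print(score(0, 3, 42))
```

Because the contextualized vectors depend on the relation, the same entity can receive different effective representations in different triples, which is the polysemy effect the abstract describes.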
Pages: 84 - 106
Page count: 23
Related Papers
50 records in total
  • [1] Learning Context-based Embeddings for Knowledge Graph Completion
    Pu, Fei
    Zhang, Zhongwei
    Feng, Yan
    Yang, Bailin
    JOURNAL OF DATA AND INFORMATION SCIENCE, 2022, 7 (02) : 84 - 106
  • [2] Learning Entity Type Embeddings for Knowledge Graph Completion
    Moon, Changsung
    Jones, Paul
    Samatova, Nagiza F.
    CIKM'17: PROCEEDINGS OF THE 2017 ACM CONFERENCE ON INFORMATION AND KNOWLEDGE MANAGEMENT, 2017, : 2215 - 2218
  • [3] Learning Entity and Relation Embeddings for Knowledge Graph Completion
    Lin, Yankai
    Liu, Zhiyuan
    Sun, Maosong
    Liu, Yang
    Zhu, Xuan
    PROCEEDINGS OF THE TWENTY-NINTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2015, : 2181 - 2187
  • [4] Medical Knowledge Graph Completion Based on Word Embeddings
    Gao, Mingxia
    Lu, Jianguo
    Chen, Furong
    INFORMATION, 2022, 13 (04)
  • [5] Scalable Learning of Entity and Predicate Embeddings for Knowledge Graph Completion
    Minervini, Pasquale
    Fanizzi, Nicola
    d'Amato, Claudia
    Esposito, Floriana
    2015 IEEE 14TH INTERNATIONAL CONFERENCE ON MACHINE LEARNING AND APPLICATIONS (ICMLA), 2015, : 162 - 167
  • [6] Triple Context-Based Knowledge Graph Embedding
    Gao, Huan
    Shi, Jun
    Qi, Guilin
    Wang, Meng
    IEEE ACCESS, 2018, 6 : 58978 - 58989
  • [7] Translation-Based Embeddings with Octonion for Knowledge Graph Completion
    Yu, Mei
    Bai, Chen
    Yu, Jian
    Zhao, Mankun
    Xu, Tianyi
    Liu, Hongwei
    Li, Xuewei
    Yu, Ruiguo
    APPLIED SCIENCES-BASEL, 2022, 12 (08)
  • [8] Hyperbolic Knowledge Graph Embeddings for Knowledge Base Completion
    Kolyvakis, Prodromos
    Kalousis, Alexandros
    Kiritsis, Dimitris
    SEMANTIC WEB (ESWC 2020), 2020, 12123 : 199 - 214
  • [9] Context-based surface completion
    Sharf, A
    Alexa, M
    Cohen-Or, D
    ACM TRANSACTIONS ON GRAPHICS, 2004, 23 (03) : 878 - 887