Learning attention-based representations from multiple patterns for relation prediction in knowledge graphs

Cited by: 2
Authors
Lourenco, Vitor [1 ]
Paes, Aline [1 ]
Affiliations
[1] Univ Fed Fluminense, Inst Comp, Ave Gal Milton Tavares Souza, s/n, Boa Viagem, Niteroi, RJ, Brazil
Keywords
Knowledge graphs; Representation learning; Embeddings; Attention mechanism;
DOI
10.1016/j.knosys.2022.109232
Chinese Library Classification (CLC)
TP18 [Theory of Artificial Intelligence];
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Knowledge bases, and their representations in the form of knowledge graphs (KGs), are naturally incomplete. Since scientific and industrial applications have extensively adopted them, there is a high demand for solutions that complete their information. Several recent works tackle this challenge by learning embeddings for entities and relations, then employing them to predict new relations among the entities. Despite their advances, most of those methods focus only on the local neighborhood of a relation to learn the embeddings. As a result, they may fail to capture the KGs' context information, neglecting long-term dependencies and the propagation of entities' semantics. In this manuscript, we propose ÆMP (Attention-based Embeddings from Multiple Patterns), a novel model for learning contextualized representations by: (i) acquiring entities' context information through an attention-enhanced message-passing scheme, which captures the entities' local semantics while focusing on different aspects of their neighborhood; and (ii) capturing the semantic context by leveraging the paths between entities and the relations along them. Our empirical findings draw insights into how attention mechanisms can improve entities' context representation and how combining entity and semantic-path contexts improves the overall representation of entities and the relation predictions. Experimental results on several large and small knowledge graph benchmarks show that ÆMP either outperforms or competes with state-of-the-art relation prediction methods. (c) 2022 Elsevier B.V. All rights reserved.
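To make the abstract's first component concrete, below is a minimal, hypothetical sketch (in PyTorch) of an attention-enhanced message-passing layer of the kind it outlines: each (head, relation, tail) triple produces a message, attention scores are normalized per head entity, and the weighted messages update the entity embeddings. The class name AttentiveKGLayer, the scoring function, and all shapes are illustrative assumptions, not the authors' ÆMP implementation.

# Minimal, hypothetical sketch of attention-enhanced message passing over a KG.
# All names and design choices are assumptions, not the ÆMP implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class AttentiveKGLayer(nn.Module):
    # One round of attention-weighted message passing over (head, relation, tail) triples.
    def __init__(self, dim: int):
        super().__init__()
        self.msg = nn.Linear(3 * dim, dim)  # message computed from the concatenated triple
        self.att = nn.Linear(3 * dim, 1)    # unnormalized attention score per triple

    def forward(self, ent_emb, rel_emb, triples):
        # ent_emb: (n_entities, dim), rel_emb: (n_relations, dim), triples: (T, 3) LongTensor
        h, r, t = triples[:, 0], triples[:, 1], triples[:, 2]
        x = torch.cat([ent_emb[h], rel_emb[r], ent_emb[t]], dim=-1)   # (T, 3*dim)
        messages = torch.tanh(self.msg(x))                            # (T, dim)
        scores = F.leaky_relu(self.att(x)).squeeze(-1)                # (T,)

        # Normalize attention scores per head entity, then sum the weighted messages.
        weights = torch.zeros_like(scores)
        for e in torch.unique(h):
            mask = h == e
            weights[mask] = F.softmax(scores[mask], dim=0)
        out = torch.zeros_like(ent_emb)
        out.index_add_(0, h, weights.unsqueeze(-1) * messages)

        # Residual update followed by renormalization of the entity embeddings.
        return F.normalize(ent_emb + out, dim=-1)

# Tiny usage example with random embeddings and three triples.
if __name__ == "__main__":
    n_ent, n_rel, dim = 5, 2, 8
    ent = torch.randn(n_ent, dim)
    rel = torch.randn(n_rel, dim)
    triples = torch.tensor([[0, 0, 1], [0, 1, 2], [3, 0, 4]])
    layer = AttentiveKGLayer(dim)
    print(layer(ent, rel, triples).shape)  # torch.Size([5, 8])

Stacking several such layers would let an entity's representation absorb information from increasingly distant neighbors, which is one plausible way to read the abstract's point about long-term dependencies; the path-based semantic context (component ii) is not sketched here.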
Pages: 12