ALSTM: An attention-based long short-term memory framework for knowledge base reasoning

Cited by: 19
Authors
Wang, Qi [1 ]
Hao, Yongsheng [2 ]
Affiliations
[1] Fudan Univ, Sch Comp Sci, Software Engn, Shanghai, Peoples R China
[2] Nanjing Univ Informat Sci & Technol, Network Ctr, Nanjing, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Knowledge base; LSTM; Attention; Memory; Logical rule; Deep learning; DEEP NEURAL-NETWORKS; GAME; GO;
DOI
10.1016/j.neucom.2020.02.065
CLC Classification Code
TP18 [Artificial Intelligence Theory];
Discipline Classification Code
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Knowledge Graphs (KGs) have been applied in a variety of scenarios, including web search, question answering, recommender systems, and natural language processing. However, the vast majority of Knowledge Bases (KBs) are incomplete, creating a demand for KB completion (KBC). Mainstream KBC methods include latent factor models, random walk models, and recently popular reinforcement-learning-based approaches, each of which performs well in its own area of expertise. Recurrent neural networks (RNNs) and their variants model temporal data by retaining information over long periods; it remains unclear, however, whether they can also use that retained information to carry out complex reasoning over a knowledge graph. In this paper, we propose a novel framework (ALSTM) based on the attention mechanism and Long Short-Term Memory (LSTM), which couples structure learning with parameter learning of first-order logical rules in an end-to-end differentiable neural network model. Within this framework, we design a memory system and employ multi-head dot-product attention (MHDPA) to interact with and update the memories embedded in that system for reasoning. This mirrors the process of human cognition and reasoning: searching historical memory for guidance about the future. In addition, we explore the use of inductive biases in deep learning to facilitate the learning of entities, relations, and rules. Experiments establish the efficiency and effectiveness of our model and show that it outperforms baseline models on fact prediction and link prediction across several benchmark datasets, including WN18RR, FB15K-237, and NELL-995. (c) 2020 Elsevier B.V. All rights reserved.
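The abstract's core mechanism, updating a slot-based memory via multi-head dot-product attention, can be sketched as follows. This is a minimal illustration of generic MHDPA self-attention over a memory matrix, not the paper's actual ALSTM implementation: the slot count, dimensions, and random projection matrices are stand-ins for parameters that ALSTM learns end-to-end.

```python
import numpy as np

def mhdpa_update(memory, num_heads=4, rng=None):
    """One step of multi-head dot-product self-attention over a
    memory matrix of shape (slots, dim), returning the updated memory.
    Projection matrices are random placeholders for learned weights."""
    rng = np.random.default_rng(0) if rng is None else rng
    slots, dim = memory.shape
    assert dim % num_heads == 0
    head_dim = dim // num_heads
    # Random stand-ins for the learned query/key/value/output projections.
    Wq, Wk, Wv, Wo = (rng.standard_normal((dim, dim)) / np.sqrt(dim)
                      for _ in range(4))

    def split(x):  # (slots, dim) -> (heads, slots, head_dim)
        return x.reshape(slots, num_heads, head_dim).transpose(1, 0, 2)

    Q, K, V = split(memory @ Wq), split(memory @ Wk), split(memory @ Wv)
    # Scaled dot-product scores per head: (heads, slots, slots).
    scores = Q @ K.transpose(0, 2, 1) / np.sqrt(head_dim)
    # Row-wise softmax over the key dimension.
    w = np.exp(scores - scores.max(axis=-1, keepdims=True))
    w /= w.sum(axis=-1, keepdims=True)
    # Attend, re-concatenate heads, project, and apply a residual update
    # so memories are refined rather than overwritten.
    attended = (w @ V).transpose(1, 0, 2).reshape(slots, dim)
    return memory + attended @ Wo

memory = np.random.default_rng(1).standard_normal((8, 32))  # 8 slots, dim 32
updated = mhdpa_update(memory)
print(updated.shape)
```

Each memory slot attends to every other slot, so repeated applications let information propagate across the whole memory, which is the interaction the abstract describes.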
Pages: 342-351
Page count: 10