Talking-heads attention-based knowledge representation for link prediction

Times cited: 0
Authors
Wang, Shirui [1 ]
Zhou, Wen'an [1 ]
Zhou, Qiang [1 ]
Affiliations
[1] Beijing Univ Posts & Telecommun, Sch Comp Sci, Beijing 100876, Peoples R China
Source
Keywords
Knowledge representation; Link prediction; Talking-heads attention
DOI
10.1016/j.csl.2021.101340
Chinese Library Classification
TP18 [Artificial intelligence theory]
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
State-of-the-art methods for link prediction, also known as knowledge graph embedding, aim to embed both the entities and relations of a given knowledge graph (KG) into a continuous low-dimensional vector space, and can thus be used to fill in missing facts or identify spurious facts in KGs, where a fact is represented as a triple of the form (head entity, relation, tail entity). Most previous attempts learn triples independently and thus fail to exploit the rich inferential and semantic information hidden in the local neighbourhood surrounding each triple. To this end, this paper proposes a talking-heads attention-based knowledge representation method, a novel graph attention network-based method for link prediction that learns knowledge graph embeddings under talking-heads attention guidance from multi-hop neighbourhood triples. We evaluate our model on link prediction over the Freebase, WordNet and Kinship datasets; experimental results demonstrate that injecting the talking-heads attention mechanism better captures the semantic relationships among neighbouring triples and indeed achieves promising performance on link prediction.
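The abstract's key ingredient, talking-heads attention (Shazeer et al., 2020), differs from standard multi-head attention by adding two learned linear maps that mix information across the head dimension: one applied to the attention logits before the softmax, and one to the attention weights after it. A minimal NumPy sketch of that mechanism is below; the function name, shapes, and mixing matrices are illustrative assumptions, not the paper's actual model (which embeds this inside a graph attention network over neighbourhood triples).

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def talking_heads_attention(Q, K, V, P_logits, P_weights):
    """Talking-heads attention sketch (shapes are illustrative).

    Q, K: (h, n, d_k); V: (h, n, d_v) -- per-head queries/keys/values.
    P_logits, P_weights: (h, h) -- learned maps that mix the h heads,
    applied to the attention logits (pre-softmax) and to the attention
    weights (post-softmax), respectively. With identity matrices this
    reduces to ordinary multi-head scaled dot-product attention.
    """
    h, n, d_k = Q.shape
    # per-head scaled dot-product logits: (h, n, n)
    logits = Q @ K.transpose(0, 2, 1) / np.sqrt(d_k)
    # mix logits across heads: out_head g = sum_h P_logits[h, g] * logits[h]
    logits = np.einsum('hij,hg->gij', logits, P_logits)
    weights = softmax(logits, axis=-1)
    # mix the attention weights across heads as well
    weights = np.einsum('hij,hg->gij', weights, P_weights)
    return weights @ V  # (h, n, d_v)
```

With identity mixing matrices the output matches vanilla multi-head attention, which makes the mechanism easy to sanity-check before training the mixing matrices jointly with the rest of the model.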
Pages: 10
Related papers
50 records in total
  • [41] Fan Z., Song X., Jiang R., Chen Q., Shibasaki R. Decentralized attention-based personalized human mobility prediction. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, 2019, 3(04).
  • [42] Yuan W., Chen G., Chen C.Y.-C. FusionDTA: attention-based feature polymerizer and knowledge distillation for drug-target binding affinity prediction. Briefings in Bioinformatics, 2022, 23(01).
  • [43] Du X.-Y., Liu M.-W., Shen L.-W., Peng X. Survey on representation learning methods of knowledge graph for link prediction. Ruan Jian Xue Bao/Journal of Software, 2024, 35(01): 87-117.
  • [44] Zhang Q., Xu Y. Knowledge graph embedding with inverse function representation for link prediction. Engineering Applications of Artificial Intelligence, 2024, 127.
  • [45] Zhang N., Du Y., Deng K., Li L., Shen J., Sun G. Attention-based knowledge tracing with heterogeneous information network embedding. Knowledge Science, Engineering and Management (KSEM 2020), Pt I, 2020, 12274: 95-103.
  • [46] Zhou B., Chen Y., Liu K., Zhao J. Attention-based direct interaction model for knowledge graph embedding. Semantic Technology (JIST 2019), 2020, 1157: 100-108.
  • [47] Guo X., Lai J., Zheng Z., Lin C., Dai Y., Xu X., San H., Jia R., Zhang Z. Image feature learning combined with attention-based spectral representation for spatio-temporal photovoltaic power prediction. IET Computer Vision, 2023, 17(07): 777-794.
  • [48] Li M., Wang K., Wang Z., Wu H., Feng Z. An attention-based approach to rule learning in large knowledge graphs. Database Systems for Advanced Applications: DASFAA 2021 International Workshops, 2021, 12680: 154-165.
  • [49] Su T., Liang Q., Zhang J., Yu Z., Wang G., Liu X. Attention-based feature interaction for efficient online knowledge distillation. 2021 21st IEEE International Conference on Data Mining (ICDM 2021), 2021: 579-588.
  • [50] Gajbhiye A., Winterbottom T., Al Moubayed N., Bradley S. Bilinear fusion of commonsense knowledge with attention-based NLI models. Artificial Neural Networks and Machine Learning (ICANN 2020), Pt I, 2020, 12396: 633-646.