Few-Shot Knowledge Graph Completion Model Based on Relation Learning

Cited by: 0
Authors
Li, Weijun [1,2]
Gu, Jianlai [2 ]
Li, Ang [2 ]
Gao, Yuxiao [2 ]
Zhang, Xinyong [2 ]
Affiliations
[1] North Minzu Univ, Key Lab Images & Grap Intelligent Proc, State Ethn Affairs Commiss, Yinchuan 750021, Peoples R China
[2] North Minzu Univ, Sch Comp Sci & Engn, Yinchuan 750021, Peoples R China
Source
APPLIED SCIENCES-BASEL | 2023, Vol. 13, Issue 17
Funding
National Natural Science Foundation of China;
Keywords
knowledge graph; knowledge graph completion; few-shot relation; neighborhood aggregation; link prediction;
DOI
10.3390/app13179513
Chinese Library Classification
O6 [Chemistry];
Discipline Classification Code
0703;
Abstract
Considering the complexity of entity-pair relations and the information contained in the target neighborhood of few-shot knowledge graphs (KGs), existing few-shot KG completion methods generally suffer from insufficient relation representation learning and neglect the contextual semantics of entities. To tackle these problems, we propose a Few-shot Relation Learning-based Knowledge Graph Completion model (FRL-KGC). First, a gating mechanism is introduced when aggregating the higher-order neighborhood information of entities, enriching the central entity representation while reducing the adverse effects of noisy neighbors. Second, during the relation representation learning stage, a more accurate relation representation is learned by exploiting the correlation between entity pairs in the reference set. Finally, an LSTM structure is incorporated into the Transformer learner to enhance its ability to learn the contextual semantics of entities and relations and to predict new factual knowledge. We conducted comparative experiments on the publicly available NELL-One and Wiki-One datasets, comparing FRL-KGC with six few-shot knowledge graph completion models and five traditional knowledge graph completion models on five-shot link prediction. The results show that FRL-KGC outperforms all comparison models in terms of MRR, Hits@10, Hits@5, and Hits@1.
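The abstract only outlines the gated higher-order neighborhood aggregation step, so the following is a minimal, illustrative PyTorch sketch of that idea rather than the authors' implementation: neighbor embeddings are pooled with attention, and a learned gate controls how much of the aggregated neighborhood flows into the central entity representation, which is how noisy neighbors could be down-weighted. The module name, dimensions, and the exact gating form are assumptions.

# Illustrative sketch only; not the released FRL-KGC code.
import torch
import torch.nn as nn
import torch.nn.functional as F


class GatedNeighborAggregator(nn.Module):
    def __init__(self, dim: int):
        super().__init__()
        self.attn = nn.Linear(2 * dim, 1)    # scores each (entity, neighbor) pair
        self.gate = nn.Linear(2 * dim, dim)  # produces an element-wise gate
        self.proj = nn.Linear(dim, dim)

    def forward(self, entity: torch.Tensor, neighbors: torch.Tensor) -> torch.Tensor:
        # entity:    (batch, dim)     central entity embeddings
        # neighbors: (batch, n, dim)  embeddings of sampled neighbors
        h = entity.unsqueeze(1).expand_as(neighbors)            # (batch, n, dim)
        scores = self.attn(torch.cat([h, neighbors], dim=-1))   # (batch, n, 1)
        alpha = F.softmax(scores, dim=1)
        agg = (alpha * neighbors).sum(dim=1)                    # (batch, dim)

        # Gate in [0, 1] decides how much neighborhood signal is mixed into
        # the central entity, limiting the influence of noisy neighbors.
        g = torch.sigmoid(self.gate(torch.cat([entity, agg], dim=-1)))
        return g * torch.tanh(self.proj(agg)) + (1.0 - g) * entity


if __name__ == "__main__":
    agg = GatedNeighborAggregator(dim=100)
    e = torch.randn(4, 100)      # 4 central entities
    nb = torch.randn(4, 8, 100)  # 8 sampled neighbors each
    print(agg(e, nb).shape)      # torch.Size([4, 100])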
Pages: 19