Sequence Embedding for Zero or Low Resource Knowledge Graph Completion

Cited by: 0
Authors
Du, Zhijuan [1 ,2 ]
Affiliations
[1] Inner Mongolia Univ, Hohhot 010021, Peoples R China
[2] Inner Mongolia Discipline Inspect & Supervis Big, Hohhot 010015, Peoples R China
Keywords
Knowledge graph; Zero/low resource; Structure sequence; Multi-head attention; Non-parametric; Adversarial learning
DOI
10.1007/978-3-030-73194-6_20
Chinese Library Classification
TP18 [Artificial intelligence theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Knowledge graph completion (KGC) has been proposed to improve KGs by filling in missing links. Previous KGC approaches require a large number of training instances (entities and relations) and hold a closed-world assumption. In reality, very few instances are available, and KGs evolve quickly, with new entities and relations added by the minute; these newly added cases are zero resource at training time. In this work, we propose a Sequence Embedding with Adversarial learning approach (SEwA) for zero or low resource KGC. It transforms KGC into a sequence prediction problem by making full use of the inherent link structure of the knowledge graph and the easily transferable nature of adversarial contextual embeddings. Specifically, triples (<h, r, t>) and higher-order triples (<h, p, t>) containing paths (p = r_1 -> ... -> r_n) are represented as word sequences and encoded by a pre-training model with multi-head self-attention. The paths are obtained by non-parametric learning based on one-class classification of relation trees. The zero and low resource issues are further addressed by adversarial learning. Finally, SEwA is evaluated on low resource datasets and open-world datasets.
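To illustrate the core idea described in the abstract, the following is a minimal sketch (not the authors' implementation) of treating a triple <h, r, t> or a higher-order triple <h, p, t> with path p = r_1 -> ... -> r_n as a token sequence and encoding it with multi-head self-attention. The vocabulary, embedding dimension, mean pooling, and linear scoring head are assumed choices for illustration only.

```python
# Illustrative sketch only: sequence encoding of (higher-order) triples
# with multi-head self-attention, as sketched in the SEwA abstract.
import torch
import torch.nn as nn


class SequenceTripleEncoder(nn.Module):
    def __init__(self, vocab_size: int, dim: int = 64, num_heads: int = 4):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)            # entity/relation token embeddings
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.score = nn.Linear(dim, 1)                        # hypothetical plausibility score

    def forward(self, token_ids: torch.Tensor) -> torch.Tensor:
        # token_ids: (batch, seq_len), e.g. [h, r_1, ..., r_n, t]
        x = self.embed(token_ids)
        x, _ = self.attn(x, x, x)                             # multi-head self-attention over the sequence
        return self.score(x.mean(dim=1)).squeeze(-1)          # mean-pool, then score the sequence


# Toy usage: a direct triple <h, r, t> and a higher-order triple <h, r1 -> r2, t>.
vocab = {"h": 0, "t": 1, "r": 2, "r1": 3, "r2": 4}
encoder = SequenceTripleEncoder(vocab_size=len(vocab))
triple = torch.tensor([[vocab["h"], vocab["r"], vocab["t"]]])
path_triple = torch.tensor([[vocab["h"], vocab["r1"], vocab["r2"], vocab["t"]]])
print(encoder(triple), encoder(path_triple))
```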
Pages: 290-306
Number of pages: 17