Learning Knowledge Graph Embedding with Batch Circle Loss

Cited by: 0
Authors
Wu, Yang [1 ]
Huang, Wenli [1 ]
Hui, Siqi [1 ]
Wang, Jinjun [1 ]
Affiliations
[1] Xi An Jiao Tong Univ, Inst Artificial Intelligence & Robot, Xian, Peoples R China
Keywords
knowledge graph embedding; link prediction; loss function;
DOI
10.1109/IJCNN55064.2022.9892213
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Knowledge Graph Embedding (KGE) is the process of learning low-dimensional representations for the entities and relations in a knowledge graph. It is a critical component of Knowledge Graph (KG) link prediction and knowledge discovery. Many works focus on designing proper score functions for KGE, while the loss function has attracted relatively little attention. In this paper, we focus on improving the loss function used when learning KGE. Specifically, we find that the margin-based loss frequently used in KGE models seeks to maximize the gap between the true-fact score f(p) and the false-fact score f(n), and cares only about the relative order of scores. Since its optimization objective is f(p) - f(n) = m, increasing f(p) is equivalent to decreasing f(n). This objective creates an ambiguous convergence status that impairs the separability of positive and negative facts in the embedding space. Inspired by the circle loss, which offers a more flexible optimization manner with definite convergence targets and is widely used in computer vision tasks, we extend it to the KGE setting with the presented Batch Circle Loss (BCL). BCL allows multiple positives to be considered per anchor (h, r) (or (r, t)) in addition to multiple negatives (as opposed to the single positive sample used before in KGE models). Compared with other approaches, KGE models trained with our proposed loss function and training method show superior performance.
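The contrast between the two objectives can be sketched as follows. This is a minimal pure-Python illustration, not the paper's implementation: it assumes similarity-style scores in roughly [0, 1], uses the standard circle-loss relaxation margin m and scale gamma, and the function names and hyperparameter values are illustrative.

```python
import math

def margin_ranking_loss(f_p, f_n, margin=1.0):
    # Pairwise margin-based loss: only the gap f_p - f_n matters,
    # so (0.9, -0.1) and (0.4, -0.6) reach the same loss value --
    # the ambiguous convergence status described in the abstract.
    return max(0.0, margin - (f_p - f_n))

def batch_circle_loss(pos_scores, neg_scores, m=0.25, gamma=32.0):
    # Circle-loss-style objective for one anchor, accepting multiple
    # positive and multiple negative scores per anchor (the "batch"
    # aspect). Positives are pushed toward the definite target O_p
    # and negatives toward O_n, instead of only ordering them.
    O_p, O_n = 1.0 + m, -m          # convergence targets
    delta_p, delta_n = 1.0 - m, m   # decision margins
    # Self-paced weights: scores far from their target get larger
    # weights, hence larger gradients.
    sum_p = sum(math.exp(-gamma * max(0.0, O_p - s) * (s - delta_p))
                for s in pos_scores)
    sum_n = sum(math.exp(gamma * max(0.0, s - O_n) * (s - delta_n))
                for s in neg_scores)
    return math.log1p(sum_n * sum_p)

# The margin loss cannot distinguish these two score pairs,
# while the circle-style loss shrinks as positives approach 1
# and negatives approach 0.
ambiguous = (margin_ranking_loss(0.9, -0.1) ==
             margin_ranking_loss(0.4, -0.6))
well_separated = batch_circle_loss([0.9, 0.8], [0.1, 0.2])
poorly_separated = batch_circle_loss([0.5, 0.5], [0.5, 0.5])
```

Here `well_separated` evaluates to a much smaller loss than `poorly_separated`, whereas both margin-loss pairs are indistinguishable, which is the separability argument the abstract makes.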
Pages: 8