Joint embedding in hierarchical distance and semantic representation learning for link prediction

Cited: 0
Authors
Liu, Jin [1 ]
Chen, Jianye [1 ]
Fan, Chongfeng [1 ]
Zhou, Fengyu [1 ]
Affiliation
[1] Shandong Univ, Sch Control Sci & Engn, Jinan 250011, Peoples R China
Funding
National Key Research and Development Program of China; National Natural Science Foundation of China;
Keywords
Hierarchy information; Distance measurement; Semantic measurement; Link prediction; Knowledge graph representation; KNOWLEDGE GRAPH;
DOI
10.1016/j.bdr.2024.100495
CLC Number
TP18 [Artificial Intelligence Theory];
Discipline Code
081104; 0812; 0835; 1405;
Abstract
The link prediction task aims to predict missing entities or relations in a knowledge graph and is essential for downstream applications. Existing well-known models deal with this task mainly by representing knowledge graph triplets in a distance space or a semantic space. However, they cannot fully capture the information of head and tail entities, nor do they make good use of hierarchical level information. Thus, in this paper, we propose a novel knowledge graph embedding model for the link prediction task, namely HIE, which models each triplet (h, r, t) in a distance measurement space and a semantic measurement space simultaneously. Moreover, HIE incorporates a hierarchy-aware space to leverage the rich hierarchical information of entities and relations for better representation learning. Specifically, we apply a distance transformation operation to the head entity in the distance space to obtain the tail entity, instead of translation-based or rotation-based approaches. Experimental results on four real-world datasets show that HIE outperforms several existing state-of-the-art knowledge graph embedding methods on the link prediction task and handles complex relations accurately.
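The abstract does not give the exact scoring function, but the minimal Python sketch below illustrates the general idea it describes: score a triplet (h, r, t) jointly in a distance space (a relation-specific transformation of the head entity compared against the tail) and a semantic space (a similarity between the transformed head and the tail). The transformation matrix W_r, the L1 norm, the cosine similarity, and the equal weighting of the two terms are illustrative assumptions, not the authors' formulation.

```python
# Minimal sketch (not the authors' code): scoring one triplet (h, r, t)
# jointly in a distance space and a semantic space, as the abstract
# describes at a high level. Dimensions, W_r, and the score combination
# are assumptions for illustration only.
import numpy as np

rng = np.random.default_rng(0)
dim = 50

# Hypothetical entity embeddings and a relation-specific transformation.
h, t = rng.normal(size=dim), rng.normal(size=dim)
W_r = rng.normal(size=(dim, dim)) / np.sqrt(dim)  # assumed relation transform

# Distance-space term: transform the head entity (rather than translating or
# rotating it) and measure how far the result is from the tail entity.
transformed_h = W_r @ h
distance_score = -np.linalg.norm(transformed_h - t, ord=1)

# Semantic-space term: cosine similarity between the transformed head and the tail.
semantic_score = float(transformed_h @ t) / (
    np.linalg.norm(transformed_h) * np.linalg.norm(t) + 1e-12
)

# Combined plausibility score (higher = more plausible); equal weighting is assumed.
score = distance_score + semantic_score
print(f"triplet score: {score:.4f}")
```

In practice, such a model would learn the embeddings and W_r by ranking true triplets above corrupted ones; the sketch only shows how the two measurement spaces could contribute to a single score.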
Pages: 11
Related Papers
50 records in total
  • [31] Joint Distance and Representation Learning for Sign Language Videos
    Kose, Oyku Deniz
    Saraclar, Murat
    2020 28TH SIGNAL PROCESSING AND COMMUNICATIONS APPLICATIONS CONFERENCE (SIU), 2020,
  • [32] HSNR: A Network Representation Learning Algorithm Using Hierarchical Structure Embedding
    Ye, Zhonglin
    Zhao, Haixing
    Zhu, Yu
    Xiao, Yuzhi
    Chinese Journal of Electronics, 2020, 29 (06) : 1141 - 1152
  • [33] Hierarchical Hyperedge Embedding-Based Representation Learning for Group Recommendation
    Guo, Lei
    Yin, Hongzhi
    Chen, Tong
    Zhang, Xiangliang
    Zheng, Kai
    ACM TRANSACTIONS ON INFORMATION SYSTEMS, 2022, 40 (01)
  • [34] Enhancing link prediction through node embedding and ensemble learning
    Chen, Zhongyuan
    Wang, Yongji
    KNOWLEDGE AND INFORMATION SYSTEMS, 2024, 66 (12) : 7697 - 7715
  • [35] Hierarchical-aware relation rotational knowledge graph embedding for link prediction
    Wang, Shensi
    Fu, Kun
    Sun, Xian
    Zhang, Zequn
    Li, Shuchao
    Jin, Li
    NEUROCOMPUTING, 2021, 458 : 259 - 270
  • [36] Attributed Network Representation Learning Approaches for Link Prediction
    Masrour, Farzan
    Tan, Pang-Ning
    Esfahanian, Abdol-Hossein
    VanDam, Courtland
    2018 IEEE/ACM INTERNATIONAL CONFERENCE ON ADVANCES IN SOCIAL NETWORKS ANALYSIS AND MINING (ASONAM), 2018, : 560 - 563
  • [37] MEKER: Memory Efficient Knowledge Embedding Representation for Link Prediction and Question Answering
    Chekalina, Viktoriia
    Razzhigaev, Anton
    Sayapin, Albert
    Frolov, Evgeny
    Panchenko, Alexander
    PROCEEDINGS OF THE 60TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2022): STUDENT RESEARCH WORKSHOP, 2022, : 355 - 365
  • [38] Joint Word Representation Learning Using a Corpus and a Semantic Lexicon
    Bollegala, Danushka
    Mohammed, Alsuhaibani
    Maehara, Takanori
    Kawarabayashi, Ken-ichi
    THIRTIETH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2016, : 2690 - 2696
  • [39] Deep Representation of Hierarchical Semantic Attributes for Zero-shot Learning
    Zhang, Zhaocheng
    Yang, Gang
    2020 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2020,
  • [40] High-Order Joint Embedding for Multi-Level Link Prediction
    Yuan, Yubai
    Qu, Annie
    JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION, 2023, 118 (543) : 1692 - 1706