Fusing structural information with knowledge enhanced text representation for knowledge graph completion

Cited: 0
Authors:
Kang Tang
Shasha Li
Jintao Tang
Dong Li
Pancheng Wang
Ting Wang
Institutions:
[1] National University of Defense Technology,College of Computer Science and Technology
Keywords:
Neural networks; Knowledge graphs; Knowledge graph completion;
DOI: not available
Abstract:
Although knowledge graphs store a large number of facts in the form of triples, they remain limited by incompleteness. Knowledge Graph Completion (KGC), defined as inferring missing entities or relations from observed facts, has therefore long been a fundamental problem for various knowledge-driven downstream applications. Prevailing KG embedding methods for KGC, such as TransE, rely solely on mining the structural information of existing facts and thus fail to generalize, as they are inapplicable to unseen entities. Recently, a line of research has employed pre-trained encoders to learn textual representations of triples, i.e., textual-encoding methods. While these methods generalize well to unseen entities, they still underperform the KG embedding methods above. In this paper, we devise a novel textual-encoding learning framework for KGC. To enrich textual prior knowledge for more informative prediction, it features three hierarchical maskings that exploit far contexts of the input text so that textual prior knowledge can be elicited. In addition, to resolve the predictive ambiguity caused by improper relational modeling, a relation-aware structure learning scheme is applied on top of the textual embeddings. Extensive experimental results on several popular datasets demonstrate the effectiveness of our approach, even compared with recent state-of-the-art methods for this task.
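To make the generalization issue concrete, the following is a minimal illustrative sketch (not the paper's method) of TransE-style scoring, where a triple (head, relation, tail) is scored by the distance ||h + r − t|| over learned embeddings; entity names and vectors here are hypothetical:

```python
import numpy as np

# Toy embedding tables; in practice these are learned from observed triples.
rng = np.random.default_rng(0)
dim = 8
entities = {name: rng.normal(size=dim) for name in ["Paris", "France", "Berlin"]}
relations = {"capital_of": rng.normal(size=dim)}

def transe_score(h: str, r: str, t: str) -> float:
    """L2 distance of (h + r) from t; a lower score means a more plausible triple."""
    return float(np.linalg.norm(entities[h] + relations[r] - entities[t]))

score = transe_score("Paris", "capital_of", "France")

# An unseen entity has no trained embedding at all, which is the
# generalization limitation the abstract attributes to such methods:
assert "Tokyo" not in entities
```

Textual-encoding methods sidestep this limitation by deriving an entity's representation from its text with a pre-trained encoder, so any entity with a description can be scored.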
Pages: 1316–1333 (17 pages)
Related papers (50 in total):
  • [21] A Model of Text-Enhanced Knowledge Graph Representation Learning with Collaborative Attention
    Wang, Yashen
    Zhang, Huanhuan
    Xie, Haiyong
    ASIAN CONFERENCE ON MACHINE LEARNING, VOL 101, 2019, 101 : 236 - 251
  • [22] A Model of Text-Enhanced Knowledge Graph Representation Learning With Mutual Attention
    Wang, Yashen
    Zhang, Huanhuan
    Shi, Ge
    Liu, Zhirun
    Zhou, Qiang
    IEEE ACCESS, 2020, 8 : 52895 - 52905
  • [23] Incorporating structural knowledge into language models for open knowledge graph completion
    Song, Xin
    Wang, Ye
    Zhou, Bin
    Wang, Haiyang
    Huang, Yanyi
    Gao, Liqun
    WORLD WIDE WEB-INTERNET AND WEB INFORMATION SYSTEMS, 2025, 28 (01)
  • [24] Graph attention network with dynamic representation of relations for knowledge graph completion
    Zhang, Xin
    Zhang, Chunxia
    Guo, Jingtao
    Peng, Cheng
    Niu, Zhendong
    Wu, Xindong
    EXPERT SYSTEMS WITH APPLICATIONS, 2023, 219
  • [25] Structure Enhanced Path Reasoning for Knowledge Graph Completion
    Wang, Yilin
    Huang, Zhen
    Hu, Minghao
    Li, Dongsheng
    Lu, Xicheng
    Luo, Wei
    Yang, Dong
    INTERNATIONAL JOURNAL OF INTELLIGENT SYSTEMS, 2023, 2023
  • [26] Fusing topology contexts and logical rules in language models for knowledge graph completion
    Lin, Qika
    Mao, Rui
    Liu, Jun
    Xu, Fangzhi
    Cambria, Erik
    INFORMATION FUSION, 2023, 90 : 253 - 264
  • [27] Enriched entity representation of knowledge graph for text generation
    Kaile Shi
    Xiaoyan Cai
    Libin Yang
    Jintao Zhao
    Complex & Intelligent Systems, 2023, 9 : 2019 - 2030
  • [29] Graph-based Text Representation and Knowledge Discovery
    Jin, Wei
    Srihari, Rohini K.
    APPLIED COMPUTING 2007, VOL 1 AND 2, 2007, : 807 - 811
  • [30] Representation Learning with Ordered Relation Paths for Knowledge Graph Completion
    Zhu, Yao
    Liu, Hongzhi
    Wu, Zhonghai
    Song, Yang
    Zhang, Tao
    2019 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING AND THE 9TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING (EMNLP-IJCNLP 2019): PROCEEDINGS OF THE CONFERENCE, 2019, : 2662 - 2671