SAEA: Self-Attentive Heterogeneous Sequence Learning Model for Entity Alignment

Cited by: 6
Authors
Chen, Jia [1 ]
Gu, Binbin [2 ]
Li, Zhixu [1 ,3 ]
Zhao, Pengpeng [1 ]
Liu, An [1 ]
Zhao, Lei [1 ]
Affiliations
[1] Soochow Univ, Sch Comp Sci & Technol, Suzhou, Peoples R China
[2] Univ Calif Santa Cruz, Santa Cruz, CA USA
[3] IFLYTEK Res, Suzhou, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Knowledge graph; Entity alignment; Degree-aware random walk
DOI
10.1007/978-3-030-59410-7_31
CLC Classification
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
We consider the problem of entity alignment in knowledge graphs. Previous work has mainly focused on two aspects. The first is to improve TransE-based models, which mostly consider only triple-level structural information (i.e., relation triples), or to use graph convolutional networks under the assumption that equivalent entities are usually neighbored by other equivalent entities. The second is to incorporate external features, such as attribute types, attribute values, entity names, and descriptions, to enhance the original relational model. However, long-term structural dependencies between entities have not been exploited well enough, and external resources are sometimes incomplete or unavailable. This impairs the accuracy and robustness of combinational models that use relations together with other types of information, especially when iteration is performed. To better exploit the structural information between entities, we propose a novel Self-Attentive heterogeneous sequence learning model for Entity Alignment (SAEA) that captures long-term structural dependencies among entities. Furthermore, since low-degree entities and relations appear much less often in sequences produced by traditional random walk methods, we design a degree-aware random walk to generate heterogeneous sequential data for self-attentive learning. To evaluate the proposed model, we conduct extensive experiments on real-world datasets. The experimental results show that our method outperforms various state-of-the-art entity alignment models that use relation triples only.
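The record does not spell out the paper's exact sampling rule, but the idea of a degree-aware random walk over heterogeneous (entity/relation) sequences can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: it assumes inverse-degree weighting of candidate neighbors, so that low-degree entities are sampled more often than under a uniform walk, and it emits alternating entity and relation tokens as the heterogeneous sequence. All function and variable names here are hypothetical.

```python
import random
from collections import defaultdict

def degree_aware_walk(triples, start, num_steps, rng=None):
    """One heterogeneous walk (entity, relation, entity, ...) that favors
    low-degree neighbors via inverse-degree sampling weights.

    triples: iterable of (head, relation, tail); the graph is treated
    as undirected for walking purposes.
    """
    rng = rng or random.Random(0)
    adj = defaultdict(list)    # entity -> [(relation, neighbor), ...]
    degree = defaultdict(int)  # entity -> number of incident triples
    for h, r, t in triples:
        adj[h].append((r, t))
        adj[t].append((r, h))
        degree[h] += 1
        degree[t] += 1

    walk = [start]
    cur = start
    for _ in range(num_steps):
        if not adj[cur]:
            break  # dead end: no incident triples
        rels, nbrs = zip(*adj[cur])
        # Inverse-degree weights: a low-degree neighbor gets a higher
        # chance of being visited, so rare entities appear in more walks.
        weights = [1.0 / degree[n] for n in nbrs]
        idx = rng.choices(range(len(nbrs)), weights=weights, k=1)[0]
        walk.extend([rels[idx], nbrs[idx]])  # relation token, then entity
        cur = nbrs[idx]
    return walk
```

The resulting token sequences (entities interleaved with relations) are what a self-attentive sequence encoder would then consume; a uniform walk would instead use equal weights, under-sampling low-degree entities.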
Pages: 452-467
Page count: 16
Related Papers
(showing 10 of 50)
  • [1] Self-Attentive Pooling for Efficient Deep Learning
    Chen, Fang
    Datta, Gourav
    Kundu, Souvik
    Beerel, Peter A.
    2023 IEEE/CVF WINTER CONFERENCE ON APPLICATIONS OF COMPUTER VISION (WACV), 2023, : 3963 - 3972
  • [2] Sequential Self-Attentive Model for Knowledge Tracing
    Zhang, Xuelong
    Zhang, Juntao
    Lin, Nanzhou
    Yang, Xiandi
    ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING - ICANN 2021, PT I, 2021, 12891 : 318 - 330
  • [3] Self-attentive Rationalization for Interpretable Graph Contrastive Learning
    Li, Sihang
    Luo, Yanchen
    Zhang, An
    Wang, Xiang
    Li, Longfei
    Zhou, Jun
    Chua, Tat-seng
    ACM TRANSACTIONS ON KNOWLEDGE DISCOVERY FROM DATA, 2025, 19 (02)
  • [4] A self-attentive model for tracing knowledge and engagement in parallel
    Jiang, Hua
    Xiao, Bing
    Luo, Yintao
    Ma, Junliang
    PATTERN RECOGNITION LETTERS, 2023, 165 : 25 - 32
  • [5] Improving Disfluency Detection by Self-Training a Self-Attentive Model
    Lou, Paria Jamshid
    Johnson, Mark
    58TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2020), 2020, : 3754 - 3763
  • [6] Self-Attentive Contrastive Learning for Conditioned Periocular and Face Biometrics
    Ng, Tiong-Sik
    Chai, Jacky Chen Long
    Low, Cheng-Yaw
    Teoh, Andrew Beng Jin
    IEEE TRANSACTIONS ON INFORMATION FORENSICS AND SECURITY, 2024, 19 : 3251 - 3264
  • [7] Self-Attentive Attributed Network Embedding Through Adversarial Learning
    Yu, Wenchao
    Cheng, Wei
    Aggarwal, Charu
    Zong, Bo
    Chen, Haifeng
    Wang, Wei
    2019 19TH IEEE INTERNATIONAL CONFERENCE ON DATA MINING (ICDM 2019), 2019, : 758 - 767
  • [8] A Self-Attentive Model with Gate Mechanism for Spoken Language Understanding
    Li, Changliang
    Li, Liang
    Qi, Ji
    2018 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP 2018), 2018, : 3824 - 3833
  • [9] Accurate disaster entity recognition based on contextual embeddings in self-attentive BiLSTM-CRF
    Hafsa, Noor E.
    Alzoubi, Hadeel Mohammed
    Almutlq, Atikah Saeed
    PLOS ONE, 2025, 20 (03)