HAHE: Hierarchical Attention for Hyper-Relational Knowledge Graphs in Global and Local Level

Times Cited: 0
Authors
Luo, Haoran [1 ]
E, Haihong [1 ]
Yang, Yuhao [2 ]
Guo, Yikai [3 ]
Sun, Mingzhi [1 ]
Yao, Tianyu [1 ]
Tang, Zichen [1 ]
Wan, Kaiyang [1 ]
Song, Meina [1 ]
Lin, Wei [4 ]
Affiliations
[1] Beijing Univ Posts & Telecommun, Sch Comp Sci, Beijing, Peoples R China
[2] Beihang Univ, Sch Automat Sci & Elect Engn, Beijing, Peoples R China
[3] Beijing Inst Comp Technol & Applicat, Beijing, Peoples R China
[4] Inspur Grp Co Ltd, Jinan, Peoples R China
Funding
Beijing Natural Science Foundation; National Science Foundation (USA);
Keywords
DOI
Not available
CLC Number
TP18 [Theory of Artificial Intelligence];
Discipline Classification Code
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Link Prediction on Hyper-relational Knowledge Graphs (HKG) is a worthwhile endeavor. An HKG consists of hyper-relational facts (H-Facts), each composed of a main triple and several auxiliary attribute-value qualifiers, which can effectively represent factually comprehensive information. The internal structure of an HKG can be represented globally as a hypergraph and locally as semantic sequences. However, existing research seldom models the graphical and sequential structure of HKGs simultaneously, which limits their representation. To overcome this limitation, we propose a novel Hierarchical Attention model for HKG Embedding (HAHE), including global-level and local-level attention. The global-level attention models the graphical structure of the HKG using hypergraph dual-attention layers, while the local-level attention learns the sequential structure inside H-Facts via heterogeneous self-attention layers. Experimental results indicate that HAHE achieves state-of-the-art performance on the link prediction task on standard HKG datasets. In addition, HAHE addresses the issue of HKG multi-position prediction for the first time, increasing the applicability of the HKG link prediction task. Our code is publicly available.
Pages: 8095-8107
Page count: 13
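
The abstract above describes a two-level attention design: global hypergraph dual-attention over the whole HKG and local heterogeneous self-attention over the token sequence of each H-Fact. As a rough illustration of the local level only, the following is a minimal sketch, not the authors' implementation; the class name HFactLocalEncoder, the role vocabulary, the readout position, and all dimensions are assumptions introduced here.

import torch
import torch.nn as nn

class HFactLocalEncoder(nn.Module):
    """Self-attention over one H-Fact's token sequence (main triple + qualifier pairs)."""
    def __init__(self, num_entities, num_relations, dim=64, heads=4, layers=2):
        super().__init__()
        self.ent_emb = nn.Embedding(num_entities, dim)
        self.rel_emb = nn.Embedding(num_relations, dim)
        # Role embedding marks each token's position type (head, relation, tail,
        # qualifier relation, qualifier value), a simple stand-in for treating
        # the sequence heterogeneously rather than as undifferentiated tokens.
        self.role_emb = nn.Embedding(5, dim)
        layer = nn.TransformerEncoderLayer(d_model=dim, nhead=heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=layers)
        self.score = nn.Linear(dim, num_entities)  # score all entities for a masked slot

    def forward(self, ent_ids, rel_ids, roles, is_relation):
        # ent_ids, rel_ids, roles, is_relation: (batch, seq_len) tensors; a token is
        # looked up in the relation table where is_relation is True, else in the
        # entity table, then tagged with its role before self-attention.
        tok = torch.where(is_relation.unsqueeze(-1),
                          self.rel_emb(rel_ids), self.ent_emb(ent_ids))
        h = self.encoder(tok + self.role_emb(roles))
        return self.score(h[:, 0])  # read out the first (masked-head) position

# Toy usage: (head?, r, t) with one qualifier pair (qr, qv); predict the masked head.
model = HFactLocalEncoder(num_entities=100, num_relations=20)
ent_ids = torch.tensor([[0, 0, 7, 0, 9]])                  # entity ids (0 at relation/mask slots)
rel_ids = torch.tensor([[0, 3, 0, 5, 0]])                  # relation ids at relation slots
roles = torch.tensor([[0, 1, 2, 3, 4]])                    # head, rel, tail, qual-rel, qual-val
is_relation = torch.tensor([[False, True, False, True, False]])
print(model(ent_ids, rel_ids, roles, is_relation).shape)   # torch.Size([1, 100])

The same masking-and-scoring pattern extends to any position in the sequence, which is how a sequence-based local encoder can support the multi-position prediction setting mentioned in the abstract.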