GTAT: empowering graph neural networks with cross attention

Cited: 0
Authors
Shen, Jiahao [1 ]
Ain, Qura Tul [1 ]
Liu, Yaohua [1 ]
Liang, Banqing [1 ]
Qiang, Xiaoli [2 ]
Kou, Zheng [1 ]
Affiliations
[1] Guangzhou Univ, Inst Comp Sci & Technol, Guangzhou 510006, Peoples R China
[2] Guangzhou Univ, Sch Comp Sci & Cyber Engn, Guangzhou 510006, Peoples R China
Source
SCIENTIFIC REPORTS | 2025, Vol. 15, No. 1
Funding
National Natural Science Foundation of China;
Keywords
Graph learning; Graph neural networks; Network topology; Cross attention mechanism;
DOI
10.1038/s41598-025-88993-3
Chinese Library Classification
O [Mathematical Sciences and Chemistry]; P [Astronomy and Earth Sciences]; Q [Biological Sciences]; N [General Natural Sciences];
Discipline Classification Codes
07; 0710; 09;
Abstract
Graph Neural Networks (GNNs) serve as a powerful framework for representation learning on graph-structured data, capturing node information by recursively aggregating and transforming the representations of neighboring nodes. The topology of a graph plays an important role in learning graph representations and affects the performance of GNNs. However, current methods fail to adequately integrate topological information into graph representation learning. To better leverage topological information and enhance representation capabilities, we propose Graph Topology Attention Networks (GTAT). Specifically, GTAT first extracts topology features from the graph's structure and encodes them into topology representations. The node and topology representations are then fed into cross-attention GNN layers, where they interact. This integration allows the model to dynamically adjust the influence of node features and topological information, thus improving the expressiveness of node representations. Experimental results on various graph benchmark datasets demonstrate that GTAT outperforms recent state-of-the-art methods. Further analysis reveals GTAT's capability to mitigate the over-smoothing issue and its increased robustness against noisy data.
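The core interaction the abstract describes — node representations and topology representations attending to each other — can be illustrated with a minimal NumPy sketch. This is not the paper's implementation: the function `cross_attention`, the residual update, and the toy dimensions are illustrative assumptions about how a scaled dot-product cross-attention step between two such streams could look.

```python
import numpy as np

def cross_attention(queries, keys, values):
    """Scaled dot-product attention: one stream's queries attend to another stream."""
    d = queries.shape[-1]
    scores = queries @ keys.T / np.sqrt(d)            # (n_q, n_k) similarity matrix
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)    # row-wise softmax
    return weights @ values                           # weighted mix of the other stream

# Toy example: 4 nodes, 6-dimensional representations (dimensions are illustrative).
rng = np.random.default_rng(0)
h_node = rng.standard_normal((4, 6))  # node feature representations
h_topo = rng.standard_normal((4, 6))  # encoded topology features (e.g. degree or motif statistics)

# Each stream queries the other; residual connections let each stream
# retain its own information while absorbing the other's.
h_node_new = h_node + cross_attention(h_node, h_topo, h_topo)
h_topo_new = h_topo + cross_attention(h_topo, h_node, h_node)
```

Because the softmax weights depend on both streams, the mixing coefficients are recomputed per node, which matches the abstract's claim that the model "dynamically adjusts" the influence of node features versus topological information.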
Pages: 13