GTAT: empowering graph neural networks with cross attention

Citations: 0
Authors
Shen, Jiahao [1 ]
Ain, Qura Tul [1 ]
Liu, Yaohua [1 ]
Liang, Banqing [1 ]
Qiang, Xiaoli [2 ]
Kou, Zheng [1 ]
Affiliations
[1] Guangzhou Univ, Inst Comp Sci & Technol, Guangzhou 510006, Peoples R China
[2] Guangzhou Univ, Sch Comp Sci & Cyber Engn, Guangzhou 510006, Peoples R China
Source
SCIENTIFIC REPORTS | 2025, Vol. 15, No. 1
Funding
National Natural Science Foundation of China
Keywords
Graph learning; Graph neural networks; Network topology; Cross attention mechanism;
DOI
10.1038/s41598-025-88993-3
Chinese Library Classification
O [Mathematical Sciences and Chemistry]; P [Astronomy and Earth Sciences]; Q [Biological Sciences]; N [General Natural Sciences]
Subject Classification Codes
07; 0710; 09
Abstract
Graph Neural Networks (GNNs) provide a powerful framework for representation learning on graph-structured data, capturing node information by recursively aggregating and transforming the representations of neighboring nodes. Graph topology plays an important role in learning graph representations and strongly influences the performance of GNNs. However, current methods fail to adequately integrate topological information into graph representation learning. To better leverage topological information and enhance representation capability, we propose Graph Topology Attention Networks (GTAT). Specifically, GTAT first extracts topology features from the graph's structure and encodes them into topology representations. The node and topology representations are then fed into cross attention GNN layers, where the two interact. This integration allows the model to dynamically adjust the influence of node features and topological information, improving the expressiveness of node representations. Experimental results on various graph benchmark datasets demonstrate that GTAT outperforms recent state-of-the-art methods. Further analysis reveals GTAT's ability to mitigate the over-smoothing issue and its improved robustness to noisy data.
Pages: 13
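
The abstract only outlines the architecture, so the sketch below illustrates one plausible reading of it in PyTorch: a dual-stream layer in which node representations and topology representations are each aggregated over the graph and then exchange information through cross attention. The degree-based topology encoding, the mean aggregation, and the names mean_aggregate and CrossAttentionGNNLayer are illustrative assumptions, not the authors' implementation; the paper's actual topology features and attention design may differ.

import torch
import torch.nn as nn


def mean_aggregate(x, edge_index):
    # Average the representations of each node's in-neighbours.
    # edge_index has shape (2, E); messages flow source -> target.
    src, dst = edge_index
    out = torch.zeros_like(x)
    out.index_add_(0, dst, x[src])
    deg = torch.zeros(x.size(0), device=x.device)
    deg.index_add_(0, dst, torch.ones(src.size(0), device=x.device))
    return out / deg.clamp(min=1).unsqueeze(-1)


class CrossAttentionGNNLayer(nn.Module):
    # Hypothetical interaction layer: both streams aggregate over the graph,
    # then each stream attends over the pair (its own state, the other
    # stream's state) to decide how much of the other to mix in.
    def __init__(self, dim, heads=4):
        super().__init__()
        self.node_proj = nn.Linear(dim, dim)
        self.topo_proj = nn.Linear(dim, dim)
        self.node_attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.topo_attn = nn.MultiheadAttention(dim, heads, batch_first=True)

    def forward(self, h, t, edge_index):
        h = torch.relu(self.node_proj(mean_aggregate(h, edge_index)))
        t = torch.relu(self.topo_proj(mean_aggregate(t, edge_index)))
        pair = torch.stack([h, t], dim=1)                  # (N, 2, dim)
        h_new, _ = self.node_attn(h.unsqueeze(1), pair, pair)
        t_new, _ = self.topo_attn(t.unsqueeze(1), pair, pair)
        return h_new.squeeze(1), t_new.squeeze(1)


if __name__ == "__main__":
    num_nodes, dim = 6, 64
    x = torch.randn(num_nodes, dim)                        # node features
    edge_index = torch.tensor([[0, 1, 2, 3, 4, 5],
                               [1, 2, 3, 4, 5, 0]])        # a directed cycle
    # Toy topology encoding: embed each node's in-degree.
    deg = torch.zeros(num_nodes).index_add_(
        0, edge_index[1], torch.ones(edge_index.size(1)))
    topo = nn.Linear(1, dim)(deg.unsqueeze(-1))
    layer = CrossAttentionGNNLayer(dim)
    h_out, t_out = layer(x, topo, edge_index)
    print(h_out.shape, t_out.shape)                        # (6, 64) each

Because each stream attends over only the two-element pair (its own state, the other stream's state), the attention weights act as a learned per-node gate between feature information and topological information, which is one way to realize the abstract's claim of dynamically adjusting their influence.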