Global-local graph attention: unifying global and local attention for node classification

Cited by: 0
Authors
Lin, Keao [1 ]
Xie, Xiaozhu [1 ]
Weng, Wei [1 ,2 ]
Du, Xiaofeng [1 ]
Affiliations
[1] Xiamen Univ Technol, Coll Comp & Informat Engn, Xiamen 361024, Peoples R China
[2] Fujian Key Lab Pattern Recognit & Image Understand, Xiamen 361024, Peoples R China
Source
THE COMPUTER JOURNAL
Keywords
CONVOLUTIONAL NETWORK; NEURAL-NETWORK
DOI
10.1093/comjnl/bxae060
Chinese Library Classification (CLC)
TP3 [Computing technology, computer technology]
Subject classification code
0812
Abstract
Graph Neural Networks (GNNs) are deep learning models designed for graph-structured data, capturing complex relationships and structures to improve analysis and prediction. A common task for GNNs is node classification, in which each node in the graph is assigned a predefined category. The Graph Attention Network (GAT) is a popular GNN variant known for capturing complex dependencies by assigning importance weights to nodes during information aggregation. However, the GAT's reliance on a local attention mechanism limits its ability to capture global information and long-range dependencies. To address this limitation, we propose a new attention mechanism called Global-Local Graph Attention (GLGA). GLGA enables the GAT to capture long-range dependencies and global graph structure while retaining its focus on local interactions. We evaluate the algorithm on three citation datasets (Cora, Citeseer, and Pubmed) using multiple metrics, showing that it outperforms baseline models. The proposed GLGA mechanism is an effective means of improving node classification.
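The abstract does not give the GLGA formulation, so the following is only a minimal, hypothetical sketch of the general idea it describes: a single layer that mixes GAT-style local attention over graph edges with dense scaled dot-product attention over all node pairs, blended by a learnable gate. All names here (GlobalLocalAttentionLayer, gate, and so on) are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (assumed, not the paper's GLGA): combine GAT-style local
# attention over edges with dense global attention over all node pairs.
import torch
import torch.nn as nn
import torch.nn.functional as F


class GlobalLocalAttentionLayer(nn.Module):
    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.proj = nn.Linear(in_dim, out_dim, bias=False)
        # GAT-style local attention parameters: a^T [W h_i || W h_j]
        self.attn_src = nn.Linear(out_dim, 1, bias=False)
        self.attn_dst = nn.Linear(out_dim, 1, bias=False)
        # Scaled dot-product parameters for the global (all-pairs) attention
        self.q = nn.Linear(out_dim, out_dim, bias=False)
        self.k = nn.Linear(out_dim, out_dim, bias=False)
        # Learnable scalar gate balancing local vs. global aggregation
        self.gate = nn.Parameter(torch.tensor(0.5))

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # x:   (N, in_dim) node features
        # adj: (N, N) binary adjacency matrix (1 where an edge exists)
        h = self.proj(x)                                   # (N, out_dim)

        # Local attention: restricted to graph neighbours (GAT-style)
        e = self.attn_src(h) + self.attn_dst(h).T          # (N, N) raw scores
        e = F.leaky_relu(e, negative_slope=0.2)
        e = e.masked_fill(adj == 0, float("-inf"))         # keep edges only
        local_out = torch.softmax(e, dim=-1) @ h

        # Global attention: every node attends to every other node
        scores = (self.q(h) @ self.k(h).T) / h.size(-1) ** 0.5
        global_out = torch.softmax(scores, dim=-1) @ h

        # Blend the two views of the graph with a sigmoid gate
        g = torch.sigmoid(self.gate)
        return F.elu(g * local_out + (1.0 - g) * global_out)


if __name__ == "__main__":
    # Toy usage: 4 nodes, 8 input features, a small ring graph with self-loops
    x = torch.randn(4, 8)
    adj = torch.tensor([[1, 1, 0, 1],
                        [1, 1, 1, 0],
                        [0, 1, 1, 1],
                        [1, 0, 1, 1]], dtype=torch.float32)
    layer = GlobalLocalAttentionLayer(in_dim=8, out_dim=16)
    print(layer(x, adj).shape)  # torch.Size([4, 16])
```

In this sketch the gate interpolates between the neighbourhood aggregation and the all-pairs aggregation: a value near 1 reduces the layer to ordinary GAT-style behaviour, while smaller values let distant nodes contribute directly, which is one simple way to realise the global-plus-local trade-off the abstract describes.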
Pages: 2959-2969
Number of pages: 11
Related papers
50 records in total
  • [31] All the attention you need: Global-local, spatial-channel attention for image retrieval
    Song, Chull Hwan
    Han, Hye Joo
    Avrithis, Yannis
    2022 IEEE WINTER CONFERENCE ON APPLICATIONS OF COMPUTER VISION (WACV 2022), 2022, : 439 - 448
  • [32] DEFICIT IN SHIFTS OF ATTENTION TO DIFFERENT LEVELS OF GLOBAL-LOCAL STIMULI IN SCHIZOPHRENIA
    Matsui, Mie
    Takeuchi, A.
    Katagiri, M.
    Suzuki, M.
    Murohashi, H.
    SCHIZOPHRENIA BULLETIN, 2011, 37 : 247 - 247
  • [33] Global-Local Self-Attention Based Transformer for Speaker Verification
    Xie, Fei
    Zhang, Dalong
    Liu, Chengming
    APPLIED SCIENCES-BASEL, 2022, 12 (19):
  • [34] Global Attention-Based Graph Neural Networks for Node Classification
    Chen, Jiusheng
    Fang, Chengyuan
    Zhang, Xiaoyu
    NEURAL PROCESSING LETTERS, 2023, 55 (04) : 4127 - 4150
  • [36] Global-local graph convolutional broad network for hyperspectral image classification
    Chu, Yonghe
    Cao, Jun
    Huang, Jiashuang
    Ju, Hengrong
    Liu, Guangen
    Cao, Heling
    Ding, Weiping
APPLIED SOFT COMPUTING, 2025, 170
  • [37] Global-Local Attention-Based Butterfly Vision Transformer for Visualization-Based Malware Classification
    Belal, Mohamad Mulham
    Sundaram, Divya Meena
    IEEE ACCESS, 2023, 11 : 69337 - 69355
  • [38] A novel global-local block spatial-spectral fusion attention model for hyperspectral image classification
    Zhang, Lihao
    Zeng, Yiliang
    Zhao, Jiahong
    Lan, Jinhui
    REMOTE SENSING LETTERS, 2022, 13 (04) : 343 - 351
  • [39] GLADS: A global-local attention data selection model for multimodal multitask encrypted traffic classification of IoT
    Dai, Jianbang
    Xu, Xiaolong
    Xiao, Fu
    COMPUTER NETWORKS, 2023, 225
  • [40] A global-local attention network for uncertainty analysis of ground penetrating radar modeling
    Zhao, Yunjie
    Cheng, Xi
    Zhang, Taihong
    Wang, Lei
    Shao, Wei
    Wiart, Joe
    RELIABILITY ENGINEERING & SYSTEM SAFETY, 2023, 234