Sparse graphs-based dynamic attention networks

Cited: 0
Authors
Chen, Runze [1 ]
Lin, Kaibiao [1 ]
Hong, Binsheng [1 ]
Zhang, Shandan [1 ]
Yang, Fan [2 ]
Affiliations
[1] Xiamen Univ Technol, Dept Comp Sci & Technol, Xiamen 361024, Peoples R China
[2] Xiamen Univ, Dept Automat, Xiamen 361005, Peoples R China
Keywords
Graph neural networks; Graph attention networks; Sparse graphs; Dynamic attention;
DOI
10.1016/j.heliyon.2024.e35938
CLC classification
O [Mathematical Sciences and Chemistry]; P [Astronomy and Earth Sciences]; Q [Biological Sciences]; N [General Natural Sciences];
Subject classification codes
07 ; 0710 ; 09 ;
Abstract
Previous research commonly assumed that Graph Neural Networks (GNNs) precisely depict the interconnections among nodes in the graph structure. However, real-world graph datasets are often rife with noise, which can propagate through the network and ultimately degrade the outcome of downstream tasks. To cope with the complex fabric of real-world graphs and their many potential disturbances, we introduce Sparse Graph Dynamic Attention Networks (SDGAT). SDGAT employs L0 regularization to achieve a sparse representation of the graph structure, which eliminates noise and yields a more concise sparse graph. On this foundation, the model integrates a dynamic attention mechanism that allows it to selectively focus on key nodes and edges, filter out irrelevant data, and aggregate features effectively from important neighbors. To evaluate SDGAT, we conducted experiments on three citation datasets and compared its performance against commonly used models. The results indicate that SDGAT excels at node classification, notably on the Cora dataset, where it reaches an accuracy of 85.29%, roughly a 3% improvement over the majority of baseline models. These findings show that SDGAT performs well on all three citation datasets, underscoring the efficacy of a dynamic attention network built upon a sparse graph.
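The abstract's two ingredients can be sketched in a few lines of PyTorch. The snippet below is a minimal illustration, not the authors' implementation: it pairs a hard-concrete edge gate (the standard differentiable surrogate for L0 regularization, after Louizos et al., 2018) with a GATv2-style dynamic attention layer (Brody et al., 2022). All class names, hyperparameters, and the toy graph are illustrative assumptions.

```python
# Minimal sketch (not the authors' code) of the abstract's two ingredients:
# (1) an L0-style sparsity gate per edge via the hard-concrete distribution,
# (2) a GATv2-style "dynamic" attention layer that scores neighbours per
# query node. Class names, hyperparameters, and the toy graph are illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F

class HardConcreteEdgeGate(nn.Module):
    """One gate z_e in [0, 1] per edge; expected_l0() is the differentiable
    surrogate for the L0 penalty that drives noisy edges to exactly zero."""
    def __init__(self, num_edges, beta=0.66, gamma=-0.1, zeta=1.1):
        super().__init__()
        self.log_alpha = nn.Parameter(torch.zeros(num_edges))
        self.beta, self.gamma, self.zeta = beta, gamma, zeta

    def forward(self):
        if self.training:  # reparameterized hard-concrete sample
            u = torch.rand_like(self.log_alpha).clamp(1e-6, 1 - 1e-6)
            s = torch.sigmoid((u.log() - (1 - u).log() + self.log_alpha) / self.beta)
        else:              # deterministic gate at test time
            s = torch.sigmoid(self.log_alpha)
        return (s * (self.zeta - self.gamma) + self.gamma).clamp(0.0, 1.0)

    def expected_l0(self):
        # P(z_e != 0), summed over edges; add lam * expected_l0() to the loss.
        shift = self.beta * torch.log(torch.tensor(-self.gamma / self.zeta))
        return torch.sigmoid(self.log_alpha - shift).sum()

class DynamicAttentionLayer(nn.Module):
    """GATv2-style scoring a^T LeakyReLU(W [h_i || h_j]): the attention
    ranking can differ per query node, unlike the static original GAT."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.W = nn.Linear(2 * in_dim, out_dim)
        self.a = nn.Linear(out_dim, 1, bias=False)
        self.proj = nn.Linear(in_dim, out_dim)

    def forward(self, x, edge_index, edge_gate):
        src, dst = edge_index  # each edge sends a message src -> dst
        score = self.a(F.leaky_relu(self.W(torch.cat([x[dst], x[src]], -1)))).squeeze(-1)
        alpha = torch.zeros_like(score)
        for i in torch.unique(dst):  # per-destination softmax (kept naive for clarity)
            m = dst == i
            alpha[m] = F.softmax(score[m], dim=0)
        msg = (alpha * edge_gate).unsqueeze(-1) * self.proj(x)[src]  # gated messages
        out = x.new_zeros(x.size(0), self.proj.out_features)
        return out.index_add_(0, dst, msg)  # pruned edges (gate == 0) contribute nothing

# Toy usage: 4 nodes, 5 directed edges; the training objective would be
# task_loss + lam * gate.expected_l0().
x = torch.randn(4, 8)
edge_index = torch.tensor([[0, 1, 2, 3, 0], [1, 0, 3, 2, 2]])
gate = HardConcreteEdgeGate(num_edges=5)
layer = DynamicAttentionLayer(8, 16)
h = layer(x, edge_index, gate())
```

Keeping the gate outside the attention layer mirrors the abstract's two-stage design: the sparsified graph is produced first, and attention then operates only over the surviving edges.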
Pages: 13
Related papers
50 records in total
  • [21] Dynamic Sparse Attention for Scalable Transformer Acceleration
    Liu, Liu
    Qu, Zheng
    Chen, Zhaodong
    Tu, Fengbin
    Ding, Yufei
    Xie, Yuan
    IEEE TRANSACTIONS ON COMPUTERS, 2022, 71 (12) : 3165 - 3178
  • [22] Rescorla–Wagner Models with Sparse Dynamic Attention
    Nishimura, Joel
    Cochran, Amy L.
    BULLETIN OF MATHEMATICAL BIOLOGY, 2020, 82
  • [23] Dynamic choosability of triangle-free graphs and sparse random graphs
    Kim, Jaehoon
    Ok, Seongmin
    JOURNAL OF GRAPH THEORY, 2018, 87 (03) : 347 - 355
  • [24] Hyperspectral face recognition based on sparse spectral attention deep neural networks
    Xie, Zhihua
    Li, Yi
    Niu, Jieyi
    Shi, Ling
    Wang, Zhipeng
    Lu, Guoyu
    OPTICS EXPRESS, 2020, 28 (24) : 36286 - 36303
  • [25] Sparse co-attention visual question answering networks based on thresholds
    Guo, Zihan
    Han, Dezhi
    APPLIED INTELLIGENCE, 2023, 53 (01) : 586 - 600
  • [27] Sparse-Dyn: Sparse dynamic graph multirepresentation learning via event-based sparse temporal attention network
    Pang, Yan
    Shan, Ai
    Wang, Zhen
    Wang, Mengyu
    Li, Jianwei
    Zhang, Ji
    Huang, Teng
    Liu, Chao
    INTERNATIONAL JOURNAL OF INTELLIGENT SYSTEMS, 2022, 37 (11) : 8770 - 8789
  • [28] Fast Attention Over Long Sequences With Dynamic Sparse Flash Attention
    Pagliardini, Matteo
    Paliotta, Daniele
    Jaggi, Martin
    Fleuret, Francois
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023,
  • [29] Dynamic graphs attention for ocean variable forecasting
    Wang, Junhao
    Sun, Zhengya
    Yuan, Chunxin
    Li, Wenhui
    Liu, An-An
    Wei, Zhiqiang
    Yin, Bo
    ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE, 2024, 133
  • [30] Graph signal processing on dynamic graphs based on temporal-attention product 
    Geng, Ru
    Gao, Yixian
    Zhang, Hong-Kun
    Zu, Jian
    APPLIED AND COMPUTATIONAL HARMONIC ANALYSIS, 2023, 67