Simple and deep graph attention networks

Cited: 1
Authors
Su, Guangxin [1]
Wang, Hanchen [2]
Zhang, Ying [3]
Zhang, Wenjie [1]
Lin, Xuemin [4]
Affiliations
[1] Univ New South Wales, Sydney, NSW 2052, Australia
[2] Univ Technol Sydney, Ultimo, NSW 2007, Australia
[3] Zhejiang Gongshang Univ, Hangzhou 314423, Zhejiang, Peoples R China
[4] Shanghai Jiao Tong Univ, Shanghai 200240, Peoples R China
Keywords
Deep graph neural networks; Oversmoothing; Graph attention networks; Attention mechanism;
DOI
10.1016/j.knosys.2024.111649
Chinese Library Classification
TP18 [Theory of Artificial Intelligence];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Graph Attention Networks (GATs) and Graph Convolutional Neural Networks (GCNs) are two state-of-the-art architectures in Graph Neural Networks (GNNs). It is well known that both models suffer from performance degradation when more GNN layers are stacked, and many works have been devoted to addressing this problem. We notice that the main research efforts in this line focus on GCN models, and their techniques do not fit GAT models well due to the inherent differences between the two architectures. In GAT, the attention mechanism is limited in that it ignores the overwhelming propagation from certain nodes as the number of layers increases. To fully exploit the expressive power of GAT, we propose a new version of GAT named Layer-wise Self-adaptive GAT (LSGAT), which can effectively alleviate the oversmoothing issue in deep GATs and is strictly more expressive than GAT. We redesign the attention coefficient computation mechanism so that it is adaptively adjusted by layer depth, considering both immediate neighbors and non-adjacent nodes from a global view. The experimental evaluation confirms that LSGAT consistently achieves better results on node classification tasks than relevant counterparts.
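The abstract's key idea, attention coefficients that are rescaled as a function of layer depth so that no single neighbor dominates propagation in deep layers, can be illustrated with a minimal sketch. This is a hypothetical formulation for intuition only, not the paper's exact LSGAT rule: the `beta` hyperparameter and the temperature schedule below are assumptions.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D array of logits."""
    e = np.exp(x - x.max())
    return e / e.sum()

def adaptive_attention(scores, layer, beta=0.5):
    """Layer-wise rescaling of one node's attention logits (illustrative).

    scores : raw attention logits over the node's neighborhood
    layer  : current layer depth, 1-indexed
    beta   : hypothetical knob; larger values flatten the attention
             distribution faster as depth grows, curbing the
             "overwhelming propagation" from any single node.
    """
    temperature = 1.0 + beta * (layer - 1)  # deeper layer -> softer attention
    return softmax(scores / temperature)

# One node with three neighbors; the first has a dominant raw score.
scores = np.array([4.0, 1.0, 0.5])
a_shallow = adaptive_attention(scores, layer=1)  # sharp, near one-hot
a_deep = adaptive_attention(scores, layer=8)     # flatter distribution
```

In a shallow layer the dominant neighbor keeps most of the attention mass; in a deep layer the distribution is flattened, which is one simple way to mimic the depth-aware adjustment the abstract describes.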
Pages: 9
Related papers
50 records in total
  • [21] PERSONALIZED PAGERANK GRAPH ATTENTION NETWORKS
    Choi, Julie
    2022 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2022, : 3578 - 3582
  • [22] GRAPH ATTENTION NETWORKS FOR SPEAKER VERIFICATION
    Jung, Jee-weon
    Heo, Hee-Soo
    Yu, Ha-Jin
    Chung, Joon Son
    2021 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP 2021), 2021, : 6149 - 6153
  • [23] GAEN: Graph Attention Evolving Networks
    Shi, Min
    Huang, Yu
    Zhu, Xingquan
    Tang, Yufei
    Zhuang, Yuan
    Liu, Jianxun
    PROCEEDINGS OF THE THIRTIETH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, IJCAI 2021, 2021, : 1541 - 1547
  • [24] Graph Attention Networks with Positional Embeddings
    Ma, Liheng
    Rabbany, Reihaneh
    Romero-Soriano, Adriana
    ADVANCES IN KNOWLEDGE DISCOVERY AND DATA MINING, PAKDD 2021, PT I, 2021, 12712 : 514 - 527
  • [25] Deep linear graph attention model for attributed graph clustering
    Liao, Huifa
    Hu, Jie
    Li, Tianrui
    Du, Shengdong
    Peng, Bo
    KNOWLEDGE-BASED SYSTEMS, 2022, 246
  • [26] Knowledge Graph Embedding via Graph Attenuated Attention Networks
    Wang, Rui
    Li, Bicheng
    Hu, Shengwei
    Du, Wenqian
    Zhang, Min
    IEEE ACCESS, 2020, 8 : 5212 - 5224
  • [27] Deep Clustering by Graph Attention Contrastive Learning
    Liu, Ming
    Liu, Cong
    Fu, Xiaoyuan
    Wang, Jing
    Li, Jiankun
    Qi, Qi
    Liao, Jianxin
    ELECTRONICS, 2023, 12 (11)
  • [28] Android malware detection method based on graph attention networks and deep fusion of multimodal features
    Chen, Shaojie
    Lang, Bo
    Liu, Hongyu
    Chen, Yikai
    Song, Yucai
    EXPERT SYSTEMS WITH APPLICATIONS, 2024, 237
  • [29] Dual Graph Attention Networks for Deep Latent Representation of Multifaceted Social Effects in Recommender Systems
    Wu, Qitian
    Zhang, Hengrui
    Gao, Xiaofeng
    He, Peng
    Weng, Paul
    Gao, Han
    Chen, Guihai
    WEB CONFERENCE 2019: PROCEEDINGS OF THE WORLD WIDE WEB CONFERENCE (WWW 2019), 2019, : 2091 - 2102
  • [30] Dynamic Job-Shop Scheduling via Graph Attention Networks and Deep Reinforcement Learning
    Liu, Chien-Liang
    Tseng, Chun-Jan
    Weng, Po-Hao
    IEEE TRANSACTIONS ON INDUSTRIAL INFORMATICS, 2024, 20 (06) : 8662 - 8672