On the Global Self-attention Mechanism for Graph Convolutional Networks

Cited by: 3
Authors
Wang, Chen [1 ]
Deng, Chengyuan [1 ]
Affiliations
[1] Rutgers State Univ, Dept Comp Sci, Piscataway, NJ 08854 USA
DOI
10.1109/ICPR48806.2021.9412456
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Applying the Global Self-attention (GSA) mechanism over features has achieved remarkable success in Convolutional Neural Networks (CNNs). However, it is not clear whether Graph Convolutional Networks (GCNs) can similarly benefit from such a technique. In this paper, inspired by the similarity between CNNs and GCNs, we study the impact of the Global Self-attention mechanism on GCNs. We find that, consistent with intuition, the GSA mechanism allows GCNs to capture feature-based vertex relations regardless of edge connections; as a result, the GSA mechanism can introduce extra expressive power to GCNs. Furthermore, we analyze the impact of the GSA mechanism on the issues of overfitting and over-smoothing. Based on some recent technical developments, we prove that the GSA mechanism can alleviate both the overfitting and the over-smoothing issues. Experiments on multiple benchmark datasets demonstrate both superior expressive power and less significant overfitting and over-smoothing problems for the GSA-augmented GCNs, corroborating the intuitions and the theoretical results.
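To illustrate the core idea in the abstract, the following is a minimal NumPy sketch of global self-attention applied to node features: attention weights are computed from pairwise feature similarity alone, so every vertex can attend to every other vertex regardless of the graph's edges. This is a generic single-head formulation for illustration only, not the authors' implementation; the function name, weight shapes, and toy data are assumptions.

```python
import numpy as np

def global_self_attention(X, Wq, Wk, Wv):
    """Single-head global self-attention over node features X (n x d).

    Unlike a graph convolution, the attention matrix depends only on
    feature similarity, not on the adjacency structure, so information
    can flow between vertices that share no edge.
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[1])      # (n, n) pairwise scores
    scores -= scores.max(axis=1, keepdims=True)  # numerical stability
    attn = np.exp(scores)
    attn /= attn.sum(axis=1, keepdims=True)      # row-wise softmax
    return attn @ V                              # attended node features

# Toy usage: 4 nodes with 3-dim features, projected to 2 dims.
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 3))
Wq, Wk, Wv = (rng.normal(size=(3, 2)) for _ in range(3))
out = global_self_attention(X, Wq, Wk, Wv)      # shape (4, 2)
```

In a GSA-augmented GCN, such an attention output would typically be combined with the usual edge-based graph convolution of the same layer, giving the network both structural and purely feature-based vertex interactions.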
Pages: 8531 - 8538
Page count: 8
Related Papers
50 records total
  • [1] Graph convolutional networks with the self-attention mechanism for adaptive influence maximization in social networks
    Tang, Jianxin
    Song, Shihui
    Du, Qian
    Yao, Yabing
    Qu, Jitao
    [J]. COMPLEX & INTELLIGENT SYSTEMS, 2024,
  • [2] Self-Attention Based Sequential Recommendation With Graph Convolutional Networks
    Seng, Dewen
    Wang, Jingchang
    Zhang, Xuefeng
    [J]. IEEE ACCESS, 2024, 12 : 32780 - 32787
  • [3] Convolutional Self-Attention Networks
    Yang, Baosong
    Wang, Longyue
    Wong, Derek F.
    Chao, Lidia S.
    Tu, Zhaopeng
    [J]. 2019 CONFERENCE OF THE NORTH AMERICAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS: HUMAN LANGUAGE TECHNOLOGIES (NAACL HLT 2019), VOL. 1, 2019, : 4040 - 4045
  • [4] Global Convolutional Neural Networks With Self-Attention for Fisheye Image Rectification
    Kim, Byunghyun
    Lee, Dohyun
    Min, Kyeongyuk
    Chong, Jongwha
    Joe, Inwhee
    [J]. IEEE ACCESS, 2022, 10 : 129580 - 129587
  • [5] Bearing remaining useful life prediction using self-adaptive graph convolutional networks with self-attention mechanism
    Wei, Yupeng
    Wu, Dazhong
    Terpenny, Janis
    [J]. MECHANICAL SYSTEMS AND SIGNAL PROCESSING, 2023, 188
  • [6] Global Self-Attention as a Replacement for Graph Convolution
    Hussain, Md Shamim
    Zaki, Mohammed J.
    Subramanian, Dharmashankar
    [J]. PROCEEDINGS OF THE 28TH ACM SIGKDD CONFERENCE ON KNOWLEDGE DISCOVERY AND DATA MINING, KDD 2022, 2022, : 655 - 665
  • [7] Universal Graph Transformer Self-Attention Networks
    Dai Quoc Nguyen
    Tu Dinh Nguyen
    Dinh Phung
    [J]. COMPANION PROCEEDINGS OF THE WEB CONFERENCE 2022, WWW 2022 COMPANION, 2022, : 193 - 196
  • [8] Convolutional Recurrent Neural Networks with a Self-Attention Mechanism for Personnel Performance Prediction
    Xue, Xia
    Feng, Jun
    Gao, Yi
    Liu, Meng
    Zhang, Wenyu
    Sun, Xia
    Zhao, Aiqi
    Guo, Shouxi
    [J]. ENTROPY, 2019, 21 (12)
  • [9] Combining Gated Convolutional Networks and Self-Attention Mechanism for Speech Emotion Recognition
    Li, Chao
    Jiao, Jinlong
    Zhao, Yiqin
    Zhao, Ziping
    [J]. 2019 8TH INTERNATIONAL CONFERENCE ON AFFECTIVE COMPUTING AND INTELLIGENT INTERACTION WORKSHOPS AND DEMOS (ACIIW), 2019, : 105 - 109
  • [10] Trajectories prediction in multi-ship encounters: Utilizing graph convolutional neural networks with GRU and Self-Attention Mechanism
    Zeng, Xi
    Gao, Miao
    Zhang, Anmin
    Zhu, Jixiang
    Hu, Yingjun
    Chen, Pengxu
    Chen, Shuai
    Dong, Taoning
    Zhang, Shenwen
    Shi, Peiru
    [J]. COMPUTERS & ELECTRICAL ENGINEERING, 2024, 120