Gapformer: Graph Transformer with Graph Pooling for Node Classification

Cited: 0
Authors
Liu, Chuang [1 ]
Zhan, Yibing [2 ]
Ma, Xueqi [3 ]
Ding, Liang [2 ]
Tao, Dapeng [4 ,5 ]
Wu, Jia [6 ]
Hu, Wenbin [1 ]
Affiliations
[1] Wuhan Univ, Sch Comp Sci, Wuhan, Peoples R China
[2] JD Com, JD Explore Acad, Beijing, Peoples R China
[3] Univ Melbourne, Sch Comp & Informat Syst, Melbourne, Australia
[4] Yunnan Univ, Sch Comp Sci, Kunming, Peoples R China
[5] Yunnan Key Lab Media Convergence, Kunming, Peoples R China
[6] Macquarie Univ, Sch Comp, Sydney, Australia
Keywords
DOI
Not available
CLC Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Graph Transformers (GTs) have proven advantageous in graph-level tasks. However, existing GTs still perform unsatisfactorily on the node classification task due to 1) the overwhelming unrelated information obtained from a vast number of irrelevant distant nodes and 2) the quadratic complexity in the number of nodes incurred by the fully connected attention mechanism. In this paper, we present Gapformer, a method for node classification that deeply integrates the Graph Transformer with Graph Pooling. More specifically, Gapformer coarsens the large number of nodes in a graph into a smaller number of pooling nodes via local or global graph pooling, and then computes attention solely against the pooling nodes rather than against all other nodes. In this manner, the negative influence of the overwhelming unrelated nodes is mitigated while long-range information is preserved, and the quadratic complexity is reduced to linear in the number of nodes, since the number of pooling nodes is fixed. Extensive experiments on 13 node classification datasets, including homophilic and heterophilic graphs, demonstrate the competitive performance of Gapformer over existing Graph Neural Networks and GTs.
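The core idea in the abstract — attending to a small set of pooling nodes instead of all N nodes — can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the mean-pooling coarsening, the given cluster assignments, and the absence of learned projections and multiple heads are all simplifying assumptions made here for illustration. It shows only why the score matrix becomes N x K (linear in N for fixed K) rather than N x N.

```python
import numpy as np

rng = np.random.default_rng(0)

def pool_nodes(X, assignments, num_pools):
    # Coarsen N node features into num_pools pooling-node features by
    # mean-pooling each cluster. A stand-in for the paper's local/global
    # graph pooling; the clustering itself is assumed to be given.
    P = np.zeros((num_pools, X.shape[1]))
    for k in range(num_pools):
        members = X[assignments == k]
        if len(members) > 0:
            P[k] = members.mean(axis=0)
    return P

def pooled_attention(X, P):
    # Each of the N nodes attends only to the K pooling nodes, so the
    # score matrix is (N, K) — linear in N for a fixed K — instead of
    # the (N, N) matrix of full self-attention.
    d = X.shape[1]
    scores = X @ P.T / np.sqrt(d)                # (N, K)
    scores -= scores.max(axis=1, keepdims=True)  # numerical stability
    attn = np.exp(scores)
    attn /= attn.sum(axis=1, keepdims=True)      # row-wise softmax
    return attn @ P                              # (N, d)

# Toy example: 6 nodes with 4 features, coarsened into 2 pooling nodes.
X = rng.standard_normal((6, 4))
assignments = np.array([0, 0, 1, 1, 1, 0])
P = pool_nodes(X, assignments, num_pools=2)
out = pooled_attention(X, P)
print(out.shape)  # (6, 4)
```

Because each output row is a convex combination of the K pooling-node features, every node can still receive long-range information that was aggregated into the pools, while the attention cost stays O(N*K).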
Pages: 2196-2205 (10 pages)
Related Papers
50 items
  • [1] Module-based graph pooling for graph classification
    Deng, Sucheng
    Yang, Geping
    Yang, Yiyang
    Gong, Zhiguo
    Chen, Can
    Chen, Xiang
    Hao, Zhifeng
    PATTERN RECOGNITION, 2024, 154
  • [2] An Improved Transformer Model Based Graph Node Classification Method
    Li, Xin
    Chen, Weichang
    Ma, Zhaoyi
    Zhu, Pan
    2022 41ST CHINESE CONTROL CONFERENCE (CCC), 2022, : 7144 - 7149
  • [3] EHG: efficient heterogeneous graph transformer for multiclass node classification
    Wang, Man
    Liu, Shouqiang
    Deng, Zhen
    ADVANCES IN CONTINUOUS AND DISCRETE MODELS, 2025 (1):
  • [4] Quasi-CliquePool: Hierarchical Graph Pooling for Graph Classification
    Ali, Waqar
    Vascon, Sebastiano
    Stadelmann, Thilo
    Pelillo, Marcello
    38TH ANNUAL ACM SYMPOSIUM ON APPLIED COMPUTING, SAC 2023, 2023, : 544 - 552
  • [5] Effects of Graph Pooling Layers on Classification with Graph Neural Networks
    Studer, Linda
    Wallau, Jannis
    Ingold, Rolf
    Fischer, Andreas
    2020 7TH SWISS CONFERENCE ON DATA SCIENCE, SDS, 2020, : 57 - 58
  • [6] Graph Multi-Convolution and Attention Pooling for Graph Classification
    Tongji University, Key Laboratory of Embedded System and Service Computing, Ministry of Education, Shanghai 201804, China
    IEEE Trans Pattern Anal Mach Intell, 12 (10546-10557):
  • [7] Gated Graph Pooling with Self-Loop for Graph Classification
    Fan, Xiaolong
    Gong, Maoguo
    Li, Hao
    Wu, Yue
    Wang, Shanfeng
    2020 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2020,
  • [8] Pooling Architecture Search for Graph Classification
    Wei, Lanning
    Zhao, Huan
    Yao, Quanming
    He, Zhiqiang
    PROCEEDINGS OF THE 30TH ACM INTERNATIONAL CONFERENCE ON INFORMATION & KNOWLEDGE MANAGEMENT, CIKM 2021, 2021, : 2091 - 2100
  • [9] Graph convolutional networks with higher-order pooling for semisupervised node classification
    Lei, Fangyuan
    Liu, Xun
    Jiang, Jianjian
    Liao, Liping
    Cai, Jun
    Zhao, Huimin
    CONCURRENCY AND COMPUTATION-PRACTICE & EXPERIENCE, 2022, 34 (16):
  • [10] On exploring node-feature and graph-structure diversities for node drop graph pooling
    Liu, Chuang
    Zhan, Yibing
    Yu, Baosheng
    Liu, Liu
    Du, Bo
    Hu, Wenbin
    Liu, Tongliang
    NEURAL NETWORKS, 2023, 167 : 559 - 571