NodeFormer: A Scalable Graph Structure Learning Transformer for Node Classification

Cited by: 0
|
Authors
Wu, Qitian [1 ,3 ]
Zhao, Wentao [1 ,3 ]
Li, Zenan [1 ,3 ]
Wipf, David [2 ]
Yan, Junchi [1 ,3 ,4 ]
Affiliations
[1] Shanghai Jiao Tong Univ, Dept Comp Sci & Engn, Shanghai, Peoples R China
[2] Amazon Web Serv, Shanghai AI Lab, Shanghai, Peoples R China
[3] SJTU, MoE Key Lab Artificial Intelligence, Shanghai, Peoples R China
[4] Shanghai AI Lab, Shanghai, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
DOI
None available
CLC number
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Graph neural networks have been extensively studied for learning with inter-connected data. Despite this, recent evidence has revealed GNNs' deficiencies related to over-squashing, heterophily, handling long-range dependencies, edge incompleteness and, particularly, the absence of graphs altogether. While a plausible solution is to learn new adaptive topology for message passing, issues concerning quadratic complexity hinder simultaneous guarantees for scalability and precision in large networks. In this paper, we introduce a novel all-pair message passing scheme for efficiently propagating node signals between arbitrary nodes, as an important building block for a pioneering Transformer-style network for node classification on large graphs, dubbed NODEFORMER. Specifically, the efficient computation is enabled by a kernelized Gumbel-Softmax operator that reduces the algorithmic complexity to linear in the number of nodes, allowing latent graph structures to be learned from large, potentially fully-connected graphs in a differentiable manner. We also provide accompanying theory as justification for our design. Extensive experiments demonstrate the promising efficacy of the method in various tasks including node classification on graphs (with up to 2M nodes) and graph-enhanced applications (e.g., image classification) where input graphs are missing. The code is available at https://github.com/qitianwu/NodeFormer.
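The abstract's central claim is that all-pair attention can be made linear in the number of nodes. A minimal NumPy sketch of that idea is shown below, assuming a Performer-style positive random-feature map for the softmax kernel and injecting Gumbel noise into the keys to mimic differentiable edge sampling; the function names, feature map, and placement of the noise are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def random_feature_map(x, projection):
    # Positive random features phi(x) = exp(w^T x - ||x||^2 / 2) / sqrt(m),
    # whose inner products approximate the softmax kernel exp(<q, k>).
    xp = x @ projection
    return np.exp(xp - (x ** 2).sum(-1, keepdims=True) / 2) / np.sqrt(projection.shape[1])

def kernelized_gumbel_attention(q, k, v, tau=0.25, n_features=64, seed=0):
    """All-pair aggregation in O(n) rather than O(n^2) node pairs.

    Gumbel noise perturbs the keys before the feature map as a stand-in
    for differentiable sampling of latent edges (a simplification of the
    kernelized Gumbel-Softmax operator described in the abstract)."""
    rng = np.random.default_rng(seed)
    proj = rng.standard_normal((q.shape[1], n_features))
    phi_q = random_feature_map(q / tau ** 0.5, proj)
    gumbel = -np.log(-np.log(rng.uniform(size=k.shape) + 1e-9) + 1e-9)
    phi_k = random_feature_map((k + tau * gumbel) / tau ** 0.5, proj)
    # Associativity trick: (phi_q @ phi_k.T) @ v == phi_q @ (phi_k.T @ v),
    # so the n x n attention matrix is never materialized.
    kv = phi_k.T @ v                       # (m, d_v)
    normalizer = phi_q @ phi_k.sum(0)      # (n,)
    return (phi_q @ kv) / normalizer[:, None]
```

Because the aggregation is a normalized (implicit) attention average, feeding a constant value matrix returns that constant exactly, which is a quick sanity check on the normalization.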
Pages: 15