NodeFormer: A Scalable Graph Structure Learning Transformer for Node Classification

Cited by: 0
Authors
Wu, Qitian [1 ,3 ]
Zhao, Wentao [1 ,3 ]
Li, Zenan [1 ,3 ]
Wipf, David [2 ]
Yan, Junchi [1 ,3 ,4 ]
Affiliations
[1] Shanghai Jiao Tong Univ, Dept Comp Sci & Engn, Shanghai, Peoples R China
[2] Amazon Web Serv, Shanghai AI Lab, Shanghai, Peoples R China
[3] SJTU, MoE Key Lab Artificial Intelligence, Shanghai, Peoples R China
[4] Shanghai AI Lab, Shanghai, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
DOI
Not available
CLC classification number
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Graph neural networks have been extensively studied for learning with inter-connected data. Despite this, recent evidence has revealed GNNs' deficiencies related to over-squashing, heterophily, handling long-range dependencies, edge incompleteness and, in particular, the complete absence of an input graph. While a plausible solution is to learn a new adaptive topology for message passing, the quadratic complexity of all-pair attention makes it hard to guarantee both scalability and precision on large networks. In this paper, we introduce a novel all-pair message passing scheme for efficiently propagating node signals between arbitrary nodes, as an important building block for a pioneering Transformer-style network for node classification on large graphs, dubbed NODEFORMER. Specifically, the efficient computation is enabled by a kernelized Gumbel-Softmax operator that reduces the algorithmic complexity to linear in the number of nodes, allowing latent graph structures to be learned from large, potentially fully-connected graphs in a differentiable manner. We also provide accompanying theory as justification for our design. Extensive experiments demonstrate the promising efficacy of the method in various tasks including node classification on graphs (with up to 2M nodes) and graph-enhanced applications (e.g., image classification) where input graphs are missing. The code is available at https://github.com/qitianwu/NodeFormer.
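The linear-complexity mechanism the abstract describes can be illustrated with a minimal sketch. This is not the authors' implementation: the function names, the Performer-style positive random features, the placement of the Gumbel noise, and all hyperparameters (`num_features`, `tau`) are illustrative assumptions; they only show how an all-pair softmax aggregation can be computed without materializing the N x N attention matrix.

```python
import numpy as np

def feature_map(x, W):
    """Positive random features phi(x) = exp(Wx - ||x||^2/2) / sqrt(m),
    whose inner products approximate the softmax kernel exp(q . k)."""
    m = W.shape[0]
    return np.exp(x @ W.T - (x ** 2).sum(-1, keepdims=True) / 2) / np.sqrt(m)

def kernelized_gumbel_attention(q, k, v, num_features=64, tau=0.25, seed=0):
    """O(N) all-pair aggregation: softmax(QK^T)V is approximated by
    phi(Q) (phi(K)^T V) / (phi(Q) (phi(K)^T 1)). Gumbel noise (an
    illustrative stand-in for the paper's Gumbel-Softmax sampling) is
    added to the keys before the feature map, keeping everything
    differentiable in tau."""
    rng = np.random.default_rng(seed)
    d = q.shape[-1]
    W = rng.standard_normal((num_features, d))
    gumbel = rng.gumbel(size=k.shape)            # illustrative noise placement
    phi_q = feature_map(q / d ** 0.25, W)
    phi_k = feature_map((k + tau * gumbel) / d ** 0.25, W)
    kv = phi_k.T @ v                             # (m, d_v), computed once
    out = phi_q @ kv                             # (N, d_v)
    normalizer = phi_q @ phi_k.sum(axis=0)       # (N,), row normalization
    return out / normalizer[:, None]

# N nodes attend to each other without building the N x N matrix
N, d = 1000, 16
rng = np.random.default_rng(42)
h = rng.standard_normal((N, d))
z = kernelized_gumbel_attention(h, h, h)
```

The key point is the reassociation `phi(Q) (phi(K)^T V)`: the `(m, d_v)` summary `kv` is computed once and reused by every node, so cost grows linearly in N rather than quadratically.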
Pages: 15
Related papers
50 items in total
  • [21] Graph alternate learning for robust graph neural networks in node classification
    Zhang, Baoliang
    Guo, Xiaoxin
    Tu, Zhenchuan
    Zhang, Jia
    NEURAL COMPUTING & APPLICATIONS, 2022, 34(11): 8723-8735
  • [23] FGTL: Federated Graph Transfer Learning for Node Classification
    Mai, Chengyuan
    Liao, Tianchi
    Chen, Chuan
    Zheng, Zibin
    ACM TRANSACTIONS ON KNOWLEDGE DISCOVERY FROM DATA, 2024, 19 (01)
  • [24] Content and structure based attention for graph node classification
    Chen Y.
    Xie X.-Z.
    Weng W.
    Journal of Intelligent and Fuzzy Systems, 2024, 46(4): 8329-8343
  • [25] Structure-Aware Transformer for Graph Representation Learning
    Chen, Dexiong
    O'Bray, Leslie
    Borgwardt, Karsten
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 162, 2022
  • [26] Balancing structure and position information in Graph Transformer network with a learnable node
    Hoang, Thi Linh
    Ta, Viet Cuong
    EXPERT SYSTEMS WITH APPLICATIONS, 2024, 238
  • [27] NHSH: Graph Hybrid Learning with Node Homophily and Spectral Heterophily for Node Classification
    Liu, Kang
    Dai, Wenqing
    Liu, Xunyuan
    Kang, Mengtao
    Ji, Runshi
    SYMMETRY-BASEL, 2025, 17(1)
  • [28] Open-World Graph Active Learning for Node Classification
    Xu, Hui
    Xiang, Liyao
    Ou, Junjie
    Weng, Yuting
    Wang, Xinbing
    Zhou, Chenghu
    ACM TRANSACTIONS ON KNOWLEDGE DISCOVERY FROM DATA, 2024, 18 (02)
  • [29] Nonlinear Graph Learning-Convolutional Networks for Node Classification
    Chen, Linjun
    Liu, Xingyi
    Li, Zexin
    NEURAL PROCESSING LETTERS, 2022, 54(4): 2727-2736