A learnable sampling method for scalable graph neural networks

Cited by: 4
Authors
Zhao, Weichen [1 ]
Guo, Tiande [1 ,2 ]
Yu, Xiaoxi [3 ]
Han, Congying [1 ,2 ]
Affiliations
[1] Univ Chinese Acad Sci UCAS, Beijing, Peoples R China
[2] Chinese Acad Sci, Key Lab Big Data Min & Knowledge Management, Beijing, Peoples R China
[3] MediaTek, Singapore, Singapore
Funding
National Natural Science Foundation of China;
Keywords
Graph neural networks; Large-scale data; Learnable sampling method;
DOI
10.1016/j.neunet.2023.03.015
Chinese Library Classification (CLC)
TP18 [Theory of Artificial Intelligence];
Subject Classification Codes
081104; 0812; 0835; 1405;
Abstract
With the development of graph neural networks (GNNs), handling large-scale graph data has become an increasingly important topic. Currently, most GNN models that scale to large graphs rely on random sampling methods, yet the sampling process in these models is detached from the forward propagation of the network. Moreover, many works design sampling schemes based on statistical estimation for graph convolutional networks (GCNs), where the message-passing weights between nodes are fixed; such sampling methods therefore do not extend to message-passing networks with variable weights, such as graph attention networks. Exploiting the end-to-end learning capability of neural networks, we propose a learnable sampling method. It overcomes the problem that random sampling operations cannot propagate gradients, and it samples nodes with non-fixed probabilities. In this way, the sampling process is dynamically coupled with the forward propagation of the features, allowing better training of the network, and the method generalizes to all message-passing models. In addition, we apply the learnable sampling method to GNNs and propose two models. Our method can be flexibly combined with different GNN architectures and achieves excellent accuracy on benchmark datasets with large graphs, while the loss function converges to smaller values at a faster rate during training than previous methods. (c) 2023 Elsevier Ltd. All rights reserved.
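The record contains no code, but the abstract's core idea (a sampling step whose probabilities are learned, so gradients flow through sampling into the rest of the network) can be illustrated with a short sketch. The following is a minimal PyTorch sketch assuming a Gumbel-softmax relaxation over learned neighbor scores; the relaxation, class names, and dimensions are illustrative assumptions, not the paper's actual construction.

import torch
import torch.nn as nn
import torch.nn.functional as F


class LearnableNeighborSampler(nn.Module):
    """Scores candidate neighbors from their features and draws a soft,
    differentiable sample so gradients reach the scoring weights.
    (Hypothetical module, not the paper's implementation.)"""

    def __init__(self, in_dim: int, temperature: float = 0.5):
        super().__init__()
        self.score = nn.Linear(in_dim, 1)   # learnable sampling scores
        self.temperature = temperature

    def forward(self, neighbor_feats: torch.Tensor) -> torch.Tensor:
        # neighbor_feats: (num_neighbors, in_dim)
        logits = self.score(neighbor_feats).squeeze(-1)       # (num_neighbors,)
        u = torch.rand_like(logits).clamp(1e-10, 1 - 1e-10)
        gumbel = -torch.log(-torch.log(u))                    # Gumbel(0, 1) noise
        # Relaxed (soft) sample over neighbors; differentiable w.r.t. logits
        return F.softmax((logits + gumbel) / self.temperature, dim=0)


class SampledMessagePassingLayer(nn.Module):
    """One message-passing step that aggregates neighbors with the learned
    sampling weights instead of a fixed, detached random sample."""

    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.sampler = LearnableNeighborSampler(in_dim)
        self.lin = nn.Linear(in_dim, out_dim)

    def forward(self, self_feat: torch.Tensor, neighbor_feats: torch.Tensor) -> torch.Tensor:
        weights = self.sampler(neighbor_feats)                        # (num_neighbors,)
        aggregated = (weights.unsqueeze(-1) * neighbor_feats).sum(dim=0)
        return F.relu(self.lin(self_feat + aggregated))


# Usage sketch: one node with 8 candidate neighbors and 16-dim features.
layer = SampledMessagePassingLayer(in_dim=16, out_dim=32)
h_v = torch.randn(16)
h_neighbors = torch.randn(8, 16)
out = layer(h_v, h_neighbors)   # backprop updates the sampler's scoring weights

Because the sampling weights are produced inside the forward pass, training the downstream loss also trains the sampler, which is the sense in which the abstract describes sampling as "dynamically combined" with forward propagation.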
Pages: 412 - 424
Number of pages: 13
Related Papers
50 records in total
  • [21] Efficient Non-Sampling Graph Neural Networks
    Ji, Jianchao
    Li, Zelong
    Xu, Shuyuan
    Ge, Yingqiang
    Tan, Juntao
    Zhang, Yongfeng
    [J]. INFORMATION, 2023, 14 (08)
  • [22] Bayesian Graph Neural Networks with Adaptive Connection Sampling
    Hasanzadeh, Arman
    Hajiramezanali, Ehsan
    Boluki, Shahin
    Zhou, Mingyuan
    Duffield, Nick
    Narayanan, Krishna
    Qian, Xiaoning
    [J]. INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 119, 2020, 119
  • [23] Scalable algorithms for physics-informed neural and graph networks
    Shukla, Khemraj
    Xu, Mengjia
    Trask, Nathaniel
    Karniadakis, George E.
    [J]. DATA-CENTRIC ENGINEERING, 2022, 3
  • [24] Scalable Data Parallel Distributed Training for Graph Neural Networks
    Koyama, Sohei
    Tatebe, Osamu
    [J]. 2022 IEEE 36TH INTERNATIONAL PARALLEL AND DISTRIBUTED PROCESSING SYMPOSIUM WORKSHOPS (IPDPSW 2022), 2022, : 699 - 707
  • [25] PropInit: Scalable Inductive Initialization for Heterogeneous Graph Neural Networks
    Adeshina, Soji
    Zhang, Jian
    Kim, Muhyun
    Chen, Min
    Fathony, Rizal
    Vashisht, Advitiya
    Chen, Jia
    Karypis, George
    [J]. 2022 IEEE INTERNATIONAL CONFERENCE ON KNOWLEDGE GRAPH (ICKG), 2022, : 6 - 13
  • [26] Scalable Surface Reconstruction with Delaunay-Graph Neural Networks
    Sulzer, R.
    Landrieu, L.
    Marlet, R.
    Vallet, B.
    [J]. COMPUTER GRAPHICS FORUM, 2021, 40 (05) : 157 - 167
  • [27] Scalable algorithms for physics-informed neural and graph networks
    Shukla, Khemraj
    Xu, Mengjia
    Trask, Nathaniel
    Karniadakis, George E.
    [J]. Data-Centric Engineering, 2022, 3 (06):
  • [28] Most Neural Networks Are Almost Learnable
    Daniely, Amit
    Srebro, Nathan
    Vardi, Gal
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023,
  • [29] Scalable Power Control/Beamforming in Heterogeneous Wireless Networks with Graph Neural Networks
    Zhang, Xiaochen
    Zhao, Haitao
    Xiong, Jun
    Liu, Xiaoran
    Zhou, Li
    Wei, Jibo
    [J]. 2021 IEEE GLOBAL COMMUNICATIONS CONFERENCE (GLOBECOM), 2021,
  • [30] LeL-GNN: Learnable Edge Sampling and Line Based Graph Neural Network for Link Prediction
    Morshed, Md Golam
    Sultana, Tangina
    Lee, Young-Koo
    [J]. IEEE ACCESS, 2023, 11 : 56083 - 56097