Stability and generalization of graph convolutional networks in eigen-domains

Cited: 0
Authors
Ng, Michael K. [1 ,2 ]
Yip, Andy [2 ]
Affiliations
[1] Univ Hong Kong, Inst Data Sci, Pokfulam Rd, Hong Kong, Peoples R China
[2] Univ Hong Kong, Dept Math, Pokfulam Rd, Hong Kong, Peoples R China
Keywords
Graph convolutional neural networks; eigenvalues; stability; generalization guarantees;
DOI
10.1142/S0219530523500021
CLC Classification
O29 [Applied Mathematics];
Discipline Code
070104;
Abstract
Graph Convolutional Networks (GCNs) have been shown to be very effective in utilizing pairwise relationships across samples, and they have been successfully applied to various machine learning problems in practice. In many applications, GCNs are constructed with more than one layer, yet analyses of their generalization and stability remain limited. The main aim of this paper is to analyze GCNs with two layers. The formulation is based on transductive semi-supervised learning, and the filtering is done in the eigen-domain. We show the uniform stability of the neural network and the convergence of the generalization gap to zero. The analysis of a two-layer GCN is more involved than the single-layer case and requires new estimates of the neural network's quantities. The analysis confirms the usefulness of GCNs and sheds light on the design of the neural network, for instance, how the data should be scaled to achieve uniform stability of the learning process. Experimental results on benchmark datasets are presented to illustrate the theory.
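The abstract describes a two-layer GCN whose filtering is performed in the eigen-domain of the graph. A minimal sketch of that architecture, assuming a symmetric normalized Laplacian and a generic spectral filter (the function names, the exponential filter, and the ReLU nonlinearity are illustrative assumptions, not the authors' exact formulation):

```python
import numpy as np

def normalized_laplacian(A):
    """Symmetric normalized Laplacian L = I - D^{-1/2} A D^{-1/2}."""
    d = A.sum(axis=1)
    d_inv_sqrt = np.where(d > 0, 1.0 / np.sqrt(d), 0.0)
    return np.eye(len(A)) - d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]

def spectral_gcn_two_layer(A, X, W1, W2, filt=lambda lam: np.exp(-lam)):
    """Two graph-convolution layers with filtering in the eigen-domain:
    each layer computes sigma(U g(Lambda) U^T H W)."""
    L = normalized_laplacian(A)
    lam, U = np.linalg.eigh(L)          # eigen-decomposition L = U Lambda U^T
    G = U @ np.diag(filt(lam)) @ U.T    # filter g applied to the eigenvalues
    H1 = np.maximum(G @ X @ W1, 0.0)    # layer 1 (ReLU activation, assumed)
    return G @ H1 @ W2                  # layer 2 (linear output)

# Toy usage on a 4-node path graph with random features and weights
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = np.random.default_rng(0).normal(size=(4, 3))
W1 = np.random.default_rng(1).normal(size=(3, 5))
W2 = np.random.default_rng(2).normal(size=(5, 2))
out = spectral_gcn_two_layer(A, X, W1, W2)
print(out.shape)  # one output row per node, one column per class
```

The scaling of `X` and of the filter values `g(lam)` is exactly the kind of design choice the paper's stability analysis speaks to: bounding these quantities is what yields uniform stability of the learning process.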
Pages: 819 - 840 (22 pages)
Related Papers (50 records)
  • [1] Stability and Generalization of Graph Convolutional Neural Networks
    Verma, Saurabh
    Zhang, Zhi-Li
    [J]. KDD'19: PROCEEDINGS OF THE 25TH ACM SIGKDD INTERNATIONAL CONFERENCE ON KNOWLEDGE DISCOVERY AND DATA MINING, 2019, : 1539 - 1548
  • [2] Generalization Guarantee of Training Graph Convolutional Networks with Graph Topology Sampling
    Li, Hongkang
    Wang, Meng
    Liu, Sijia
    Chen, Pin-Yu
    Xiong, Jinjun
    [J]. INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 162, 2022,
  • [3] On the generalization discrepancy of spatiotemporal dynamics-informed graph convolutional networks
    Sun, Yue
    Chen, Chao
    Xu, Yuesheng
    Xie, Sihong
    Blum, Rick S.
    Venkitasubramaniam, Parv
    [J]. FRONTIERS IN MECHANICAL ENGINEERING-SWITZERLAND, 2024, 10
  • [4] The generalization error of graph convolutional networks may enlarge with more layers
    Zhou, Xianchen
    Wang, Hongxia
    [J]. NEUROCOMPUTING, 2021, 424 : 97 - 106
  • [5] Multi-relational graph convolutional networks: Generalization guarantees and experiments
    Li, Xutao
    Ng, Michael K.
    Xu, Guangning
    Yip, Andy
    [J]. NEURAL NETWORKS, 2023, 161 : 343 - 358
  • [6] Stability of graph convolutional neural networks to stochastic perturbations
    Gao, Zhan
    Isufi, Elvin
    Ribeiro, Alejandro
    [J]. SIGNAL PROCESSING, 2021, 188
  • [7] ON THE STABILITY OF GRAPH CONVOLUTIONAL NEURAL NETWORKS UNDER EDGE REWIRING
    Kenlay, Henry
    Thanou, Dorina
    Dong, Xiaowen
    [J]. 2021 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP 2021), 2021, : 8513 - 8517
  • [8] Graph Convolutional Network for Adversarial Domain Generalization
    Zhang, Xiaoqing
    Su, Hao
    Liu, Xuebin
    [J]. IEEE TRANSACTIONS ON COMPUTATIONAL SOCIAL SYSTEMS, 2024, : 1 - 9
  • [9] Graph sparsification with graph convolutional networks
    Li, Jiayu
    Zhang, Tianyun
    Tian, Hao
    Jin, Shengmin
    Fardad, Makan
    Zafarani, Reza
    [J]. INTERNATIONAL JOURNAL OF DATA SCIENCE AND ANALYTICS, 2022, 13 (01) : 33 - 46