GraphMix: Improved Training of GNNs for Semi-Supervised Learning

Cited by: 0
Authors
Verma, Vikas [1 ,2 ]
Qu, Meng [1 ]
Kawaguchi, Kenji [3 ]
Lamb, Alex [1 ]
Bengio, Yoshua [1 ]
Kannala, Juho [2 ]
Tang, Jian [1 ]
Affiliations
[1] Mila Quebec Artificial Intelligence Inst, Montreal, PQ, Canada
[2] Aalto Univ, Espoo, Finland
[3] MIT, Cambridge, MA 02139 USA
Keywords: (none listed)
DOI: (none available)
Chinese Library Classification: TP18 [Artificial Intelligence Theory]
Discipline Codes: 081104; 0812; 0835; 1405
Abstract
We present GraphMix, a regularization method for Graph Neural Network based semi-supervised object classification, whereby we propose to train a fully-connected network jointly with the graph neural network via parameter sharing and interpolation-based regularization. Further, we provide a theoretical analysis of how GraphMix improves the generalization bounds of the underlying graph neural network, without making any assumptions about the "aggregation" layer or the depth of the graph neural networks. We experimentally validate this analysis by applying GraphMix to various architectures such as Graph Convolutional Networks, Graph Attention Networks and Graph-U-Net. Despite its simplicity, we demonstrate that GraphMix can consistently improve or closely match state-of-the-art performance using even simpler architectures such as Graph Convolutional Networks, across three established graph benchmarks: Cora, Citeseer and Pubmed citation network datasets, as well as three newly proposed datasets: Cora-Full, Co-author-CS and Co-author-Physics.
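The interpolation-based regularization the abstract refers to is in the Mixup family: pairs of examples and their (one-hot or soft) labels are blended with a Beta-distributed coefficient, and the model is trained on the blended pairs. Below is a minimal, hedged sketch of that interpolation step in NumPy; the function name `mixup` and its parameters are illustrative, not the paper's own API, and the full GraphMix method additionally involves parameter sharing between a fully-connected network and the GNN, which is not shown here.

```python
import numpy as np

def mixup(x, y, alpha=1.0, rng=None):
    """Interpolation-based regularization (Mixup-style sketch):
    blend random pairs of inputs and their label vectors with a
    Beta(alpha, alpha)-distributed mixing coefficient."""
    rng = rng or np.random.default_rng(0)
    lam = rng.beta(alpha, alpha)       # mixing coefficient in [0, 1]
    perm = rng.permutation(len(x))     # random pairing of examples
    x_mix = lam * x + (1 - lam) * x[perm]
    y_mix = lam * y + (1 - lam) * y[perm]
    return x_mix, y_mix
```

Because the blend is a convex combination, soft labels remain valid probability distributions: each mixed label row still sums to one.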
Pages: 10024-10032 (9 pages)