Self-Paced Co-Training of Graph Neural Networks for Semi-Supervised Node Classification

Cited by: 22
Authors
Gong, Maoguo [1 ]
Zhou, Hui [1 ]
Qin, A. K. [2 ]
Liu, Wenfeng [1 ]
Zhao, Zhongying [3 ]
Affiliations
[1] Xidian Univ, Sch Elect Engn, Xian 710071, Peoples R China
[2] Swinburne Univ Technol, Dept Comp Technol, Melbourne, Vic 3122, Australia
[3] Shandong Univ Sci & Technol, Sch Comp Sci & Engn, Qingdao 266590, Peoples R China
Funding
Australian Research Council; National Natural Science Foundation of China;
Keywords
Training; Data models; Task analysis; Graph neural networks; Training data; Predictive models; Optimization; Co-training; graph neural networks (GNNs); node classification; self-paced learning (SPL); semi-supervised learning (SSL); COMMUNITY DETECTION;
DOI
10.1109/TNNLS.2022.3157688
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Graph neural networks (GNNs) have demonstrated great success in many graph data-based applications. The impressive performance of GNNs typically relies on the availability of a sufficient amount of labeled data for model training. In practice, however, obtaining a large number of annotations is prohibitively labor-intensive or even impossible. Co-training is a popular semi-supervised learning (SSL) paradigm that trains multiple models on a common training set while augmenting the limited labeled data used for training each model with pseudolabeled data generated from the predictions of the other models. Most existing co-training works do not control the quality of the pseudolabeled data they use. As a result, the inaccurate pseudolabels generated by immature models in the early stage of training are likely to cause noticeable errors when used to augment the training data of other models. To address this issue, we propose a self-paced co-training for GNNs (SPC-GNN) framework for semi-supervised node classification. This framework trains multiple GNNs with the same or different structures on different representations of the same training data. Each GNN carries out SSL using both the originally available labeled data and the pseudolabeled data generated by the other GNNs. To control the quality of pseudolabels, a self-paced label augmentation strategy is designed so that pseudolabels generated at a higher confidence level are utilized earlier during training; this mitigates the negative impact of inaccurate pseudolabels on training data augmentation and, accordingly, on the subsequent training process. Finally, each of the trained GNNs is evaluated on a validation set, and the best-performing one is chosen as the output. To improve the training effectiveness of the framework, we devise a pretraining stage followed by a two-step optimization scheme to train the GNNs.
Experimental results on the node classification task demonstrate that the proposed framework achieves significant improvement over the state-of-the-art SSL methods.
Pages: 9234-9247
Page count: 14
Related Papers
50 items in total
  • [41] SEMI-SUPERVISED PYRAMID FEATURE CO-TRAINING NETWORK FOR LIDAR DATA CLASSIFICATION
    Wang, Zexin
    Wang, Haoran
    Jiao, Licheng
    Liu, Xu
    2019 IEEE INTERNATIONAL GEOSCIENCE AND REMOTE SENSING SYMPOSIUM (IGARSS 2019), 2019, : 2471 - 2474
  • [42] Deep co-training for semi-supervised image segmentation
    Peng, Jizong
    Estrada, Guillermo
    Pedersoli, Marco
    Desrosiers, Christian
    PATTERN RECOGNITION, 2020, 107 (107)
  • [43] Using co-training and self-training in semi-supervised multiple classifier systems
    Didaci, Luca
    Roli, Fabio
    STRUCTURAL, SYNTACTIC, AND STATISTICAL PATTERN RECOGNITION, PROCEEDINGS, 2006, 4109 : 522 - 530
  • [44] Semi-supervised Learning for Regression with Co-training by Committee
    Hady, Mohamed Farouk Abdel
    Schwenker, Friedhelm
    Palm, Guenther
    ARTIFICIAL NEURAL NETWORKS - ICANN 2009, PT I, 2009, 5768 : 121 - 130
  • [45] Deep Co-Training for Semi-Supervised Image Recognition
    Qiao, Siyuan
    Shen, Wei
    Zhang, Zhishuai
    Wang, Bo
    Yuille, Alan
    COMPUTER VISION - ECCV 2018, PT 15, 2018, 11219 : 142 - 159
  • [46] Semi-supervised node classification via graph learning convolutional neural network
    Li, Kangjie
    Ye, Wenjing
    Applied Intelligence, 2022, 52 : 12724 - 12736
  • [47] A Deep Graph Wavelet Convolutional Neural Network for Semi-supervised Node Classification
    Wang, Jingyi
    Deng, Zhidong
    2021 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2021,
  • [48] Label-Enhanced Graph Neural Network for Semi-Supervised Node Classification
    Yu, Le
    Sun, Leilei
    Du, Bowen
    Zhu, Tongyu
    Lv, Weifeng
    IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 2023, 35 (11) : 11529 - 11540
  • [49] Semi-supervised node classification via graph learning convolutional neural network
    Li, Kangjie
    Ye, Wenjing
    APPLIED INTELLIGENCE, 2022, 52 (11) : 12724 - 12736
  • [50] GRNN: Graph-Retraining Neural Network for Semi-Supervised Node Classification
    Li, Jianhe
    Fan, Suohai
    ALGORITHMS, 2023, 16 (03)