Boosting semi-supervised network representation learning with pseudo-multitasking

Cited: 0
Authors
Biao Wang
Zhen Dai
Deshun Kong
Lanlan Yu
Jin Zheng
Ping Li
Affiliations
[1] Southwest Petroleum University,Center for Intelligent and Networked Systems, School of Computer Science
[2] Chinese Research Academy of Environmental Sciences,Institute of Environmental Information
[3] Mianyang Subbranch of Industrial and Commercial Bank of China,School of Information
[4] Huawei Nanjing Research Institute
[5] Southwest Petroleum University
Source
Applied Intelligence | 2022 / Vol. 52
Keywords
Multi-task model; Semi-supervised; Network representation learning; Node classification;
DOI
Not available
Abstract
Semi-supervised network representation learning is becoming a hotspot in the graph mining community; it aims to learn low-dimensional vector representations of vertices using partial label information. In particular, graph neural networks integrate structural information with side information such as vertex attributes to learn node representations. Although existing semi-supervised graph learning performs well on limited labeled data, it is still often hampered when the labeled dataset is quite small. To mitigate this issue, we propose PMNRL, a pseudo-multitask learning framework for semi-supervised network representation learning that boosts the expressive power of graph networks such as vanilla GCN (Graph Convolutional Networks) and GAT (Graph Attention Networks). In PMNRL, by leveraging the community structures in networks, we create a pseudo task that classifies nodes’ community affiliation, and jointly train the two tasks (i.e., the original task and the pseudo task). Our scheme exploits the inherent connection between structural proximity and label similarity to improve performance without resorting to more labels. The framework is implemented in two ways: a two-stage method and an end-to-end method. In the two-stage method, communities are first detected, and the community affiliations are then used as “labels” alongside the original labels to train the joint model. In the end-to-end method, unsupervised community learning is combined into the representation learning process through shared layers and task-specific layers, so that common features and task-specific features are learned at the same time. Experimental results on three real-world benchmark networks demonstrate that our framework improves the vanilla models without any additional labels, especially when labels are quite few.
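The end-to-end variant described in the abstract (shared GCN layers feeding two task-specific heads, trained with a joint loss over the original label task and the community pseudo task) can be sketched as below. This is a minimal numpy illustration under assumptions, not the authors' implementation: the toy graph, layer sizes, community assignments, and the loss weight `lambda_pseudo` are all made up for the example.

```python
import numpy as np

def normalize_adj(A):
    # A_hat = D^{-1/2} (A + I) D^{-1/2}, the propagation matrix of a vanilla GCN
    A = A + np.eye(A.shape[0])
    d_inv_sqrt = np.diag(1.0 / np.sqrt(A.sum(axis=1)))
    return d_inv_sqrt @ A @ d_inv_sqrt

def softmax(Z):
    e = np.exp(Z - Z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

rng = np.random.default_rng(0)

# Toy graph: 6 nodes with two obvious communities {0,1,2} and {3,4,5}
A = np.array([[0, 1, 1, 0, 0, 0],
              [1, 0, 1, 0, 0, 0],
              [1, 1, 0, 1, 0, 0],
              [0, 0, 1, 0, 1, 1],
              [0, 0, 0, 1, 0, 1],
              [0, 0, 0, 1, 1, 0]], dtype=float)
X = rng.normal(size=(6, 4))        # node attribute matrix
A_hat = normalize_adj(A)

# Shared GCN layer followed by two task-specific GCN heads
W_shared = rng.normal(size=(4, 8)) * 0.1
W_label  = rng.normal(size=(8, 2)) * 0.1   # original task: 2 node classes
W_comm   = rng.normal(size=(8, 2)) * 0.1   # pseudo task: 2 communities

H = np.maximum(A_hat @ X @ W_shared, 0.0)  # shared features: ReLU(A_hat X W)
P_label = softmax(A_hat @ H @ W_label)     # class probabilities per node
P_comm  = softmax(A_hat @ H @ W_comm)      # community probabilities per node

# Joint loss: cross-entropy on the few labeled nodes for the original task,
# plus a weighted cross-entropy on all nodes for the community pseudo task
y_label = np.array([0, 0, 1])              # labels known only for nodes 0-2
y_comm  = np.array([0, 0, 0, 1, 1, 1])     # affiliations from community detection
lambda_pseudo = 0.5                        # assumed task-balance weight

loss_label = -np.log(P_label[np.arange(3), y_label]).mean()
loss_comm  = -np.log(P_comm[np.arange(6), y_comm]).mean()
loss = loss_label + lambda_pseudo * loss_comm
```

In the actual framework the community affiliations would come from an unsupervised detector (e.g. modularity-based methods) in the two-stage variant, or from a community-learning objective trained alongside the classifier in the end-to-end variant; here they are hard-coded to keep the sketch self-contained.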
Pages: 8118-8133
Page count: 15
Related papers
50 records total
  • [31] MarginMatch: Improving Semi-Supervised Learning with Pseudo-Margins
    Sosea, Tiberiu
    Caragea, Cornelia
    2023 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2023, : 15773 - 15782
  • [32] Semi-Supervised Learning of Semantic Correspondence with Pseudo-Labels
    Kim, Jiwon
    Ryoo, Kwangrok
    Seo, Junyoung
    Lee, Gyuseong
    Kim, Daehwan
    Cho, Hansang
    Kim, Seungryong
    2022 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR 2022), 2022, : 19667 - 19677
  • [33] Boosting Semi-Supervised Learning with Dual-Threshold Screening and Similarity Learning
    Liang, Zechen
    Wang, Yuan-gen
    Lu, Wei
    Cao, Xiaochun
    ACM TRANSACTIONS ON MULTIMEDIA COMPUTING COMMUNICATIONS AND APPLICATIONS, 2024, 20 (09)
  • [34] Semi-supervised Learning
    Adams, Niall
    JOURNAL OF THE ROYAL STATISTICAL SOCIETY SERIES A-STATISTICS IN SOCIETY, 2009, 172 : 530 - 530
  • [35] On semi-supervised learning
    A. Cholaquidis
    R. Fraiman
    M. Sued
    TEST, 2020, 29 : 914 - 937
  • [36] On semi-supervised learning
    Cholaquidis, A.
    Fraiman, R.
    Sued, M.
    TEST, 2020, 29 (04) : 914 - 937
  • [37] Water Supply Clusters based on a Boosting Semi-Supervised Learning Methodology
    Herrera, M.
    Izquierdo, J.
    Perez-Garcia, R.
    Montalvo, I.
    PROCEEDINGS OF THE SEVENTH INTERNATIONAL CONFERENCE ON ENGINEERING COMPUTATIONAL TECHNOLOGY, 2010, 94
  • [38] Deep data representation with feature propagation for semi-supervised learning
    Dornaika, F.
    Hoang, V. Truong
    INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS, 2023, 14 (04) : 1303 - 1316
  • [39] Deep data representation with feature propagation for semi-supervised learning
    F. Dornaika
    V. Truong Hoang
    International Journal of Machine Learning and Cybernetics, 2023, 14 : 1303 - 1316
  • [40] TRAINING BOOSTING-LIKE ALGORITHMS WITH SEMI-SUPERVISED SUBSPACE LEARNING
    Xu, Jingsong
    Wu, Qiang
    Zhang, Jian
    Shen, Fumin
    Tang, Zhenmin
    2013 20TH IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING (ICIP 2013), 2013, : 4302 - 4306