Boosting semi-supervised network representation learning with pseudo-multitasking

Cited: 0
Authors
Biao Wang
Zhen Dai
Deshun Kong
Lanlan Yu
Jin Zheng
Ping Li
Affiliations
[1] Southwest Petroleum University, Center for Intelligent and Networked Systems, School of Computer Science
[2] Chinese Research Academy of Environmental Sciences, Institute of Environmental Information
[3] Mianyang Subbranch of Industrial and Commercial Bank of China, School of Information
[4] Huawei Nanjing Research Institute
[5] Southwest Petroleum University
Source
Applied Intelligence | 2022, Vol. 52
Keywords
Multi-task model; Semi-supervised; Network representation learning; Node classification
Abstract
Semi-supervised network representation learning is becoming a hotspot in the graph mining community; it aims to learn low-dimensional vector representations of vertices using partial label information. In particular, graph neural networks integrate structural information with other side information, such as vertex attributes, to learn node representations. Although existing semi-supervised graph learning methods perform well on limited labeled data, they are still often hampered when the labeled dataset is quite small. To mitigate this issue, we propose PMNRL, a pseudo-multitask learning framework for semi-supervised network representation learning that boosts the expressive power of graph networks such as vanilla GCN (Graph Convolutional Networks) and GAT (Graph Attention Networks). In PMNRL, we leverage the community structure of networks to create a pseudo task that classifies nodes' community affiliations, and we train the two tasks (the original task and the pseudo task) jointly. The proposed scheme exploits the inherent connection between structural proximity and label similarity to improve performance without resorting to more labels. The framework is implemented in two ways: a two-stage method and an end-to-end method. In the two-stage method, communities are first detected, and the resulting community affiliations are then used as "labels" alongside the original labels to train the joint model. In the end-to-end method, unsupervised community learning is folded into the representation learning process through shared layers and task-specific layers, encouraging features common to both tasks and features specific to each task at the same time. Experimental results on three real-world benchmark networks demonstrate that our framework improves the vanilla models without any additional labels, especially when labels are quite scarce.
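The abstract describes the joint architecture only in prose. Below is a minimal, hypothetical PyTorch sketch of that design: a shared GCN trunk feeding two task-specific heads, trained with a weighted sum of the supervised node-classification loss and the community pseudo-task loss. All names here (DenseGCNLayer, PMNRLSketch, joint_loss, lambda_pseudo) and the one-shared-layer/one-head-layer sizing are illustrative assumptions, not the authors' released code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class DenseGCNLayer(nn.Module):
    """One graph convolution on a dense normalized adjacency
    A_hat = D^{-1/2} (A + I) D^{-1/2}: propagate, then transform."""

    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)

    def forward(self, a_hat, x):
        # Neighborhood aggregation followed by a learned linear map.
        return self.linear(a_hat @ x)


class PMNRLSketch(nn.Module):
    """Shared GCN trunk with two task-specific heads: one for the
    original node-label task, one for the community pseudo task."""

    def __init__(self, in_dim, hid_dim, n_classes, n_communities):
        super().__init__()
        self.shared = DenseGCNLayer(in_dim, hid_dim)            # shared layer
        self.label_head = DenseGCNLayer(hid_dim, n_classes)     # original task
        self.comm_head = DenseGCNLayer(hid_dim, n_communities)  # pseudo task

    def forward(self, a_hat, x):
        h = F.relu(self.shared(a_hat, x))
        return self.label_head(a_hat, h), self.comm_head(a_hat, h)


def joint_loss(label_logits, comm_logits, y_label, y_comm,
               labeled_mask, lambda_pseudo=0.5):
    """Supervised loss on the few labeled nodes, plus a weighted
    pseudo-task loss on all nodes (community affiliations are
    available everywhere, unlike the scarce original labels)."""
    loss_label = F.cross_entropy(label_logits[labeled_mask],
                                 y_label[labeled_mask])
    loss_comm = F.cross_entropy(comm_logits, y_comm)
    return loss_label + lambda_pseudo * loss_comm
```

In a two-stage reading of the framework, y_comm would be fixed community labels computed up front by an off-the-shelf detector (e.g. Louvain); replacing that fixed target with a differentiable community objective trained through the shared layers is what moves the sketch toward the end-to-end variant.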
Pages: 8118-8133 (15 pages)