Boosting semi-supervised network representation learning with pseudo-multitasking

Cited by: 0
|
Authors
Biao Wang
Zhen Dai
Deshun Kong
Lanlan Yu
Jin Zheng
Ping Li
Affiliations
[1] Southwest Petroleum University, Center for Intelligent and Networked Systems, School of Computer Science
[2] Chinese Research Academy of Environmental Sciences, Institute of Environmental Information
[3] Mianyang Subbranch of Industrial and Commercial Bank of China, School of Information
[4] Huawei Nanjing Research Institute
[5] Southwest Petroleum University
Source
Applied Intelligence | 2022 / Vol. 52
Keywords
Multi-task model; Semi-supervised; Network representation learning; Node classification;
DOI
Not available
Abstract
Semi-supervised network representation learning is becoming a hotspot in the graph mining community; it aims to learn low-dimensional vector representations of vertices using partial label information. In particular, graph neural networks integrate structural information and other side information, such as vertex attributes, to learn node representations. Although existing semi-supervised graph learning performs well with limited labeled data, it is still often hampered when the labeled dataset is very small. To mitigate this issue, we propose PMNRL, a pseudo-multitask learning framework for semi-supervised network representation learning that boosts the expressive power of graph networks such as the vanilla GCN (Graph Convolutional Network) and GAT (Graph Attention Network). In PMNRL, we leverage the community structures in networks to create a pseudo task that classifies nodes’ community affiliation, and we jointly learn the two tasks (i.e., the original task and the pseudo task). The proposed scheme exploits the inherent connection between structural proximity and label similarity to improve performance without resorting to more labels. The framework is implemented in two ways: a two-stage method and an end-to-end method. In the two-stage method, communities are first detected, and the community affiliations are then used as “labels” alongside the original labels to train the joint model. In the end-to-end method, unsupervised community learning is combined into the representation learning process through shared layers and task-specific layers, so that common features and task-specific features are encouraged at the same time. Experimental results on three real-world benchmark networks demonstrate that our framework improves the vanilla models without any additional labels, especially when very few labels are available.
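The joint design described in the abstract can be pictured as a shared graph encoder feeding two classification heads: one for the original node labels and one for community pseudo-labels, trained with a weighted sum of the two cross-entropy losses. The sketch below is a minimal illustration of that idea in plain PyTorch; the class names (`PseudoMultiTaskGCN`, `SimpleGCNLayer`, `joint_loss`), the dense normalized adjacency `A_hat`, the layer sizes, and the loss weight `lambda_aux` are assumptions for illustration only, not the authors' exact implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SimpleGCNLayer(nn.Module):
    """One dense GCN layer: H' = ReLU(A_hat @ H @ W)."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim)

    def forward(self, A_hat, H):
        # A_hat: (N, N) normalized adjacency with self-loops (assumed precomputed)
        return F.relu(A_hat @ self.lin(H))

class PseudoMultiTaskGCN(nn.Module):
    """Shared GCN encoder with two task-specific heads:
    one for the original node labels, one for community pseudo-labels."""
    def __init__(self, in_dim, hid_dim, n_classes, n_communities):
        super().__init__()
        self.shared = SimpleGCNLayer(in_dim, hid_dim)        # shared layer(s)
        self.label_head = nn.Linear(hid_dim, n_classes)      # original task
        self.comm_head = nn.Linear(hid_dim, n_communities)   # pseudo task

    def forward(self, A_hat, X):
        Z = self.shared(A_hat, X)
        return self.label_head(Z), self.comm_head(Z)

def joint_loss(model, A_hat, X, y, train_mask, comm_labels, lambda_aux=0.5):
    """Weighted sum of the supervised loss (labeled nodes only)
    and the community pseudo-task loss (all nodes)."""
    logits_y, logits_c = model(A_hat, X)
    loss_main = F.cross_entropy(logits_y[train_mask], y[train_mask])
    loss_aux = F.cross_entropy(logits_c, comm_labels)  # pseudo-labels, e.g. from community detection
    return loss_main + lambda_aux * loss_aux
```

In the two-stage flavour described in the abstract, `comm_labels` would come from a community detection step (e.g., Louvain) run beforehand; in the end-to-end flavour the community assignment itself is learned jointly with the representations, which this simplified sketch does not attempt.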
Pages: 8118 - 8133
Number of pages: 15
Related papers
50 records in total
  • [21] FairSwiRL: fair semi-supervised classification with representation learning
    Yang, Shuyi
    Cerrato, Mattia
    Ienco, Dino
    Pensa, Ruggero G.
    Esposito, Roberto
    MACHINE LEARNING, 2023, 112 : 3051 - 3076
  • [22] SEMI-SUPERVISED METRIC LEARNING VIA TOPOLOGY REPRESENTATION
    Wang, Q. Y.
    Yuen, P. C.
    Feng, G. C.
    2012 PROCEEDINGS OF THE 20TH EUROPEAN SIGNAL PROCESSING CONFERENCE (EUSIPCO), 2012, : 639 - 643
  • [23] SIMILARITY LEARNING FOR SEMI-SUPERVISED MULTI-CLASS BOOSTING
    Wang, Q. Y.
    Yuen, P. C.
    Feng, G. C.
    2011 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH, AND SIGNAL PROCESSING, 2011, : 2164 - 2167
  • [24] Semi-Supervised Visual Representation Learning for Fashion Compatibility
    Revanur, Ambareesh
    Kumar, Vijay
    Sharma, Deepthi
    15TH ACM CONFERENCE ON RECOMMENDER SYSTEMS (RECSYS 2021), 2021, : 463 - 472
  • [25] Duplicate Image Representation Based on Semi-Supervised Learning
    Chen, Ming
    Yan, Jinghua
    Gao, Tieliang
    Li, Yuhua
    Ma, Huan
    INTERNATIONAL JOURNAL OF GRID AND HIGH PERFORMANCE COMPUTING, 2022, 14 (01)
  • [26] HOLISTIC SEMI-SUPERVISED APPROACHES FOR EEG REPRESENTATION LEARNING
    Zhang, Guangyi
    Etemad, Ali
    2022 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2022, : 1241 - 1245
  • [27] FairSwiRL: fair semi-supervised classification with representation learning
    Yang, Shuyi
    Cerrato, Mattia
    Ienco, Dino
    Pensa, Ruggero G.
    Esposito, Roberto
    MACHINE LEARNING, 2023, 112 (09) : 3051 - 3076
  • [28] A Neural Network for Semi-supervised Learning on Manifolds
    Genkin, Alexander
    Sengupta, Anirvan M.
    Chklovskii, Dmitri
    ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING - ICANN 2019: THEORETICAL NEURAL COMPUTATION, PT I, 2019, 11727 : 375 - 386
  • [29] Pseudo Contrastive Learning for graph-based semi-supervised learning
    Lu, Weigang
    Guan, Ziyu
    Zhao, Wei
    Yang, Yaming
    Lv, Yuanhai
    Xing, Lining
    Yu, Baosheng
    Tao, Dacheng
    NEUROCOMPUTING, 2025, 624
  • [30] Pseudo-label semi-supervised learning for soybean monitoring
    Menezes, Gabriel Kirsten
    Astolfi, Gilberto
    Martins, Jose Augusto Correa
    Tetila, Everton Castelao
    Oliveira Jr, Adair da Silva
    Goncalves, Diogo Nunes
    Marcato Jr, Jose
    Silva, Jonathan Andrade
    Li, Jonathan
    Goncalves, Wesley Nunes
    Pistori, Hemerson
    SMART AGRICULTURAL TECHNOLOGY, 2023, 4