A Unified Generative Adversarial Learning Framework for Improvement of Skip-Gram Network Representation Learning Methods

Cited by: 4
Authors
Wu, Peng [1 ]
Zheng, Conghui [1 ]
Pan, Li [1 ]
Affiliation
[1] Shanghai Jiao Tong Univ, Sch Elect Informat & Elect Engn, Shanghai 200240, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Network representation learning; generative adversarial nets; network embedding; deep learning;
DOI
10.1109/TKDE.2021.3076766
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Network Representation Learning (NRL), which aims to embed nodes into a latent, low-dimensional vector space while preserving certain network properties, facilitates downstream network analysis tasks. Most NRL methods aim to give similar nodes similar representations in the embedding space. Many of them adopt the skip-gram model to achieve this goal by maximizing the predictive probability of the context nodes for each center node. The context nodes are usually determined by a notion of proximity defined on explicit network features. However, such proximities may discard training samples and have limited discriminative power. We propose a general, unified generative adversarial learning framework to address these problems. The framework handles almost all kinds of networks in a unified way, including homogeneous plain networks, attribute-augmented networks, and heterogeneous networks, and it improves the performance of most state-of-the-art skip-gram based NRL methods. Moreover, a unified and general NRL method is derived from the framework, which can learn network representations independently. Extensive experiments on proximity preservation and two network analysis tasks, i.e., link prediction and node classification, demonstrate the superiority and versatility of our framework.
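To make the skip-gram objective described above concrete, the following is a minimal sketch of skip-gram with negative sampling (SGNS) over node sequences, the building block common to skip-gram based NRL methods such as DeepWalk. This is illustrative only, not the paper's adversarial framework; the toy walks, dimensions, and learning rate are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_sgns(walks, num_nodes, dim=16, window=2, neg=3, lr=0.025, epochs=5):
    """Learn node embeddings from node sequences via SGNS.

    For each center node, maximize the predicted probability of its
    context nodes (within `window`) against `neg` random negatives.
    """
    W_in = rng.normal(scale=0.1, size=(num_nodes, dim))  # center vectors
    W_out = np.zeros((num_nodes, dim))                   # context vectors
    for _ in range(epochs):
        for walk in walks:
            for i, center in enumerate(walk):
                lo, hi = max(0, i - window), min(len(walk), i + window + 1)
                for j in range(lo, hi):
                    if j == i:
                        continue
                    # One positive (true context) plus random negatives.
                    targets = [walk[j]] + list(rng.integers(0, num_nodes, neg))
                    labels = [1.0] + [0.0] * neg
                    v = W_in[center]
                    grad_v = np.zeros(dim)
                    for t, label in zip(targets, labels):
                        g = lr * (label - sigmoid(v @ W_out[t]))
                        grad_v += g * W_out[t]
                        W_out[t] += g * v
                    W_in[center] += grad_v
    return W_in

# Toy input: walks over two small node clusters {0,1,2} and {3,4,5},
# standing in for random walks sampled from a network.
walks = [[0, 1, 2, 0, 1], [3, 4, 5, 3, 4], [2, 3, 4]] * 20
emb = train_sgns(walks, num_nodes=6)
print(emb.shape)  # (6, 16)
```

Nodes that frequently co-occur in walks end up with similar vectors; the paper's critique is that the fixed proximity used to pick those context nodes can lose training samples, which its adversarial framework addresses.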
Pages: 45-58 (14 pages)