Efficient Network Representation Learning via Cluster Similarity

Cited by: 0
Authors
Yasuhiro Fujiwara
Yasutoshi Ida
Atsutoshi Kumagai
Masahiro Nakano
Akisato Kimura
Naonori Ueda
Affiliations
[1] NTT Communication Science Laboratories
Keywords
Efficient; Algorithm; Network representation learning; Graph clustering
Abstract
Network representation learning is a de facto tool for graph analytics. The mainstream of previous approaches is to factorize the proximity matrix between nodes. However, if $n$ is the number of nodes, the proximity matrix has size $n \times n$, so these approaches need $O(n^3)$ time and $O(n^2)$ space to perform network representation learning, which is prohibitively expensive for large-scale graphs. This paper introduces the novel idea of using similarities between clusters instead of proximities between nodes; the proposed approach computes the representations of the clusters from similarities between clusters and computes the representations of nodes by referring to them.
If $l$ is the number of clusters, since $l \ll n$, we can efficiently obtain the representations of clusters from a small $l \times l$ similarity matrix. Furthermore, since nodes in each cluster share similar structural properties, we can effectively compute the representation vectors of nodes. Experiments show that our approach can perform network representation learning more efficiently and effectively than existing approaches.
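The cluster-similarity idea can be sketched in code. This is a minimal, hypothetical illustration, not the paper's actual algorithm: it assumes a clustering of the nodes is already given, uses normalized inter-cluster edge counts as the $l \times l$ similarity matrix, and lets each node refer to its cluster's row of that matrix as its representation vector. The function name and inputs are invented for this sketch.

```python
def cluster_similarity_embeddings(edges, clusters):
    """edges: list of (u, v) node pairs; clusters: dict node -> cluster id.
    Returns dict node -> l-dimensional representation (list of floats)."""
    labels = sorted(set(clusters.values()))
    index = {c: i for i, c in enumerate(labels)}
    l = len(labels)
    # Build the l x l cluster-similarity matrix from inter-cluster edge counts.
    sim = [[0.0] * l for _ in range(l)]
    for u, v in edges:
        cu, cv = index[clusters[u]], index[clusters[v]]
        sim[cu][cv] += 1.0
        sim[cv][cu] += 1.0
    # Row-normalize so each cluster's representation sums to one.
    for row in sim:
        s = sum(row)
        if s > 0:
            for j in range(l):
                row[j] /= s
    # Each node refers to the representation of its cluster.
    return {v: sim[index[c]] for v, c in clusters.items()}
```

Because the similarity matrix is only $l \times l$ and each node merely looks up its cluster's row, the cost of this step is independent of $n^2$; a real implementation would additionally factorize the small matrix (e.g. by truncated SVD) to obtain low-dimensional cluster representations.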
Pages: 279 - 291
Number of pages: 12
Related papers
(50 items)
  • [11] Target Aware Network Adaptation for Efficient Representation Learning
    Zhong, Yang
    Li, Vladimir
    Okada, Ryuzo
    Maki, Atsuto
    COMPUTER VISION - ECCV 2018 WORKSHOPS, PT IV, 2019, 11132 : 450 - 467
  • [12] Efficient Representation Learning via Adaptive Context Pooling
    Huang, Chen
    Talbott, Walter
    Jaitly, Navdeep
    Susskind, Josh
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 162, 2022
  • [13] Region Similarity Representation Learning
    Xiao, Tete
    Reed, Colorado J.
    Wang, Xiaolong
    Keutzer, Kurt
    Darrell, Trevor
    2021 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV 2021), 2021, : 10519 - 10528
  • [14] Local similarity preserved hashing learning via Markov graph for efficient similarity search
    Liu, Hong
    Jiang, Aiwen
    Wang, Mingwen
    Wan, Jianyi
    NEUROCOMPUTING, 2015, 159 : 144 - 150
  • [15] Dynamic Influence Maximization via Network Representation Learning
    Sheng, Wei
    Song, Wenbo
    Li, Dong
    Yang, Fei
    Zhang, Yatao
    FRONTIERS IN PHYSICS, 2022, 9
  • [16] Learning flexible network representation via anonymous walks
    Wang, Yu
    Hu, Liang
    Gao, Wanfu
    KNOWLEDGE-BASED SYSTEMS, 2021, 222
  • [17] Denoising cosine similarity: A theory-driven approach for efficient representation learning
    Nakagawa, Takumi
    Sanada, Yutaro
    Waida, Hiroki
    Zhang, Yuhui
    Wada, Yuichiro
    Takanashi, Kosaku
    Yamada, Tomonori
    Kanamori, Takafumi
    NEURAL NETWORKS, 2024, 169 : 226 - 241
  • [18] Prototype Similarity Distillation for Communication-Efficient Federated Unsupervised Representation Learning
    Zhang, Chen
    Xie, Yu
    Chen, Tingbin
    Mao, Wenjie
    Yu, Bin
    IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 2024, 36 (11) : 6865 - 6876
  • [19] Learning Network Representation via Ego-Network-Level Relationship
    Yan, Bencheng
    Huang, Shenglei
    NEURAL INFORMATION PROCESSING (ICONIP 2019), PT IV, 2019, 1142 : 414 - 422
  • [20] Network Completion via Joint Node Clustering and Similarity Learning
    Rafailidis, Dimitrios
    Crestani, Fabio
    PROCEEDINGS OF THE 2016 IEEE/ACM INTERNATIONAL CONFERENCE ON ADVANCES IN SOCIAL NETWORKS ANALYSIS AND MINING ASONAM 2016, 2016, : 63 - 68