Node Embeddings and Exact Low-Rank Representations of Complex Networks

Cited: 0
Authors
Chanpuriya, Sudhanshu [1]
Musco, Cameron [1]
Tsourakakis, Charalampos E. [2,3]
Sotiropoulos, Konstantinos [2]
Affiliations
[1] Univ Massachusetts, Amherst, MA 01003 USA
[2] Boston Univ, Boston, MA USA
[3] ISI Fdn, Turin, Italy
Keywords
DIMENSIONALITY REDUCTION; ALGORITHMS; GRAPHS; CUT
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Low-dimensional embeddings, from classical spectral embeddings to modern neural-net-inspired methods, are a cornerstone in the modeling and analysis of complex networks. Recent work by Seshadhri et al. (PNAS 2020) suggests that such embeddings cannot capture local structure arising in complex networks. In particular, they show that any network generated from a natural low-dimensional model cannot be both sparse and have high triangle density (high clustering coefficient), two hallmark properties of many real-world networks. In this work we show that the results of Seshadhri et al. are intimately connected to the model they use rather than the low-dimensional structure of complex networks. Specifically, we prove that a minor relaxation of their model can generate sparse graphs with high triangle density. Surprisingly, we show that this same model leads to exact low-dimensional factorizations of many real-world networks. We give a simple algorithm based on logistic principal component analysis (LPCA) that succeeds in finding such exact embeddings. Finally, we perform a large number of experiments that verify the ability of very low-dimensional embeddings to capture local structure in real-world networks.
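The LPCA approach mentioned in the abstract fits node embedding factors X, Y so that the elementwise sigmoid of X Yᵀ reproduces the adjacency matrix. Below is a minimal NumPy sketch of that idea, assuming a plain gradient-descent fit of the elementwise logistic loss; `lpca_embed` and its hyperparameters are illustrative, not the authors' implementation. For the tiny toy graph the rank is left equal to n so the fit reliably converges; the paper's claim is that real networks admit exact fits at rank far below n.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lpca_embed(A, rank, steps=5000, lr=0.1, seed=0):
    """Fit factors X, Y so that sigmoid(X @ Y.T) approximates the
    adjacency matrix A, via gradient descent on the elementwise
    logistic (cross-entropy) loss. Hypothetical sketch, not the
    authors' code."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    X = 0.1 * rng.standard_normal((n, rank))
    Y = 0.1 * rng.standard_normal((n, rank))
    for _ in range(steps):
        # Gradient of the logistic loss with respect to the logits X @ Y.T
        G = sigmoid(X @ Y.T) - A
        X, Y = X - lr * (G @ Y), Y - lr * (G.T @ X)
    return X, Y

# Toy demo: two disjoint triangles on 6 nodes.
A = np.zeros((6, 6))
for u, v in [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5)]:
    A[u, v] = A[v, u] = 1.0

X, Y = lpca_embed(A, rank=6)
recon = (sigmoid(X @ Y.T) > 0.5).astype(float)  # thresholded reconstruction
```

Thresholding the fitted probabilities at 0.5 recovers the graph when the logits X Yᵀ carry the correct signs, which for this separable toy instance gradient descent reaches after enough steps.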
Pages: 14
Related Papers
50 records
  • [31] On Polynomial Time Methods for Exact Low-Rank Tensor Completion
    Xia, Dong
    Yuan, Ming
    FOUNDATIONS OF COMPUTATIONAL MATHEMATICS, 2019, 19 (06) : 1265 - 1313
  • [32] EXACT RECOVERY OF LOW-RANK PLUS COMPRESSED SPARSE MATRICES
    Mardani, Morteza
    Mateos, Gonzalo
    Giannakis, Georgios B.
    2012 IEEE STATISTICAL SIGNAL PROCESSING WORKSHOP (SSP), 2012, : 49 - 52
  • [33] Learning Edge Representations via Low-Rank Asymmetric Projections
    Abu-El-Haija, Sami
    Perozzi, Bryan
    Al-Rfou, Rami
    CIKM'17: PROCEEDINGS OF THE 2017 ACM CONFERENCE ON INFORMATION AND KNOWLEDGE MANAGEMENT, 2017, : 1787 - 1796
  • [34] Efficient SVM training using low-rank kernel representations
    Fine, S
    Scheinberg, K
    JOURNAL OF MACHINE LEARNING RESEARCH, 2002, 2 (02) : 243 - 264
  • [35] Robust low-rank image representations by deep matrix decompositions
    Yang, Chenxue
    Ye, Mao
    Li, Xudong
    Liu, Zijian
    Tang, Song
    Li, Tao
    ELECTRONICS LETTERS, 2014, 50 (24) : 1843 - U209
  • [36] IMPROVING MULTIFRONTAL METHODS BY MEANS OF BLOCK LOW-RANK REPRESENTATIONS
    Amestoy, Patrick
    Ashcraft, Cleve
    Boiteau, Olivier
    Buttari, Alfredo
    L'Excellent, Jean-Yves
    Weisbecker, Clement
    SIAM JOURNAL ON SCIENTIFIC COMPUTING, 2015, 37 (03): : A1451 - A1474
  • [37] Exact recovery low-rank matrix via transformed affine matrix rank minimization
    Cui, Angang
    Peng, Jigen
    Li, Haiyang
    NEUROCOMPUTING, 2018, 319 : 1 - 12
  • [38] Low-rank lottery tickets: finding efficient low-rank neural networks via matrix differential equations
    Schotthoefer, Steffen
    Zangrando, Emanuele
    Kusch, Jonas
    Ceruti, Gianluca
    Tudisco, Francesco
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022), 2022
  • [39] Neural graph embeddings as explicit low-rank matrix factorization for link prediction
    Agibetov, Asan
    PATTERN RECOGNITION, 2023, 133
  • [40] Learning low-rank latent mesoscale structures in networks
    Lyu, Hanbaek
    Kureh, Yacoub H.
    Vendrow, Joshua
    Porter, Mason A.
    NATURE COMMUNICATIONS, 15