Signed Graph Metric Learning via Gershgorin Disc Perfect Alignment

Cited by: 7
Authors
Yang, Cheng [1 ]
Cheung, Gene [2 ]
Hu, Wei [3 ]
Affiliations
[1] Shanghai Jiao Tong Univ, Shanghai 200240, Peoples R China
[2] York Univ, Toronto, ON M3J 1P3, Canada
[3] Peking Univ, Beijing 100871, Peoples R China
Funding
Natural Sciences and Engineering Research Council of Canada; National Natural Science Foundation of China; China Postdoctoral Science Foundation;
Keywords
Graph signal processing; metric learning; Gershgorin circle theorem; convex optimization; DISTANCE;
DOI
10.1109/TPAMI.2021.3091682
CLC classification
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Given a convex and differentiable objective Q(M) for a real symmetric matrix M in the positive definite (PD) cone (used to compute Mahalanobis distances), we propose a fast general metric learning framework that is entirely projection-free. We first assume that M resides in a space S of generalized graph Laplacian matrices corresponding to balanced signed graphs. An M ∈ S that is also PD is called a graph metric matrix. Unlike the low-rank metric matrices common in the literature, S includes the important diagonal-only matrices as a special case. The key theorem that circumvents full eigen-decomposition and enables fast metric matrix optimization is Gershgorin disc perfect alignment (GDPA): given M ∈ S and a diagonal matrix S with S_ii = 1/v_i, where v is the first eigenvector of M, we prove that the Gershgorin disc left-ends of the similarity transform B = SMS^{-1} are perfectly aligned at the smallest eigenvalue λ_min. Using this theorem, we replace the PD cone constraint in the metric learning problem with the tightest possible linear constraints per iteration, so that the alternating optimization of the diagonal / off-diagonal terms in M can be solved efficiently as linear programs via the Frank-Wolfe method. We update v using Locally Optimal Block Preconditioned Conjugate Gradient (LOBPCG) with warm start as entries in M are optimized successively. Experiments show that our graph metric optimization is significantly faster than cone-projection schemes, and produces competitive binary classification performance.
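The GDPA property stated in the abstract can be checked numerically. The sketch below (assuming NumPy; the 3-node graph and its edge weights are invented purely for illustration) builds a small balanced signed-graph Laplacian M, forms S_ii = 1/v_i from its first eigenvector, and verifies that every Gershgorin disc left-end of B = SMS^{-1} lands at λ_min.

```python
# Numerical sketch of the GDPA theorem on a toy balanced signed graph.
import numpy as np

# Signed adjacency: nodes 0,1 linked positively; node 2 linked negatively
# to both, so the graph is balanced (partition {0,1} vs {2}).
W = np.array([[ 0.0,  1.0, -0.5],
              [ 1.0,  0.0, -0.5],
              [-0.5, -0.5,  0.0]])
D = np.diag(np.abs(W).sum(axis=1))   # degree matrix from |weights|
M = D - W + 0.1 * np.eye(3)          # generalized graph Laplacian, PD here

# First eigenvector v (eigenvector of the smallest eigenvalue lam_min);
# for a balanced signed graph it has no zero entries.
eigvals, eigvecs = np.linalg.eigh(M)
lam_min, v = eigvals[0], eigvecs[:, 0]

# GDPA: with S = diag(1/v_i), the similarity transform B = S M S^{-1}
# has every Gershgorin disc left-end exactly at lam_min.
S = np.diag(1.0 / v)
B = S @ M @ np.linalg.inv(S)

# Disc left-end for row i: B_ii - sum_{j != i} |B_ij|.
radii = np.abs(B).sum(axis=1) - np.abs(np.diag(B))
left_ends = np.diag(B) - radii
print(np.allclose(left_ends, lam_min))   # prints True
```

Why this works: S^{-1}·1 = v, so B·1 = S M v = λ_min·1, i.e., every row of B sums to λ_min; balance of the signed graph makes every off-diagonal of B non-positive, so each row's left-end B_ii − Σ_{j≠i}|B_ij| equals its row sum λ_min.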
Pages: 7219 - 7234
Page count: 16
Related papers
50 items in total
  • [41] On Euclidean Metric Approximation via Graph Cuts
    Danek, Ondrej
    Matula, Pavel
    COMPUTER VISION, IMAGING AND COMPUTER GRAPHICS: THEORY AND APPLICATIONS, 2011, 229 : 125 - 134
  • [42] Deep Network Embedding for Graph Representation Learning in Signed Networks
    Shen, Xiao
    Chung, Fu-Lai
    IEEE TRANSACTIONS ON CYBERNETICS, 2020, 50 (04) : 1556 - 1568
  • [43] Learning Weight Signed Network Embedding with Graph Neural Networks
    Lu, Zekun
    Yu, Qiancheng
    Li, Xia
    Li, Xiaoning
    Yang, Qinwen
    DATA SCIENCE AND ENGINEERING, 2023, 8 (01) : 36 - 46
  • [44] Whole-Graph Representation Learning for the Classification of Signed Networks
    Cecillon, Noe
    Labatut, Vincent
    Dufour, Richard
    Arinik, Nejat
    IEEE ACCESS, 2024, 12 : 151303 - 151316
  • [46] Learning cognitive embedding using signed knowledge interaction graph
    Huo, Yujia
    Wong, Derek F.
    Ni, Lionel M.
    Chao, Lidia S.
    Zhang, Jing
    Zuo, Xin
    KNOWLEDGE-BASED SYSTEMS, 2021, 229
  • [47] Iterative Graph Alignment via Supermodular Approximation
    Konar, Aritra
    Sidiropoulos, Nicholas D.
    2019 19TH IEEE INTERNATIONAL CONFERENCE ON DATA MINING (ICDM 2019), 2019, : 1162 - 1167
  • [48] Distance metric learning for graph structured data
    Yoshida, Tomoki
    Takeuchi, Ichiro
    Karasuyama, Masayuki
    MACHINE LEARNING, 2021, 110 (07) : 1765 - 1811
  • [50] Tree Structure-Aware Graph Representation Learning via Integrated Hierarchical Aggregation and Relational Metric Learning
    Qiao, Ziyue
    Wang, Pengyang
    Fu, Yanjie
    Du, Yi
    Wang, Pengfei
    Zhou, Yuanchun
    20TH IEEE INTERNATIONAL CONFERENCE ON DATA MINING (ICDM 2020), 2020, : 432 - 441