Low-Rank Extragradient Method for Nonsmooth and Low-Rank Matrix Optimization Problems

Cited by: 0
Authors
Garber, Dan [1 ]
Kaplan, Atara [1 ]
Affiliations
[1] Technion Israel Inst Technol, IL-3200003 Haifa, Israel
Funding
Israel Science Foundation
Keywords
BOUNDS
DOI
Not available
CLC Classification Number
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Low-rank and nonsmooth matrix optimization problems capture many fundamental tasks in statistics and machine learning. While significant progress has been made in recent years in developing efficient methods for smooth low-rank optimization problems that avoid maintaining high-rank matrices and computing expensive high-rank SVDs, advances for nonsmooth problems have been slow-paced. In this paper we consider standard convex relaxations for such problems. Mainly, we prove that under a natural generalized strict complementarity condition and under the relatively mild assumption that the nonsmooth objective can be written as a maximum of smooth functions, the extragradient method, when initialized with a "warm-start" point, converges to an optimal solution with rate O(1/t) while requiring only two low-rank SVDs per iteration. We give a precise trade-off between the rank of the SVDs required and the radius of the ball in which we need to initialize the method. We support our theoretical results with empirical experiments on several nonsmooth low-rank matrix recovery tasks, demonstrating that, with simple initializations, the extragradient method produces exactly the same iterates when full-rank SVDs are replaced with SVDs whose rank matches the rank of the (low-rank) ground-truth matrix to be recovered.
Pages: 13
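For intuition about the iteration described in the abstract, the following is a minimal NumPy sketch of one extragradient step over a nuclear-norm ball in which both projections are computed from rank-r truncated SVDs, so each iteration uses exactly two low-rank SVDs. The gradient oracle grad, the step size eta, the radius tau, and all helper names are illustrative assumptions, not the authors' implementation; the truncated projection coincides with the exact Euclidean projection only when the exact projection has rank at most r, which is the regime the warm-start analysis targets.

import numpy as np

def project_simplex_cap(s, tau):
    """Project a nonnegative vector s onto {x : x >= 0, sum(x) <= tau}."""
    if s.sum() <= tau:
        return s
    u = np.sort(s)[::-1]                      # entries in descending order
    css = np.cumsum(u)
    k = np.arange(1, len(u) + 1)
    rho = np.nonzero(u - (css - tau) / k > 0)[0][-1]
    theta = (css[rho] - tau) / (rho + 1)
    return np.maximum(s - theta, 0.0)

def project_nuclear_ball_rank_r(X, tau, r):
    """Projection of X onto {Z : ||Z||_* <= tau} computed from a rank-r SVD.
    Equals the exact projection only when that projection has rank <= r."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)  # stand-in for a fast rank-r SVD routine
    U, s, Vt = U[:, :r], s[:r], Vt[:r, :]
    return U @ np.diag(project_simplex_cap(s, tau)) @ Vt

def extragradient_step(X, grad, eta, tau, r):
    """One extragradient iteration: an extrapolation step and an update step,
    each needing a single rank-r SVD, i.e., two low-rank SVDs per iteration."""
    Y = project_nuclear_ball_rank_r(X - eta * grad(X), tau, r)       # extrapolation point
    X_next = project_nuclear_ball_rank_r(X - eta * grad(Y), tau, r)  # update with gradient at Y
    return X_next, Y

In the paper's setting the nonsmooth objective is a maximum of smooth functions and the method is analyzed through the associated saddle-point formulation; in this sketch grad simply stands for a gradient oracle of whichever smooth surrogate is evaluated at the current point.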