PROXIMAL GRADIENT METHOD FOR NONSMOOTH OPTIMIZATION OVER THE STIEFEL MANIFOLD

Cited by: 74
Authors
Chen, Shixiang [1 ]
Ma, Shiqian [2 ]
So, Anthony Man-Cho [3 ,4 ]
Zhang, Tong [5 ]
Affiliations
[1] Texas A&M Univ, Dept Ind & Syst Engn, College Stn, TX 77843 USA
[2] Univ Calif Davis, Dept Math, Davis, CA 95616 USA
[3] Chinese Univ Hong Kong, Dept Syst Engn & Engn Management, Sha Tin, Hong Kong, Peoples R China
[4] Chinese Univ Hong Kong, CUHK BGI Innovat Inst Trans, Sha Tin, Hong Kong, Peoples R China
[5] Hong Kong Univ Sci & Technol, Clear Water Bay, Hong Kong, Peoples R China
Keywords
manifold optimization; Stiefel manifold; nonsmooth; proximal gradient method; iteration complexity; semismooth Newton method; sparse PCA; compressed modes; AUGMENTED LAGRANGIAN METHOD; LOCALLY LIPSCHITZ FUNCTIONS; LOG-DETERMINANT OPTIMIZATION; GENERALIZED POWER METHOD; SUBGRADIENT ALGORITHM; OPTIMALITY CONDITIONS; POINT ALGORITHM; LINE-SEARCH; MATRIX; CONVERGENCE;
DOI
10.1137/18M122457X
Chinese Library Classification
O29 [Applied Mathematics]
Subject Classification Code
070104
Abstract
We consider optimization problems over the Stiefel manifold whose objective function is the sum of a smooth function and a nonsmooth function. Existing methods for solving such problems fall into three categories. Algorithms in the first category rely on subgradient information of the objective function and thus tend to converge slowly in practice. Algorithms in the second category are proximal point algorithms, whose subproblems can be as difficult as the original problem. Algorithms in the third category are based on operator-splitting techniques, but they usually lack rigorous convergence guarantees. In this paper, we propose a retraction-based proximal gradient method for solving this class of problems. We prove that the proposed method converges globally to a stationary point, and we analyze its iteration complexity for obtaining an ε-stationary solution. Numerical results on sparse PCA and compressed modes problems are reported to demonstrate the advantages of the proposed method.
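To make the abstract's high-level description concrete, below is a minimal NumPy sketch of what one iteration of a retraction-based proximal gradient scheme on the Stiefel manifold can look like, using a sparse-PCA-style objective (a smooth variance term plus an ℓ1 penalty) as the running example. The data, penalty weight, step sizes, and the QR retraction are illustrative assumptions, not the paper's setup; in particular, the paper's method solves the proximal subproblem restricted to the tangent space exactly (via a semismooth Newton method) and chooses the step size by line search, whereas this sketch only approximates that subproblem by projecting a Euclidean proximal step onto the tangent space.

# A minimal sketch (not the paper's implementation) of a retraction-based
# proximal gradient iteration on the Stiefel manifold St(n, p) = {X : X^T X = I},
# illustrated on a sparse-PCA-style objective
#     min_{X^T X = I}  -0.5 * tr(X^T A^T A X) + mu * ||X||_1.
# The paper's tangent-space proximal subproblem is only approximated here
# by projecting a Euclidean proximal step onto the tangent space.
import numpy as np

def soft_threshold(Z, tau):
    # Proximal operator of tau * ||.||_1 (entrywise soft-thresholding).
    return np.sign(Z) * np.maximum(np.abs(Z) - tau, 0.0)

def proj_tangent(X, Z):
    # Orthogonal projection of Z onto the tangent space of St(n, p) at X:
    # P_X(Z) = Z - X * sym(X^T Z).
    S = X.T @ Z
    return Z - X @ ((S + S.T) / 2.0)

def retract_qr(X, V):
    # QR-based retraction: map the tangent step V back onto the manifold.
    Q, R = np.linalg.qr(X + V)
    return Q * np.sign(np.diag(R))  # sign fix keeps the factor well defined

def manifold_prox_grad(A, p, mu=0.5, alpha=1.0, iters=200, seed=0):
    rng = np.random.default_rng(seed)
    n = A.shape[1]
    X, _ = np.linalg.qr(rng.standard_normal((n, p)))    # random feasible start
    AtA = A.T @ A
    t = 1.0 / np.linalg.norm(AtA, 2)                    # 1/L step for the smooth part

    def objective(Y):
        return -0.5 * np.trace(Y.T @ AtA @ Y) + mu * np.abs(Y).sum()

    for _ in range(iters):
        grad = -AtA @ X                                  # gradient of the smooth term
        D = soft_threshold(X - t * grad, t * mu) - X     # Euclidean proximal step
        V = proj_tangent(X, D)                           # approximate search direction
        X = retract_qr(X, alpha * V)                     # stay on the Stiefel manifold
    return X, objective(X)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    A = rng.standard_normal((100, 30))                   # synthetic data matrix
    X, val = manifold_prox_grad(A, p=4)
    print("objective:", val)
    print("feasibility error:", np.linalg.norm(X.T @ X - np.eye(4)))

In the paper's algorithm the search direction is instead the exact minimizer of the proximal subproblem over the tangent space, computed by a semismooth Newton method, and the step size alpha is selected by line search; the projection shortcut above is only meant to convey the overall structure (gradient step, proximal map, tangent projection, retraction).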
Pages: 210-239
Number of pages: 30
Related Papers (50 records in total)
  • [1] Nonsmooth Optimization over the Stiefel Manifold and Beyond: Proximal Gradient Method and Recent Variants
    Chen, Shixiang
    Ma, Shiqian
    So, Anthony Man-Cho
    Zhang, Tong
    [J]. SIAM REVIEW, 2024, 66 (02) : 319 - 352
  • [2] Riemannian Stochastic Proximal Gradient Methods for Nonsmooth Optimization over the Stiefel Manifold
    Wang, Bokun
    Ma, Shiqian
    Xue, Lingzhou
    [J]. JOURNAL OF MACHINE LEARNING RESEARCH, 2022, 23
  • [4] A constraint dissolving approach for nonsmooth optimization over the Stiefel manifold
    Hu, Xiaoyin
    Xiao, Nachuan
    Liu, Xin
    Toh, Kim-Chuan
[J]. IMA JOURNAL OF NUMERICAL ANALYSIS, 2023
  • [6] Proximal Quasi-Newton Method for Composite Optimization over the Stiefel Manifold
    Wang, Qinsi
    Yang, Wei Hong
    [J]. JOURNAL OF SCIENTIFIC COMPUTING, 2023, 95 (02)
  • [8] A Riemannian conjugate gradient method for optimization on the Stiefel manifold
    Zhu, Xiaojing
    [J]. COMPUTATIONAL OPTIMIZATION AND APPLICATIONS, 2017, 67 : 73 - 110
  • [9] A Scaled Gradient Projection Method for Minimization over the Stiefel Manifold
    Oviedo, Harry
    Dalmau, Oscar
    [J]. ADVANCES IN SOFT COMPUTING, MICAI 2019, 2019, 11835 : 239 - 250
  • [10] Transportless conjugate gradient for optimization on Stiefel manifold
    Fuentes Figueroa, Edgar
    Dalmau, Oscar
[J]. COMPUTATIONAL & APPLIED MATHEMATICS, 2020, 39 (03)