Riemannian Stochastic Proximal Gradient Methods for Nonsmooth Optimization over the Stiefel Manifold

Cited by: 0
Authors
Wang, Bokun [1]
Ma, Shiqian [2]
Xue, Lingzhou [3]
Affiliations
[1] Department of Computer Science, The University of Iowa, Iowa City, IA 52242, United States
[2] Department of Mathematics, University of California, One Shields Avenue, Davis, CA 95616, United States
[3] Department of Statistics, Pennsylvania State University, University Park, PA 16802, United States
Funding
U.S. National Science Foundation;
Keywords
Optimization; Geometry; Stochastic systems; Machine learning
DOI
Not available
Abstract
Riemannian optimization has drawn considerable attention due to its wide range of practical applications. Riemannian stochastic first-order algorithms have been studied in the literature for solving large-scale machine learning problems over Riemannian manifolds. However, most existing Riemannian stochastic algorithms require the objective function to be differentiable, and they do not apply when the objective function is nonsmooth. In this paper, we present two Riemannian stochastic proximal gradient methods for minimizing nonsmooth functions over the Stiefel manifold. The two methods, named R-ProxSGD and R-ProxSPB, generalize proximal SGD and proximal SpiderBoost from the Euclidean setting to the Riemannian setting. An analysis of the incremental first-order oracle (IFO) complexity of the proposed algorithms is provided. Specifically, the R-ProxSPB algorithm finds an ϵ-stationary point with O(ϵ^{-3}) IFOs in the online case, and O(n + √n ϵ^{-2}) IFOs in the finite-sum case, where n is the number of summands in the objective. Experimental results on online sparse PCA and robust low-rank matrix completion show that our proposed methods significantly outperform existing methods that use Riemannian subgradient information. © 2022 Bokun Wang, Shiqian Ma, Lingzhou Xue.
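To make the setting concrete, the following is a minimal Python/NumPy sketch of the geometric ingredients such methods rely on (tangent-space projection of a stochastic gradient and a QR retraction on the Stiefel manifold), applied to a toy online sparse-PCA-style objective. All problem data, step sizes, and the function names are illustrative assumptions; the paper's R-ProxSGD and R-ProxSPB instead solve a tangent-space Riemannian proximal subproblem at each iteration, which is not reproduced here, and the ambient soft-thresholding step below is only a heuristic stand-in for that subproblem.

import numpy as np

def proj_stiefel_tangent(X, G):
    # Orthogonal projection of an ambient (Euclidean) gradient G onto the
    # tangent space of the Stiefel manifold at X: G - X * sym(X^T G).
    XtG = X.T @ G
    return G - X @ ((XtG + XtG.T) / 2.0)

def qr_retraction(Y):
    # Map an ambient point Y back onto the Stiefel manifold via thin QR,
    # with a sign fix on the diagonal of R to make the factorization unique.
    Q, R = np.linalg.qr(Y)
    d = np.sign(np.diag(R))
    d[d == 0] = 1.0
    return Q * d

def soft_threshold(X, tau):
    # Proximal operator of tau * ||X||_1 (the l1 penalty used in sparse PCA).
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

# Toy online sparse-PCA-style objective over St(d, r):
#   minimize  -(1/2) E ||a_i^T X||^2 + mu * ||X||_1   subject to  X^T X = I.
rng = np.random.default_rng(0)
d, r, batch, mu, step = 20, 3, 8, 0.1, 0.1
X = qr_retraction(rng.standard_normal((d, r)))     # random feasible start
A = rng.standard_normal((batch, d))                 # mini-batch of samples a_i

euclid_grad = -(A.T @ (A @ X)) / batch              # stochastic Euclidean gradient
riem_grad = proj_stiefel_tangent(X, euclid_grad)    # Riemannian stochastic gradient
V = -step * riem_grad                               # tangent-space gradient step
Y = soft_threshold(X + V, step * mu)                # heuristic ambient proximal step
X_next = qr_retraction(Y)                           # retract back onto the manifold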
Related papers
50 records in total
  • [1] Riemannian Stochastic Proximal Gradient Methods for Nonsmooth Optimization over the Stiefel Manifold
    Wang, Bokun
    Ma, Shiqian
    Xue, Lingzhou
    [J]. JOURNAL OF MACHINE LEARNING RESEARCH, 2022, 23
  • [2] PROXIMAL GRADIENT METHOD FOR NONSMOOTH OPTIMIZATION OVER THE STIEFEL MANIFOLD
    Chen, Shixiang
    Ma, Shiqian
    So, Anthony Man-Cho
    Zhang, Tong
    [J]. SIAM JOURNAL ON OPTIMIZATION, 2020, 30 (01) : 210 - 239
  • [3] Nonsmooth Optimization over the Stiefel Manifold and Beyond: Proximal Gradient Method and Recent Variants
    Chen, Shixiang
    Ma, Shiqian
    So, Anthony Man-Cho
    Zhang, Tong
    [J]. SIAM REVIEW, 2024, 66 (02) : 319 - 352
  • [5] A Riemannian conjugate gradient method for optimization on the Stiefel manifold
    Xiaojing Zhu
    [J]. Computational Optimization and Applications, 2017, 67 : 73 - 110
  • [6] A constraint dissolving approach for nonsmooth optimization over the Stiefel manifold
    Hu, Xiaoyin
    Xiao, Nachuan
    Liu, Xin
    Toh, Kim-Chuan
    [J]. IMA JOURNAL OF NUMERICAL ANALYSIS, 2023
  • [7] RIEMANNIAN OPTIMIZATION ON THE SYMPLECTIC STIEFEL MANIFOLD
    Gao, Bin
    Nguyen Thanh Son
    Absil, P-A
    Stykel, Tatjana
    [J]. SIAM JOURNAL ON OPTIMIZATION, 2021, 31 (02) : 1546 - 1575
  • [8] Decentralized Riemannian Gradient Descent on the Stiefel Manifold
    Chen, Shixiang
    Garcia, Alfredo
    Hong, Mingyi
    Shahrampour, Shahin
    [J]. INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 139, 2021, 139
  • [9] WEAKLY CONVEX OPTIMIZATION OVER STIEFEL MANIFOLD USING RIEMANNIAN SUBGRADIENT-TYPE METHODS
    Li, Xiao
    Chen, Shixiang
    Deng, Zengde
    Qu, Qing
    Zhu, Zhihui
    So, Anthony Man-Cho
    [J]. SIAM JOURNAL ON OPTIMIZATION, 2021, 31 (03) : 1605 - 1634
  • [10] Faster Gradient-Free Proximal Stochastic Methods for Nonconvex Nonsmooth Optimization
    Huang, Feihu
    Gu, Bin
    Huo, Zhouyuan
    Chen, Songcan
    Huang, Heng
    [J]. THIRTY-THIRD AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE / THIRTY-FIRST INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE / NINTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2019, : 1503 - 1510