Faster First-Order Methods for Stochastic Non-Convex Optimization on Riemannian Manifolds

Cited by: 13
Authors
Zhou, Pan [1 ]
Yuan, Xiao-Tong [2 ]
Yan, Shuicheng [1 ]
Feng, Jiashi [1 ]
Affiliations
[1] Natl Univ Singapore, Dept Elect & Comp Engn, Singapore, Singapore
[2] Nanjing Univ Informat Sci & Technol, Sch Automat, Nanjing 210044, Peoples R China
Keywords
Optimization; Complexity theory; Manifolds; Convergence; Signal processing algorithms; Stochastic processes; Minimization; Riemannian optimization; stochastic variance-reduced algorithm; non-convex optimization; online learning; ILLUMINATION; COMPLETION
DOI
10.1109/TPAMI.2019.2933841
CLC Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
First-order non-convex Riemannian optimization algorithms have gained recent popularity in structured machine learning problems including principal component analysis and low-rank matrix completion. The current paper presents an efficient Riemannian Stochastic Path Integrated Differential EstimatoR (R-SPIDER) algorithm to solve the finite-sum and online Riemannian non-convex minimization problems. At the core of R-SPIDER is a recursive semi-stochastic gradient estimator that can accurately estimate the Riemannian gradient under not only exponential mapping and parallel transport, but also general retraction and vector transport operations. Compared with prior Riemannian algorithms, such a recursive gradient estimation mechanism endows R-SPIDER with lower computational cost in first-order oracle complexity. Specifically, for finite-sum problems with n components, R-SPIDER is proved to converge to an ε-approximate stationary point within O(min(n + √n/ε², 1/ε³)) stochastic gradient evaluations, beating the best-known complexity O(n + 1/ε⁴); for online optimization, R-SPIDER is shown to converge with O(1/ε³) complexity which is, to the best of our knowledge, the first non-asymptotic result for online Riemannian optimization. For the special case of gradient dominated functions, we further develop a variant of R-SPIDER with an improved linear rate of convergence. Extensive experimental results demonstrate the advantage of the proposed algorithms over the state-of-the-art Riemannian non-convex optimization methods.
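The recursive semi-stochastic estimator described in the abstract can be illustrated with a minimal sketch on the unit sphere, using a toy leading-eigenvector problem (minimize f(x) = -(1/n) Σᵢ (aᵢᵀx)² over ‖x‖ = 1). This is not the paper's implementation: the function names, step sizes, and epoch length q are our own choices, projection onto the tangent space stands in for vector transport, and normalization serves as the retraction.

```python
import numpy as np

def proj(x, v):
    # Project v onto the tangent space of the unit sphere at x
    return v - np.dot(x, v) * x

def retract(x, v):
    # Retraction on the sphere: take the step, then renormalize
    y = x + v
    return y / np.linalg.norm(y)

def rspider_leading_eigvec(A_rows, iters=300, q=20, batch=16, lr=0.05, seed=0):
    """Sketch of a SPIDER-style recursive estimator on the sphere:
    minimize f(x) = -(1/n) * sum_i (a_i . x)^2 subject to ||x|| = 1."""
    rng = np.random.default_rng(seed)
    n, d = A_rows.shape
    x = rng.standard_normal(d)
    x /= np.linalg.norm(x)

    def rgrad(x, idx):
        # Riemannian gradient of the sampled components: Euclidean
        # gradient -2 A^T (A x) / |idx|, projected to the tangent space
        g = -2.0 * A_rows[idx].T @ (A_rows[idx] @ x) / len(idx)
        return proj(x, g)

    v = rgrad(x, np.arange(n))                   # full gradient at the anchor point
    for t in range(1, iters + 1):
        x_new = retract(x, -lr * v)              # Riemannian descent step
        if t % q == 0:
            v = rgrad(x_new, np.arange(n))       # periodic full-gradient refresh
        else:
            idx = rng.choice(n, size=batch, replace=False)
            # Recursive update: fresh minibatch gradient plus the transported
            # previous estimate, corrected by the transported old minibatch gradient
            v = rgrad(x_new, idx) - proj(x_new, rgrad(x, idx)) + proj(x_new, v)
        x = x_new
    return x
```

The returned point approximates the leading eigenvector of (1/n)AᵀA. The key line is the recursive update: rather than recomputing a full gradient each step, the previous estimate is transported to the new point and corrected with one minibatch difference, which is what drives the complexity gain claimed in the abstract.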
Pages: 459-472
Page count: 14
Related Papers (50 total)
  • [1] Faster First-Order Methods for Stochastic Non-Convex Optimization on Riemannian Manifolds
    Zhou, Pan
    Yuan, Xiao-Tong
    Feng, Jiashi
    22nd International Conference on Artificial Intelligence and Statistics, Vol 89, 2019, 89: 138-147
  • [2] An Accelerated First-Order Method for Non-convex Optimization on Manifolds
    Criscitiello, Christopher
    Boumal, Nicolas
    Foundations of Computational Mathematics, 2023, 23 (04): 1433-1509
  • [3] Accelerated First-order Methods for Geodesically Convex Optimization on Riemannian Manifolds
    Liu, Yuanyuan
    Shang, Fanhua
    Cheng, James
    Cheng, Hong
    Jiao, Licheng
    Advances in Neural Information Processing Systems 30 (NIPS 2017), 2017, 30
  • [4] Leveraging Non-uniformity in First-order Non-convex Optimization
    Mei, Jincheng
    Gao, Yue
    Dai, Bo
    Szepesvari, Csaba
    Schuurmans, Dale
    International Conference on Machine Learning, Vol 139, 2021, 139
  • [5] Natasha: Faster Non-Convex Stochastic Optimization via Strongly Non-Convex Parameter
    Allen-Zhu, Zeyuan
    International Conference on Machine Learning, Vol 70, 2017, 70
  • [6] First-Order Methods for Fast Feasibility Pursuit of Non-convex QCQPs
    Konar, Aritra
    Sidiropoulos, Nicholas D.
    IEEE Transactions on Signal Processing, 2017, 65 (22): 5927-5941
  • [7] Riemannian Stochastic Recursive Momentum Method for non-Convex Optimization
    Han, Andi
    Gao, Junbin
    Proceedings of the Thirtieth International Joint Conference on Artificial Intelligence, IJCAI 2021, 2021: 2505-2511
  • [8] First-Order Methods for Convex Optimization
    Dvurechensky, Pavel
    Shtern, Shimrit
    Staudigl, Mathias
    EURO Journal on Computational Optimization, 2021, 9
  • [9] Stochastic first-order methods for convex and nonconvex functional constrained optimization
    Boob, Digvijay
    Deng, Qi
    Lan, Guanghui
    Mathematical Programming, 2023, 197: 215-279