ACCELERATED STOCHASTIC ALGORITHMS FOR NONCONVEX FINITE-SUM AND MULTIBLOCK OPTIMIZATION

Cited by: 6
Authors
Lan, Guanghui [1 ]
Yang, Yu [1 ]
Affiliations
[1] Georgia Inst Technol, H Milton Stewart Sch Ind & Syst Engn, Atlanta, GA 30332 USA
Funding
National Science Foundation (USA);
Keywords
nonconvex optimization; stochastic algorithms; acceleration; finite-sum optimization; multiblock optimization; VARIABLE SELECTION; VARIANCE REDUCTION;
DOI
10.1137/18M1192536
Chinese Library Classification
O29 [Applied Mathematics];
Subject Classification Code
070104;
Abstract
In this paper, we present new stochastic methods for solving two important classes of nonconvex optimization problems. We first introduce a randomized accelerated proximal gradient (RapGrad) method for solving a class of nonconvex optimization problems whose objective function is the summation of m components, and show that it can significantly reduce the number of gradient computations, especially when the condition number L/μ (i.e., the ratio between the Lipschitz constant and the negative curvature) is large. More specifically, RapGrad can save up to O(√m) gradient computations over existing batch nonconvex accelerated gradient methods. Moreover, the number of gradient computations required by RapGrad can be O(m^{1/6}L^{1/2}/μ^{1/2}) (at least O(m^{2/3})) times smaller than that of the best-known randomized nonconvex gradient methods when L/μ ≥ m. Inspired by RapGrad, we also develop a new randomized accelerated proximal dual (RapDual) method for solving a class of multiblock nonconvex optimization problems with linearly coupled constraints and certain special structural properties. We demonstrate that RapDual can also save up to a factor of O(√m) block updates over its batch counterpart, where m denotes the number of blocks. To the best of our knowledge, all these complexity results associated with RapGrad and RapDual seem to be new in the literature. We also illustrate the potential advantages of these algorithms through preliminary numerical experiments.
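A quick sanity check of the stated savings factors (a sketch under assumed baseline complexities, not the paper's precise bounds, which also carry accuracy and logarithmic terms): suppose a batch accelerated gradient method needs on the order of m√(L/μ) component-gradient evaluations, RapGrad needs on the order of √(mL/μ), and the best randomized variance-reduced baseline needs on the order of m^{2/3}(L/μ). Then the two ratios are

\[
\frac{m\sqrt{L/\mu}}{\sqrt{m\,L/\mu}} = \sqrt{m},
\qquad
\frac{m^{2/3}\,(L/\mu)}{\sqrt{m\,L/\mu}} = m^{1/6}\sqrt{L/\mu} \;\ge\; m^{2/3}
\quad \text{whenever } L/\mu \ge m,
\]

which reproduces both the O(√m) saving over batch methods and the O(m^{1/6}L^{1/2}/μ^{1/2}) (at least O(m^{2/3})) factor over randomized methods quoted in the abstract.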
Pages: 2753-2784
Number of pages: 32