ON THE CONVERGENCE OF MIRROR DESCENT BEYOND STOCHASTIC CONVEX PROGRAMMING

Cited by: 15
Authors:
Zhou, Zhengyuan [1 ,2 ]
Mertikopoulos, Panayotis [3 ]
Bambos, Nicholas [4 ]
Boyd, Stephen P. [4 ]
Glynn, Peter W. [4 ]
Affiliations:
[1] IBM Res, Yorktown Hts, NY 10598 USA
[2] NYU, Stern Sch Business, 550 1st Ave, New York, NY 10012 USA
[3] Univ Grenoble Alpes, CNRS, Grenoble INP, Inria, LIG, F-38000 Grenoble, France
[4] Stanford Univ, Dept Elect Engn, Dept Management Sci & Engn, Stanford, CA 94305 USA
Keywords:
mirror descent; nonconvex programming; stochastic optimization; stochastic approximation; variational coherence; exponentiated gradient; algorithm
DOI:
10.1137/17M1134925
Chinese Library Classification: O29 [Applied Mathematics]
Subject Classification Code: 070104
Abstract:
In this paper, we examine the convergence of mirror descent in a class of stochastic optimization problems that are not necessarily convex (or even quasi-convex) and which we call variationally coherent. Since the standard technique of "ergodic averaging" offers no tangible benefits beyond convex programming, we focus directly on the algorithm's last generated sample (its "last iterate"), and we show that it converges with probability 1 if the underlying problem is coherent. We further consider a localized version of variational coherence which ensures local convergence of stochastic mirror descent (SMD) with high probability. These results contribute to the landscape of nonconvex stochastic optimization by showing that (quasi-)convexity is not essential for convergence to a global minimum: rather, variational coherence, a much weaker requirement, suffices. Finally, building on the above, we reveal an interesting insight regarding the convergence speed of SMD: in problems with sharp minima (such as generic linear programs or concave minimization problems), SMD reaches a minimum point in a finite number of steps (a.s.), even in the presence of persistent gradient noise. This result is to be contrasted with existing black-box convergence rate estimates that are only asymptotic.
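As a concrete illustration of the setting described in the abstract, the sketch below runs stochastic mirror descent with the entropic mirror map (the exponentiated-gradient update referenced in the keywords) on a noisy linear program over the probability simplex, a problem whose minimum is "sharp" in the abstract's sense. This is a minimal sketch: the objective c, the noise level, and the step-size schedule are illustrative choices of ours, not taken from the paper.

    import numpy as np

    rng = np.random.default_rng(0)

    def mirror_step(x, g_hat, gamma):
        # One SMD step with the entropic mirror map on the simplex:
        # the exponentiated-gradient update x <- x * exp(-gamma * g_hat),
        # renormalized so the iterate stays on the simplex.
        z = np.log(x) - gamma * g_hat   # step in the dual (gradient) space
        z -= z.max()                    # stabilize the exponential
        w = np.exp(z)
        return w / w.sum()              # mirror back onto the simplex

    # Illustrative problem (our choice): min <c, x> over the simplex.
    # Its minimum sits at the vertex e_j with j = argmin_i c_i, a "sharp"
    # minimum in the sense used in the abstract.
    c = np.array([0.9, 0.3, 0.7, 0.5])
    grad_oracle = lambda x: c + 0.5 * rng.standard_normal(c.size)  # unbiased, noisy

    x = np.full(c.size, 1.0 / c.size)   # start at the barycenter
    for t in range(20000):
        gamma = 1.0 / (t + 1) ** 0.8    # sum(gamma) = inf, sum(gamma**2) < inf
        x = mirror_step(x, grad_oracle(x), gamma)

    print(np.round(x, 3))  # the last iterate concentrates on argmin c (index 1)

Despite the persistent gradient noise, it is the last iterate (not an ergodic average) that converges here, which is the quantity the paper's almost-sure convergence result concerns.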
Pages: 687-716 (30 pages)
Related Papers (50 records in total):
  • [1] A CHARACTERIZATION OF STOCHASTIC MIRROR DESCENT ALGORITHMS AND THEIR CONVERGENCE PROPERTIES
    Azizan, Navid
    Hassibi, Babak
    [J]. 2019 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2019, : 5167 - 5171
  • [2] Optimal distributed stochastic mirror descent for strongly convex optimization
    Yuan, Deming
    Hong, Yiguang
    Ho, Daniel W. C.
    Jiang, Guoping
    [J]. AUTOMATICA, 2018, 90 : 196 - 203
  • [3] Algorithms of Inertial Mirror Descent in Convex Problems of Stochastic Optimization
    Nazin, A. V.
    [J]. AUTOMATION AND REMOTE CONTROL, 2018, 79 (01) : 78 - 88
  • [4] Stochastic Mirror Descent Dynamics and Their Convergence in Monotone Variational Inequalities
    Mertikopoulos, Panayotis
    Staudigl, Mathias
    [J]. JOURNAL OF OPTIMIZATION THEORY AND APPLICATIONS, 2018, 179 (03) : 838 - 867
  • [5] Event-Triggered Distributed Stochastic Mirror Descent for Convex Optimization
    Xiong, Menghui
    Zhang, Baoyong
    Ho, Daniel W. C.
    Yuan, Deming
    Xu, Shengyuan
    [J]. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2023, 34 (09) : 6480 - 6491
  • [6] Almost sure convergence of stochastic composite objective mirror descent for non-convex non-smooth optimization
    Liang, Yuqing
    Xu, Dongpo
    Zhang, Naimin
    Mandic, Danilo P.
    [J]. OPTIMIZATION LETTERS, 2023, 18 (09) : 2113 - 2131
  • [7] On stochastic mirror descent with interacting particles: Convergence properties and variance reduction
    Borovykh, A.
    Kantas, N.
    Parpas, P.
    Pavliotis, G. A.
    [J]. PHYSICA D-NONLINEAR PHENOMENA, 2021, 418
  • [8] On the Convergence of (Stochastic) Gradient Descent with Extrapolation for Non-Convex Minimization
    Xu, Yi
    Yuan, Zhuoning
    Yang, Sen
    Jin, Rong
    Yang, Tianbao
    [J]. PROCEEDINGS OF THE TWENTY-EIGHTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2019, : 4003 - 4009