Faster Stochastic Algorithms for Minimax Optimization under Polyak-Lojasiewicz Conditions

Cited by: 0
|
Authors
Chen, Lesi [1]
Yao, Boyuan [1]
Luo, Luo [1]
Affiliations
[1] Fudan Univ, Sch Data Sci, Shanghai, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
This paper considers stochastic first-order algorithms for minimax optimization under Polyak-Lojasiewicz (PL) conditions. We propose SPIDER-GDA for solving the finite-sum problem of the form $\min_x \max_y f(x,y) \triangleq \frac{1}{n}\sum_{i=1}^{n} f_i(x,y)$, where the objective function $f(x,y)$ is $\mu_x$-PL in $x$ and $\mu_y$-PL in $y$, and each $f_i(x,y)$ is $L$-smooth. We prove that SPIDER-GDA finds an $\epsilon$-approximate solution within $\mathcal{O}\big((n + \sqrt{n}\,\kappa_x\kappa_y^2)\log(1/\epsilon)\big)$ stochastic first-order oracle (SFO) complexity, which is better than the state-of-the-art method whose SFO upper bound is $\mathcal{O}\big((n + n^{2/3}\kappa_x\kappa_y^2)\log(1/\epsilon)\big)$, where $\kappa_x \triangleq L/\mu_x$ and $\kappa_y \triangleq L/\mu_y$. For the ill-conditioned case, we provide an accelerated algorithm that reduces the computational cost further, achieving an $\tilde{\mathcal{O}}\big((n + \sqrt{n}\,\kappa_x\kappa_y)\log^2(1/\epsilon)\big)$ SFO upper bound when $\kappa_y \gtrsim \sqrt{n}$. Our ideas can also be applied to the more general setting in which the objective satisfies the PL condition in only one variable. Numerical experiments validate the superiority of the proposed methods.
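To illustrate the setting described in the abstract, the sketch below runs a simplified variance-reduced gradient-descent-ascent loop with a SPIDER-style estimator on a toy finite-sum saddle problem. It is not the paper's SPIDER-GDA (which uses nested loops and its own step-size schedule); the toy objective, step sizes, and epoch length `q` are all illustrative assumptions. The per-sample objectives are strongly convex in $x$ and strongly concave in $y$, hence PL in each variable.

```python
import random

# Toy finite-sum saddle problem with n components:
#   f_i(x, y) = 0.5*a_i*x^2 + c_i*x*y - 0.5*b_i*y^2
# Each f_i is strongly convex in x and strongly concave in y,
# so the average f is mu_x-PL in x and mu_y-PL in y; saddle at (0, 0).
random.seed(0)
n = 50
a = [1.0 + random.random() for _ in range(n)]
b = [1.0 + random.random() for _ in range(n)]
c = [random.uniform(-0.5, 0.5) for _ in range(n)]

def grad_i(i, x, y):
    """Per-sample gradient (df_i/dx, df_i/dy)."""
    return a[i] * x + c[i] * y, c[i] * x - b[i] * y

def full_grad(x, y):
    """Exact gradient of the average f(x, y)."""
    gx = sum(a) / n * x + sum(c) / n * y
    gy = sum(c) / n * x - sum(b) / n * y
    return gx, gy

def spider_gda(x, y, eta_x=0.1, eta_y=0.1, q=10, batch=5, iters=200):
    """Simplified single-loop GDA with a SPIDER variance-reduced estimator."""
    for t in range(iters):
        if t % q == 0:
            # Restart point: evaluate the exact full gradient.
            vx, vy = full_grad(x, y)
        else:
            # SPIDER recursion: correct the previous estimate using
            # gradient differences on a small minibatch.
            dx = dy = 0.0
            for i in random.sample(range(n), batch):
                gx1, gy1 = grad_i(i, x, y)
                gx0, gy0 = grad_i(i, px, py)
                dx += (gx1 - gx0) / batch
                dy += (gy1 - gy0) / batch
            vx, vy = vx + dx, vy + dy
        px, py = x, y          # remember previous iterate for the recursion
        x -= eta_x * vx        # gradient descent on x
        y += eta_y * vy        # gradient ascent on y
    return x, y

x, y = spider_gda(1.0, -1.0)   # converges toward the saddle point (0, 0)
```

The SPIDER recursion keeps the estimator's error proportional to how far the iterates have moved since the last full-gradient restart, which is what lets the method spend $\mathcal{O}(\sqrt{n})$ rather than $\mathcal{O}(n^{2/3})$ samples per effective iteration.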
Pages: 12
Related Papers
50 records in total
  • [21] Distributionally Time-Varying Online Stochastic Optimization under Polyak-Lojasiewicz Condition with Application in Conditional Value-at-Risk Statistical Learning
    Pun, Yuen-Man
    Farokhi, Farhad
    Shames, Iman
    arXiv, 2023,
  • [22] Faster Single-loop Algorithms for Minimax Optimization without Strong Concavity
    Yang, Junchi
    Orvieto, Antonio
    Lucchi, Aurelien
    He, Niao
    INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 151, 2022, 151
  • [23] Sharp Analysis of Stochastic Optimization under Global Kurdyka-Lojasiewicz Inequality
    Fatkhullin, Ilyas
    Etesami, Jalal
    He, Niao
    Kiyavash, Negar
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35, NEURIPS 2022, 2022,
  • [24] Sharp Analysis of Stochastic Optimization under Global Kurdyka-Lojasiewicz Inequality
    Fatkhullin, Ilyas
    Etesami, Jalal
    He, Niao
    Kiyavash, Negar
    arXiv, 2022,
  • [25] Gradient-Free Algorithms for Solving Stochastic Saddle Optimization Problems with the Polyak–Łojasiewicz Condition
    S. I. Sadykov
    A. V. Lobanov
    A. M. Raigorodskii
    Programming and Computer Software, 2023, 49 : 535 - 547
  • [26] Faster Gradient-Free Algorithms for Nonsmooth Nonconvex Stochastic Optimization
    Chen, Lesi
    Xu, Jing
    Luo, Luo
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 202, 2023, 202
  • [27] OPTIMALITY CONDITIONS AND NUMERICAL ALGORITHMS FOR A CLASS OF LINEARLY CONSTRAINED MINIMAX OPTIMIZATION PROBLEMS
    Dai, Yu-Hong
    Wang, Jiani
    Zhang, Liwei
    SIAM JOURNAL ON OPTIMIZATION, 2024, 34 (03) : 2883 - 2916
  • [28] Convergence of stochastic approximation algorithms under irregular conditions
    Zhang, Jian
    Liang, Faming
    STATISTICA NEERLANDICA, 2008, 62 (03) : 393 - 403
  • [30] Performance analysis of stochastic gradient algorithms under weak conditions
    Feng Ding
    HuiZhong Yang
    Fei Liu
    Science in China Series F: Information Sciences, 2008, 51 : 1269 - 1280