Stochastic Recursive Gradient Descent Ascent for Stochastic Nonconvex-Strongly-Concave Minimax Problems

Cited by: 0
Authors
Luo, Luo [1 ]
Ye, Haishan [2 ]
Huang, Zhichao [1 ]
Zhang, Tong [1 ]
Affiliations
[1] Hong Kong Univ Sci & Technol, Dept Math, Hong Kong, Peoples R China
[2] Chinese Univ Hong Kong, Shenzhen Res Inst Big Data, Shenzhen, Peoples R China
Keywords
DOI
N/A
CLC number
TP18 [Artificial Intelligence Theory];
Discipline codes
081104; 0812; 0835; 1405;
Abstract
We consider nonconvex-concave minimax optimization problems of the form $\min_x \max_{y \in \mathcal{Y}} f(x, y)$, where $f$ is strongly-concave in $y$ but possibly nonconvex in $x$, and $\mathcal{Y}$ is a convex and compact set. We focus on the stochastic setting, where we can only access an unbiased stochastic gradient estimate of $f$ at each iteration. This formulation includes many machine learning applications as special cases, such as robust optimization and adversarial training. We are interested in finding an $O(\epsilon)$-stationary point of the function $\Phi(\cdot) = \max_{y \in \mathcal{Y}} f(\cdot, y)$. The most popular algorithm to solve this problem is stochastic gradient descent ascent, which requires $O(\kappa^3 \epsilon^{-4})$ stochastic gradient evaluations, where $\kappa$ is the condition number. In this paper, we propose a novel method called Stochastic Recursive gradiEnt Descent Ascent (SREDA), which estimates gradients more efficiently using variance reduction. This method achieves the best known stochastic gradient complexity of $O(\kappa^3 \epsilon^{-3})$, and its dependency on $\epsilon$ is optimal for this problem.
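As a rough illustration of the variance-reduction idea behind SREDA, the Python sketch below implements a SARAH/SPIDER-style recursive gradient estimator inside a single-loop descent-ascent update: every q iterations the estimate is refreshed with a large batch, and in between it is corrected by evaluating the same small minibatch at the current and previous iterates. The toy objective, batch sizes, step sizes, and single-loop structure are assumptions made for illustration only; the paper's actual SREDA additionally runs an inner concave-maximization procedure on y and uses specific parameter settings to reach its stated complexity.

```python
import numpy as np

# Illustrative sketch only (not the paper's exact SREDA): a SARAH/SPIDER-style
# recursive gradient estimator driving a descent step in x and a projected
# ascent step in y.
#
# Assumed toy objective, strongly concave in y:
#   f(x, y) = (1/n) * sum_i [ y * (a_i^T x - b_i) - 0.5 * y^2 ],  y in Y = [-1, 1].

rng = np.random.default_rng(0)
n, d = 1000, 10
A = rng.normal(size=(n, d))
b = A @ rng.normal(size=d) + 0.1 * rng.normal(size=n)

def grads(x, y, idx):
    """Unbiased stochastic gradients (w.r.t. x and y) on the minibatch idx."""
    a_bar = A[idx].mean(axis=0)            # mean feature vector of the batch
    r_bar = (A[idx] @ x - b[idx]).mean()   # mean residual of the batch
    return y * a_bar, r_bar - y

def sreda_style(T=500, q=20, big_batch=256, small_batch=8, lr_x=0.05, lr_y=0.5):
    x, y = np.zeros(d), 0.0
    for t in range(T):
        if t % q == 0:
            # Checkpoint: refresh the gradient estimate with a large batch.
            idx = rng.choice(n, big_batch, replace=False)
            vx, vy = grads(x, y, idx)
        else:
            # Recursive correction: evaluate the SAME small minibatch at the
            # current and previous iterates; the difference has low variance.
            idx = rng.choice(n, small_batch, replace=False)
            gx_new, gy_new = grads(x, y, idx)
            gx_old, gy_old = grads(x_prev, y_prev, idx)
            vx, vy = vx + gx_new - gx_old, vy + gy_new - gy_old
        x_prev, y_prev = x.copy(), y
        x = x - lr_x * vx                             # descent step on x
        y = float(np.clip(y + lr_y * vy, -1.0, 1.0))  # projected ascent on y
    return x, y

x_hat, y_hat = sreda_style()
print("||x_hat|| =", np.linalg.norm(x_hat), " y_hat =", y_hat)
```

The variance reduction comes from reusing the same samples at consecutive iterates: when the iterates move slowly, the correction term is small, so fresh large-batch evaluations are only needed occasionally.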
Pages: 12
Related Papers
50 records in total
  • [1] An Efficient Stochastic Algorithm for Decentralized Nonconvex-Strongly-Concave Minimax Optimization
    Chen, Lesi
    Ye, Haishan
    Luo, Luo
    INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 238, 2024
  • [2] An accelerated first-order regularized momentum descent ascent algorithm for stochastic nonconvex-concave minimax problems
    Zhang, Huiling
    Xu, Zi
    COMPUTATIONAL OPTIMIZATION AND APPLICATIONS, 2025, 90 (02) : 557 - 582
  • [3] DECENTRALIZED GRADIENT DESCENT MAXIMIZATION METHOD FOR COMPOSITE NONCONVEX STRONGLY-CONCAVE MINIMAX PROBLEMS
    Xu, Yangyang
    SIAM JOURNAL ON OPTIMIZATION, 2024, 34 (01) : 1006 - 1044
  • [4] Zeroth-order algorithms for nonconvex-strongly-concave minimax problems with improved complexities
    Wang, Zhongruo
    Balasubramanian, Krishnakumar
    Ma, Shiqian
    Razaviyayn, Meisam
    JOURNAL OF GLOBAL OPTIMIZATION, 2023, 87 (2-4) : 709 - 740
  • [5] Stochastic Smoothed Gradient Descent Ascent for Federated Minimax Optimization
    Shen, Wei
    Huang, Minhui
    Zhang, Jiawei
    Shen, Cong
    INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 238, 2024
  • [6] ALTERNATING PROXIMAL-GRADIENT STEPS FOR (STOCHASTIC) NONCONVEX-CONCAVE MINIMAX PROBLEMS
    Bot, Radu Ioan
    Boehm, Axel
    SIAM JOURNAL ON OPTIMIZATION, 2023, 33 (03) : 1884 - 1913
  • [7] Generalization Bounds of Nonconvex-(Strongly)-Concave Stochastic Minimax Optimization
    Zhang, Siqi
    Hu, Yifan
    Zhang, Liang
    He, Niao
    INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 238, 2024
  • [8] Nonconvex Stochastic Scaled Gradient Descent and Generalized Eigenvector Problems
    Li, Chris Junchi
    Jordan, Michael I.
    UNCERTAINTY IN ARTIFICIAL INTELLIGENCE, 2023, 216 : 1230 - 1240
  • [9] Randomized Stochastic Gradient Descent Ascent
    Sebbouh, Othmane
    Cuturi, Marco
    Peyre, Gabriel
    INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 151, 2022
  • [10] Finding Second-Order Stationary Points in Nonconvex-Strongly-Concave Minimax Optimization
    Luo, Luo
    Li, Yujun
    Chen, Cheng
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022), 2022