Generalization Bounds of Nonconvex-(Strongly)-Concave Stochastic Minimax Optimization

Cited by: 0
Authors
Zhang, Siqi [1 ]
Hu, Yifan [2 ,3 ]
Zhang, Liang [3 ]
He, Niao [3 ]
Affiliations
[1] Johns Hopkins Univ, Baltimore, MD 21218 USA
[2] Ecole Polytech Fed Lausanne, Lausanne, Switzerland
[3] Swiss Fed Inst Technol, Zurich, Switzerland
Funding
Swiss National Science Foundation;
Keywords
SAMPLE AVERAGE APPROXIMATION; COMPLEXITY; STABILITY;
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
This paper studies the generalization performance of algorithms for solving nonconvex-(strongly)-concave (NC-SC / NC-C) stochastic minimax optimization, measured by the stationarity of primal functions. We first establish algorithm-agnostic generalization bounds via uniform convergence between the empirical minimax problem and the population minimax problem. The sample complexities for achieving $\epsilon$-generalization are $\tilde{O}(d\kappa^2\epsilon^{-2})$ and $\tilde{O}(d\epsilon^{-4})$ for the NC-SC and NC-C settings, respectively, where $d$ is the dimension of the primal variable and $\kappa$ is the condition number. We further study algorithm-dependent generalization bounds via stability arguments. In particular, we introduce a novel stability notion for minimax problems and build a connection between stability and generalization. As a result, we establish algorithm-dependent generalization bounds for stochastic gradient descent ascent (SGDA) and the more general class of sampling-determined algorithms (SDA).
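The SGDA algorithm analyzed in the abstract alternates a stochastic gradient descent step on the primal (minimization) variable with a stochastic gradient ascent step on the dual (maximization) variable. A minimal sketch on a hypothetical toy objective, f(x, y) = 0.5x^2 + xy - 0.5y^2, which is strongly concave in y; it is convex rather than nonconvex in x, chosen only so the saddle point (0, 0) is known in closed form:

```python
import numpy as np

# Hypothetical toy objective f(x, y) = 0.5*x^2 + x*y - 0.5*y^2,
# strongly concave in y; small Gaussian noise stands in for
# stochastic (minibatch) gradient estimates.
rng = np.random.default_rng(0)

def grad_x(x, y):
    return x + y + 0.01 * rng.normal()  # noisy partial derivative in x

def grad_y(x, y):
    return x - y + 0.01 * rng.normal()  # noisy partial derivative in y

def sgda(x0, y0, eta_x=0.05, eta_y=0.1, steps=2000):
    """Stochastic gradient descent ascent on f(x, y)."""
    x, y = x0, y0
    for _ in range(steps):
        gx, gy = grad_x(x, y), grad_y(x, y)
        x -= eta_x * gx   # descent step on the primal variable
        y += eta_y * gy   # ascent step on the dual variable
    return x, y

x, y = sgda(3.0, -2.0)  # iterates drift toward the saddle point (0, 0)
```

On this toy problem the primal function is Phi(x) = max_y f(x, y) = x^2, so stationarity of Phi at the returned iterate can be checked directly; the paper's bounds concern how this stationarity transfers from the empirical to the population minimax problem.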
Pages: 31
Related Papers
50 records
  • [1] An Efficient Stochastic Algorithm for Decentralized Nonconvex-Strongly-Concave Minimax Optimization
    Chen, Lesi
    Ye, Haishan
    Luo, Luo
    INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 238, 2024, 238
  • [2] Stochastic Recursive Gradient Descent Ascent for Stochastic Nonconvex-Strongly-Concave Minimax Problems
    Luo, Luo
    Ye, Haishan
    Huang, Zhichao
    Zhang, Tong
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 33, NEURIPS 2020, 2020, 33
  • [3] Finding Second-Order Stationary Points in Nonconvex-Strongly-Concave Minimax Optimization
    Luo, Luo
    Li, Yujun
    Chen, Cheng
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022), 2022
  • [4] Complexity Lower Bounds for Nonconvex-Strongly-Concave Min-Max Optimization
    Li, Haochuan
    Tian, Yi
    Zhang, Jingzhao
    Jadbabaie, Ali
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34
  • [5] SAPD+: An Accelerated Stochastic Method for Nonconvex-Concave Minimax Problems
    Zhang, Xuan
    Aybat, Necdet Serhat
    Gurbuzbalaban, Mert
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022), 2022
  • [6] ALTERNATING PROXIMAL-GRADIENT STEPS FOR (STOCHASTIC) NONCONVEX-CONCAVE MINIMAX PROBLEMS
    Bot, Radu Ioan
    Boehm, Axel
    SIAM JOURNAL ON OPTIMIZATION, 2023, 33 (03) : 1884 - 1913
  • [7] Zeroth-order algorithms for nonconvex-strongly-concave minimax problems with improved complexities
    Wang, Zhongruo
    Balasubramanian, Krishnakumar
    Ma, Shiqian
    Razaviyayn, Meisam
    JOURNAL OF GLOBAL OPTIMIZATION, 2023, 87 (2-4) : 709 - 740
  • [8] Exact Dual Bounds for Some Nonconvex Minimax Quadratic Optimization Problems
    Berezovskyi, O. A.
    CYBERNETICS AND SYSTEMS ANALYSIS, 2021, 57 (01) : 101 - 107
  • [9] DECENTRALIZED GRADIENT DESCENT MAXIMIZATION METHOD FOR COMPOSITE NONCONVEX STRONGLY-CONCAVE MINIMAX PROBLEMS
    Xu, Yangyang
    SIAM JOURNAL ON OPTIMIZATION, 2024, 34 (01) : 1006 - 1044