The First Optimal Algorithm for Smooth and Strongly-Convex-Strongly-Concave Minimax Optimization

Cited: 0
Authors
Kovalev, Dmitry [1 ]
Gasnikov, Alexander [2 ,3 ,4 ]
Affiliations
[1] King Abdullah Univ Sci & Technol, Thuwal, Saudi Arabia
[2] Moscow Inst Phys & Technol, Dolgoprudnyi, Russia
[3] RAS, Res Ctr Trusted Artificial Intelligence, Inst Syst Programming, Moscow, Russia
[4] Natl Res Univ Higher Sch Econ, Moscow, Russia
Keywords
REGULARIZATION
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
In this paper, we revisit the smooth and strongly-convex-strongly-concave minimax optimization problem. Zhang et al. (2021) and Ibrahim et al. (2020) established the lower bound Ω(√(κ_x κ_y) log(1/ε)) on the number of gradient evaluations required to find an ε-accurate solution, where κ_x and κ_y are the condition numbers for the strong convexity and strong concavity assumptions. However, the existing state-of-the-art methods do not match this lower bound: the algorithms of Lin et al. (2020) and Wang and Li (2020) have gradient evaluation complexity O(√(κ_x κ_y) log³(1/ε)) and O(√(κ_x κ_y) log³(κ_x κ_y) log(1/ε)), respectively. We fix this fundamental issue by providing the first algorithm with O(√(κ_x κ_y) log(1/ε)) gradient evaluation complexity. We design our algorithm in three steps: (i) we reformulate the original problem as a minimization problem via the pointwise conjugate function; (ii) we apply a specific variant of the proximal point algorithm to the reformulated problem; (iii) we compute the proximal operator inexactly using the optimal algorithm for operator norm reduction in monotone inclusions.
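To make the problem setting concrete, the following is a minimal sketch of the smooth strongly-convex-strongly-concave saddle-point problem the abstract refers to, solved with the classical extragradient method. This is NOT the paper's optimal algorithm (whose complexity is strictly better); it is a standard baseline on a toy bilinear objective f(x, y) = (μ_x/2)‖x‖² + xᵀBy − (μ_y/2)‖y‖², with all constants and the matrix B chosen here purely for illustration.

```python
import numpy as np

# Toy strongly-convex-strongly-concave objective:
#   f(x, y) = (mu_x/2)||x||^2 + x^T B y - (mu_y/2)||y||^2
# Its unique saddle point is (x, y) = (0, 0).
rng = np.random.default_rng(0)
n, mu_x, mu_y = 5, 1.0, 1.0
B = rng.standard_normal((n, n))

def grad(x, y):
    """Return (grad_x f, grad_y f) for the toy objective above."""
    return mu_x * x + B @ y, B.T @ x - mu_y * y

# Crude Lipschitz bound on the gradient field, used to pick a safe step size.
L = mu_x + mu_y + np.linalg.norm(B, 2)
eta = 0.5 / L

x, y = np.ones(n), np.ones(n)
for _ in range(2000):
    gx, gy = grad(x, y)
    x_half, y_half = x - eta * gx, y + eta * gy   # extrapolation step
    gx, gy = grad(x_half, y_half)
    x, y = x - eta * gx, y + eta * gy             # update step

# Both iterates converge linearly to the saddle point at the origin.
print(np.linalg.norm(x), np.linalg.norm(y))
```

The extragradient baseline converges linearly on this class of problems, but its iteration count scales worse in the condition numbers than the Ω(√(κ_x κ_y) log(1/ε)) lower bound that the paper's algorithm matches.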
Pages: 13