The First Optimal Algorithm for Smooth and Strongly-Convex-Strongly-Concave Minimax Optimization

Cited by: 0
Authors
Kovalev, Dmitry [1 ]
Gasnikov, Alexander [2 ,3 ,4 ]
Affiliations
[1] King Abdullah Univ Sci & Technol, Thuwal, Saudi Arabia
[2] Moscow Inst Phys & Technol, Dolgoprudnyi, Russia
[3] RAS, Res Ctr Trusted Artificial Intelligence, Inst Syst Programming, Moscow, Russia
[4] Natl Res Univ Higher Sch Econ, Moscow, Russia
Keywords
REGULARIZATION;
DOI
Not available
CLC Classification Number
TP18 [Theory of Artificial Intelligence];
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
In this paper, we revisit the smooth and strongly-convex-strongly-concave minimax optimization problem. Zhang et al. (2021) and Ibrahim et al. (2020) established the lower bound $\Omega\left(\sqrt{\kappa_x \kappa_y} \log \frac{1}{\epsilon}\right)$ on the number of gradient evaluations required to find an $\epsilon$-accurate solution, where $\kappa_x$ and $\kappa_y$ are condition numbers for the strong convexity and strong concavity assumptions. However, the existing state-of-the-art methods do not match this lower bound: algorithms of Lin et al. (2020) and Wang and Li (2020) have gradient evaluation complexity $\mathcal{O}\left(\sqrt{\kappa_x \kappa_y} \log^3 \frac{1}{\epsilon}\right)$ and $\mathcal{O}\left(\sqrt{\kappa_x \kappa_y} \log^3(\kappa_x \kappa_y) \log \frac{1}{\epsilon}\right)$, respectively. We fix this fundamental issue by providing the first algorithm with $\mathcal{O}\left(\sqrt{\kappa_x \kappa_y} \log \frac{1}{\epsilon}\right)$ gradient evaluation complexity. We design our algorithm in three steps: (i) we reformulate the original problem as a minimization problem via the pointwise conjugate function; (ii) we apply a specific variant of the proximal point algorithm to the reformulated problem; (iii) we compute the proximal operator inexactly using the optimal algorithm for operator norm reduction in monotone inclusions.
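For orientation, the complexity statements above can be read against the standard formulation of this problem class. The display below is a sketch of that standard setting; the symbols $F$, $L$, $\mu_x$, $\mu_y$ are illustrative and are not quoted from the paper itself:
\[
\min_{x} \max_{y} F(x, y),
\]
where $F$ is $L$-smooth, $\mu_x$-strongly convex in $x$ for every fixed $y$, and $\mu_y$-strongly concave in $y$ for every fixed $x$, with condition numbers $\kappa_x = L/\mu_x$ and $\kappa_y = L/\mu_y$. Under this reading, the bounds in the abstract compare as
\[
\text{lower bound: } \Omega\!\left(\sqrt{\kappa_x \kappa_y} \log \tfrac{1}{\epsilon}\right), \qquad
\text{this paper: } \mathcal{O}\!\left(\sqrt{\kappa_x \kappa_y} \log \tfrac{1}{\epsilon}\right),
\]
\[
\text{Lin et al. (2020): } \mathcal{O}\!\left(\sqrt{\kappa_x \kappa_y} \log^3 \tfrac{1}{\epsilon}\right), \qquad
\text{Wang and Li (2020): } \mathcal{O}\!\left(\sqrt{\kappa_x \kappa_y} \log^3(\kappa_x \kappa_y) \log \tfrac{1}{\epsilon}\right),
\]
so the proposed method is the first whose gradient evaluation complexity matches the lower bound up to constant factors.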
Pages: 13
Related Papers
50 records in total
  • [21] MIN-MAX-MIN OPTIMIZATION WITH SMOOTH AND STRONGLY CONVEX OBJECTIVES
    Lamperski, Jourdain
    Prokopyev, Oleg A.
    Wrabetz, Luca G.
    SIAM JOURNAL ON OPTIMIZATION, 2023, 33(3): 2435-2456
  • [22] Accelerated Decentralized Optimization with Local Updates for Smooth and Strongly Convex Objectives
    Hendrikx, Hadrien
    Bach, Francis
    Massoulie, Laurent
    22ND INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 89, 2019, 89: 897-906
  • [23] Finding Second-Order Stationary Points in Nonconvex-Strongly-Concave Minimax Optimization
    Luo, Luo
    Li, Yujun
    Chen, Cheng
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022), 2022,
  • [24] Event-Triggered Consensus-Based Optimization Algorithm for Smooth and Strongly Convex Cost Functions
    Hayashi, Naoki
    Sugiura, Tomohiro
    Kajiyama, Yuichi
    Takai, Shigemasa
    2018 IEEE CONFERENCE ON DECISION AND CONTROL (CDC), 2018: 2120-2125
  • [25] Improved Algorithms for Convex-Concave Minimax Optimization
    Wang, Yuanhao
    Li, Jian
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 33, NEURIPS 2020, 2020, 33
  • [26] Lower Bounds and Optimal Algorithms for Smooth and Strongly Convex Decentralized Optimization Over Time-Varying Networks
    Kovalev, Dmitry
    Gasanov, Elnur
    Gasnikov, Alexander
    Richtarik, Peter
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34
  • [27] On the oracle complexity of smooth strongly convex minimization
    Drori, Yoel
    Taylor, Adrien
    JOURNAL OF COMPLEXITY, 2022, 68
  • [28] About the Gradient Projection Algorithm for a Strongly Convex Function and a Proximally Smooth Set
    Balashov, Maxim V.
    JOURNAL OF CONVEX ANALYSIS, 2017, 24(2): 493-500
  • [29] An extrapolated fixed-point optimization method for strongly convex smooth optimizations
    Rakjarungkiat, Duangdaw
    Nimana, Nimit
    AIMS MATHEMATICS, 2024, 9(2): 4259-4280
  • [30] Variance Reduced EXTRA and DIGing and Their Optimal Acceleration for Strongly Convex Decentralized Optimization
    Li, Huan
    Lin, Zhouchen
    Fang, Yongchun
    Journal of Machine Learning Research, 2022, 23