Alternating and Parallel Proximal Gradient Methods for Nonsmooth, Nonconvex Minimax: A Unified Convergence Analysis

Cited by: 1
Authors
Cohen, Eyal [1 ]
Teboulle, Marc [1 ]
Affiliations
[1] Tel Aviv Univ, Sch Math Sci, IL-69978 Tel Aviv, Israel
Funding
Israel Science Foundation
Keywords
nonconvex nonsmooth minimax; nonsmooth minimization-maximization; proximal gradient method; Kurdyka-Lojasiewicz property; Bregman distance; global convergence; convergence rate
DOI
10.1287/moor.2022.0294
CLC Classification
C93 [Management Science]; O22 [Operations Research];
Discipline Codes
070105; 12; 1201; 1202; 120202;
Abstract
There is growing interest in nonconvex minimax problems, driven by an abundance of applications. Our focus is on nonsmooth, nonconvex-strongly-concave minimax problems, thus departing from the more common weakly convex and smooth models assumed in the recent literature. We present proximal gradient schemes with either parallel or alternating steps. We show that both methods can be analyzed through a single scheme within a unified analysis that relies on expanding a general convergence mechanism used for analyzing nonconvex, nonsmooth optimization problems. In contrast to the current literature, which focuses on the complexity of obtaining nearly approximate stationary solutions, we prove subsequence convergence to a critical point of the primal objective and global convergence when the latter is semialgebraic. Furthermore, the complexity results we provide are with respect to approximate stationary solutions. Lastly, we broaden the scope of problems that can be addressed by generalizing one of the steps with a Bregman proximal gradient update; together with a few adjustments to the analysis, this allows us to extend the convergence and complexity results to this wider setting.
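The flavor of the alternating proximal gradient scheme described in the abstract can be sketched as follows. This is only an illustrative toy, not the authors' exact method or assumptions: the coupling f(x, y) = ½‖x‖² + xᵀAy − ½μ‖y‖² (strongly concave in y for μ > 0), the ℓ1 regularizer as the nonsmooth primal term, and the step sizes are all assumptions made for the sketch. Each iteration takes a gradient ascent step in the dual variable y, then a proximal gradient descent step in the primal variable x using the freshly updated y (the "alternating" order; the "parallel" variant would use the old y instead).

```python
import numpy as np

def soft_threshold(v, tau):
    """Proximal operator of tau * ||.||_1 (soft thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def alternating_prox_grad(A, mu=1.0, lam=0.5, s=0.1, t=0.1, iters=500):
    """Illustrative alternating scheme for the toy saddle problem
        min_x max_y  0.5||x||^2 + x^T A y - 0.5*mu*||y||^2 + lam*||x||_1,
    which is strongly concave in y and nonsmooth (ell_1) in x."""
    n, m = A.shape
    x, y = np.ones(n), np.ones(m)
    for _ in range(iters):
        # ascent step in y: grad_y f = A^T x - mu * y (no nonsmooth term in y here)
        y = y + t * (A.T @ x - mu * y)
        # prox-gradient descent step in x, using the updated y (alternating order)
        x = soft_threshold(x - s * (x + A @ y), s * lam)
    return x, y

A = np.array([[0.2, 0.1], [0.0, 0.3]])
x, y = alternating_prox_grad(A)
# for this toy problem the unique saddle point is (0, 0), so both iterates vanish
```

For this particular instance the primal objective φ(x) = ½‖x‖² + ‖Aᵀx‖²/(2μ) + λ‖x‖₁ is minimized at x = 0, so the iterates of the sketch converge to the saddle point (0, 0).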
Pages: 1-28
Page count: 29
Related Papers
50 records
  • [41] A UNIFIED CONVERGENCE ANALYSIS OF BLOCK SUCCESSIVE MINIMIZATION METHODS FOR NONSMOOTH OPTIMIZATION
    Razaviyayn, Meisam
    Hong, Mingyi
    Luo, Zhi-Quan
    SIAM JOURNAL ON OPTIMIZATION, 2013, 23 (02) : 1126 - 1153
  • [42] On the convergence of adaptive first order methods: proximal gradient and alternating minimization algorithms
    Latafat, Puya
    Themelis, Andreas
    Patrinos, Panagiotis
    6TH ANNUAL LEARNING FOR DYNAMICS & CONTROL CONFERENCE, 2024, 242 : 197 - 208
  • [43] Accelerated Proximal Gradient Methods for Nonconvex Programming
    Li, Huan
    Lin, Zhouchen
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 28 (NIPS 2015), 2015, 28
  • [44] A UNIFIED PROXIMAL GRADIENT METHOD FOR NONCONVEX COMPOSITE OPTIMIZATION WITH EXTRAPOLATION
    Zhang, Miao
    Zhang, Hongchao
    NUMERICAL ALGEBRA CONTROL AND OPTIMIZATION, 2024,
  • [45] Simple and Optimal Stochastic Gradient Methods for Nonsmooth Nonconvex Optimization
    Li, Zhize
    Li, Jian
    Journal of Machine Learning Research, 2022, 23
  • [46] Linearized ADMM for Nonconvex Nonsmooth Optimization With Convergence Analysis
    Liu, Qinghua
    Shen, Xinyue
    Gu, Yuantao
    IEEE ACCESS, 2019, 7 : 76131 - 76144
  • [49] An Alternating Gradient Projection Algorithm with Momentum for Nonconvex-Concave Minimax Problems
    Li, Jue-You
    Xie, Tao
    JOURNAL OF THE OPERATIONS RESEARCH SOCIETY OF CHINA, 2024,
  • [50] A unified single-loop alternating gradient projection algorithm for nonconvex-concave and convex-nonconcave minimax problems
    Xu, Zi
    Zhang, Huiling
    Xu, Yang
    Lan, Guanghui
    MATHEMATICAL PROGRAMMING, 2023, 201 (1-2) : 635 - 706