Block-coordinate and incremental aggregated proximal gradient methods for nonsmooth nonconvex problems

Cited by: 9
Authors
Latafat, Puya [1 ]
Themelis, Andreas [2 ]
Patrinos, Panagiotis [1 ]
Affiliations
[1] Katholieke Univ Leuven, Dept Elect Engn ESAT STADIUS, Kasteelpk Arenberg 10, B-3001 Leuven, Belgium
[2] Kyushu Univ, Fac Informat Sci & Elect Engn ISEE, Nishi Ku, 744 Motooka, Fukuoka 8190395, Japan
Keywords
Nonsmooth nonconvex optimization; Block-coordinate updates; Forward-backward envelope; KL inequality; PRIMAL-DUAL ALGORITHM; DESCENT METHOD; OPTIMIZATION; CONVERGENCE; MINIMIZATION;
DOI
10.1007/s10107-020-01599-7
Chinese Library Classification: TP31 [Computer software]
Discipline codes: 081202; 0835
Abstract
This paper analyzes block-coordinate proximal gradient methods for minimizing the sum of a separable smooth function and a (nonseparable) nonsmooth function, both of which are allowed to be nonconvex. The main tool in our analysis is the forward-backward envelope, which serves as a particularly suitable continuous and real-valued Lyapunov function. Global and linear convergence results are established when the cost function satisfies the Kurdyka-Lojasiewicz property without imposing convexity requirements on the smooth function. Two prominent special cases of the investigated setting are regularized finite sum minimization and the sharing problem; in particular, an immediate byproduct of our analysis leads to novel convergence results and rates for the popular Finito/MISO algorithm in the nonsmooth and nonconvex setting with very general sampling strategies.
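For the regularized finite-sum special case mentioned in the abstract, an incremental aggregated proximal gradient update in the Finito/MISO style can be sketched as follows. This is an illustrative sketch only, not the paper's exact algorithm: the quadratic components f_i, the l1 regularizer, the step size, and the cyclic sampling rule are all assumptions chosen for a runnable convex example.

```python
import numpy as np

def soft_threshold(z, t):
    # Proximal operator of t * ||.||_1 (the backward step for an l1 regularizer).
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

# Hypothetical problem data: minimize (1/N) * sum_i 0.5*||A_i x - b_i||^2 + lam*||x||_1.
rng = np.random.default_rng(0)
N, d = 5, 4
A = [rng.standard_normal((3, d)) for _ in range(N)]
b = [rng.standard_normal(3) for _ in range(N)]
lam = 0.1

grad = lambda i, x: A[i].T @ (A[i] @ x - b[i])
# Conservative common step size from the largest per-component Lipschitz constant.
L = max(np.linalg.norm(Ai.T @ Ai, 2) for Ai in A)
gamma = 0.9 / L

# Finito/MISO-style tables: one point z_i and one stored gradient per component;
# each iteration refreshes a single entry (incremental aggregated forward step).
z = [np.zeros(d) for _ in range(N)]
g_table = [grad(i, z[i]) for i in range(N)]
for k in range(200):
    zbar = sum(z) / N
    gbar = sum(g_table) / N
    # Full proximal (backward) step on the aggregated forward point.
    x = soft_threshold(zbar - gamma * gbar, gamma * lam)
    i = k % N  # cyclic sampling; the paper allows much more general strategies
    z[i] = x
    g_table[i] = grad(i, x)
```

Only one gradient is recomputed per iteration, so the per-iteration cost is independent of N; the price is storing one point and one gradient per component.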
Published in: Mathematical Programming, 2022, 193: 195-224 (30 pages)