A Bregman-Style Improved ADMM and its Linearized Version in the Nonconvex Setting: Convergence and Rate Analyses

Times Cited: 0
Authors:
Liu, Peng-Jie [1 ,2 ,3 ]
Jian, Jin-Bao [2 ]
Shao, Hu [1 ]
Wang, Xiao-Quan [1 ]
Xu, Jia-Wei [4 ]
Wu, Xiao-Yu [1 ]
Affiliations:
[1] China Univ Min & Technol, Jiangsu Ctr Appl Math, Sch Math, Xuzhou 221116, Jiangsu, Peoples R China
[2] Guangxi Minzu Univ, Ctr Appl Math Guangxi, Sch Math & Phys, Nanning 530006, Guangxi, Peoples R China
[3] Hong Kong Polytech Univ, Dept Civil & Environm Engn, Hong Kong, Peoples R China
[4] Xiangtan Univ, Sch Math & Computat Sci, Xiangtan 411105, Hunan, Peoples R China
Funding:
National Natural Science Foundation of China
Keywords:
Nonconvex optimization; Alternating direction method of multipliers; Kurdyka-Lojasiewicz property; Convergence rate; Alternating direction method; L-1/2 regularization; Multipliers; Optimization; Minimization; Algorithms; Sum
DOI:
10.1007/s40305-023-00535-8
Chinese Library Classification (CLC):
C93 (Management Science); O22 (Operations Research)
Discipline classification codes:
070105; 12; 1201; 1202; 120202
Abstract:
This work studies a family of two-block nonconvex optimization problems subject to linear constraints. We first introduce a simple yet general Bregman-style improved alternating direction method of multipliers (ADMM), built on the classical ADMM iteration framework and the Bregman distance. We then exploit the smoothness of one of the component functions to develop a linearized version of this method. Compared with the traditional ADMM, both proposed methods incorporate a convex combination strategy into the multiplier update step. For each method, we prove that the whole iteration sequence converges to a single critical point of the augmented Lagrangian function by means of the Kurdyka-Lojasiewicz property, and we also derive convergence rates for both the sequence of merit function values and the iteration sequence itself. Finally, numerical results on the Lasso model show that the proposed methods are effective and encouraging.
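The abstract does not spell out the update formulas, so the following is only an orienting sketch, not the authors' exact scheme: a generic Bregman-style ADMM for the two-block model min f(x) + g(y) subject to Ax + By = b, in which the Bregman kernels phi and psi, the penalty parameter beta, and the convex-combination parameter theta in (0, 1] are illustrative assumptions.

\begin{align*}
  &\text{Augmented Lagrangian: } \mathcal{L}_\beta(x,y,\lambda) = f(x) + g(y) + \langle \lambda,\, Ax + By - b\rangle + \tfrac{\beta}{2}\,\|Ax + By - b\|^2, \\
  &x^{k+1} \in \operatorname*{arg\,min}_x \; \Bigl\{ \mathcal{L}_\beta(x, y^k, \lambda^k) + D_\varphi(x, x^k) \Bigr\}, \\
  &y^{k+1} \in \operatorname*{arg\,min}_y \; \Bigl\{ \mathcal{L}_\beta(x^{k+1}, y, \lambda^k) + D_\psi(y, y^k) \Bigr\}, \\
  &\lambda^{k+1} = (1-\theta)\,\lambda^k + \theta\,\bigl[\lambda^k + \beta\,(A x^{k+1} + B y^{k+1} - b)\bigr],
\end{align*}

where D_\varphi(u, v) = \varphi(u) - \varphi(v) - \langle \nabla\varphi(v),\, u - v\rangle denotes the Bregman distance and the last line is a convex combination of the old multiplier and the standard dual ascent step. In a linearized variant of the kind the abstract suggests, the smooth component would be replaced in its subproblem by a linearization plus a proximal term, making that subproblem cheaper to solve; the exact form used in the paper may differ.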
Source: Journal of the Operations Research Society of China, 2024, 12
Pages: 298-340
Number of Pages: 43