On the convergence of the direct extension of ADMM for three-block separable convex minimization models with one strongly convex function

Cited by: 72
Authors
Cai, Xingju [1 ]
Han, Deren [1 ]
Yuan, Xiaoming [2 ]
Affiliations
[1] Nanjing Normal Univ, Sch Math Sci, Key Lab NSLSCS Jiangsu Prov, Nanjing 210023, Jiangsu, Peoples R China
[2] Hong Kong Baptist Univ, Dept Math, Hong Kong, Hong Kong, Peoples R China
Keywords
Alternating direction method of multipliers; Convergence analysis; Convex programming; Separable structure; Local linear convergence; Splitting method; Low-rank; Multipliers
DOI
10.1007/s10589-016-9860-y
Chinese Library Classification (CLC)
C93 [Management Science]; O22 [Operations Research]
Subject classification codes
070105; 12; 1201; 1202; 120202
Abstract
The alternating direction method of multipliers (ADMM) is a benchmark for solving two-block linearly constrained convex minimization models whose objective function is the sum of two functions without coupled variables. It is also known, however, that convergence is not guaranteed when the ADMM is directly extended to a multiple-block convex minimization model whose objective function is the sum of more than two functions. Recently, several authors have studied strong convexity conditions on the objective function that suffice to ensure the convergence of the direct extension of ADMM, or the convergence that results when the original scheme is appropriately modified. We focus on the three-block case of such a model, whose objective function is the sum of three functions, and discuss the convergence of the direct extension of ADMM. We show that if one function in the objective is strongly convex and the penalty parameter and the operators in the linear equality constraint are appropriately restricted, then the convergence of the direct extension of ADMM is guaranteed. We further estimate the worst-case convergence rate measured by the iteration complexity in both the ergodic and nonergodic senses, and derive global linear convergence in an asymptotic sense under some additional conditions.
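The scheme the abstract studies is the direct extension of ADMM to the three-block model min f1(x) + f2(y) + f3(z) subject to Ax + By + Cz = b: a Gauss-Seidel sweep over the three primal blocks followed by a dual update. The following is a minimal numerical sketch of that scheme, not the paper's analysis; it assumes, purely for closed-form subproblems, simple strongly convex quadratics f_i(v) = ½‖v − a_i‖² and identity operators A = B = C = I, and the vectors a_i, right-hand side b, and penalty parameter rho are illustrative choices.

```python
import numpy as np

# Direct extension of ADMM for the three-block model
#   min f1(x) + f2(y) + f3(z)  s.t.  A x + B y + C z = b,
# specialized here (illustrative assumption) to f_i(v) = 0.5*||v - a_i||^2
# and A = B = C = I, so every subproblem has a closed-form solution.
rng = np.random.default_rng(0)
n = 5
a1, a2, a3 = rng.normal(size=(3, n))
b = rng.normal(size=n)
rho = 0.5  # penalty parameter, chosen small; sufficient conditions in the
           # literature bound rho in terms of the strong convexity moduli

x = np.zeros(n)
y = np.zeros(n)
z = np.zeros(n)
u = np.zeros(n)  # scaled dual variable, u = lambda / rho

for _ in range(500):
    # Gauss-Seidel sweep over the three blocks (the "direct extension")
    x = (a1 + rho * (b - y - z - u)) / (1.0 + rho)
    y = (a2 + rho * (b - x - z - u)) / (1.0 + rho)
    z = (a3 + rho * (b - x - y - u)) / (1.0 + rho)
    u = u + (x + y + z - b)  # dual update

# Exact minimizer via KKT: v_i* = a_i - lam with lam = (a1 + a2 + a3 - b) / 3
lam = (a1 + a2 + a3 - b) / 3.0
print("primal residual:", np.linalg.norm(x + y + z - b))
print("error in x:     ", np.linalg.norm(x - (a1 - lam)))
```

On this strongly convex, identity-operator instance the sweep converges; the point of the paper is to identify conditions (one strongly convex function, with the penalty parameter and the constraint operators appropriately restricted) under which convergence is guaranteed in general, since the known divergence examples use particular non-identity operators.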
Pages: 39-73 (35 pages)
Related papers (28 in total)
  • [21] Modified hybrid decomposition of the augmented Lagrangian method with larger step size for three-block separable convex programming
    Sun, Min
    Wang, Yiju
    Journal of Inequalities and Applications, 2018
  • [22] Extended ADMM and BCD for nonseparable convex minimization models with quadratic coupling terms: convergence analysis and insights
    Chen, Caihua
    Li, Min
    Liu, Xin
    Ye, Yinyu
    Mathematical Programming, 2019, 173(1-2): 37-77
  • [24] On Faster Convergence of Cyclic Block Coordinate Descent-type Methods for Strongly Convex Minimization
    Li, Xingguo
    Zhao, Tuo
    Arora, Raman
    Liu, Han
    Hong, Mingyi
    Journal of Machine Learning Research, 2018, 18
  • [26] An Improved Convergence Analysis of Cyclic Block Coordinate Descent-type Methods for Strongly Convex Minimization
    Li, Xingguo
    Zhao, Tuo
    Arora, Raman
    Liu, Han
    Hong, Mingyi
    Artificial Intelligence and Statistics, 2016, 51: 491-499
  • [27] On the Q-Linear Convergence of Distributed Generalized ADMM Under Non-Strongly Convex Function Components
    Maros, Marie
    Jalden, Joakim
    IEEE Transactions on Signal and Information Processing over Networks, 2019, 5(3): 442-453
  • [28] One class of methods of unconditional minimization of a convex function, having a high rate of convergence
    Nesterov, Yu. E.
    USSR Computational Mathematics and Mathematical Physics, 1984, 24(4): 80-82