On the linear convergence of the alternating direction method of multipliers

Cited by: 9
Authors
Mingyi Hong
Zhi-Quan Luo
Institutions
[1] University of Minnesota,Department of Electrical and Computer Engineering
[2] The Chinese University of Hong Kong,School of Science and Engineering
[3] Iowa State University,Department of Industrial and Manufacturing Systems Engineering
Source
Mathematical Programming | 2017 / Vol. 162
Keywords
Linear convergence; Alternating direction method of multipliers; Error bound; Dual ascent
Mathematics Subject Classification: 49; 90
DOI
Not available
Abstract
We analyze the convergence rate of the alternating direction method of multipliers (ADMM) for minimizing the sum of two or more nonsmooth convex separable functions subject to linear constraints. Previous analysis of the ADMM typically assumes that the objective function is the sum of only two convex functions defined on two separable blocks of variables even though the algorithm works well in numerical experiments for three or more blocks. Moreover, there has been no rate of convergence analysis for the ADMM without strong convexity in the objective function. In this paper we establish the global R-linear convergence of the ADMM for minimizing the sum of any number of convex separable functions, assuming that a certain error bound condition holds true and the dual stepsize is sufficiently small. Such an error bound condition is satisfied for example when the feasible set is a compact polyhedron and the objective function consists of a smooth strictly convex function composed with a linear mapping, and a nonsmooth $\ell_1$ regularizer. This result implies the linear convergence of the ADMM for contemporary applications such as LASSO without assuming strong convexity of the objective function.
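The LASSO application mentioned in the abstract is the standard setting for two-block ADMM. The following is a minimal sketch of that classical scheme (not the multi-block algorithm analyzed in the paper): it splits min 0.5‖Ax−b‖² + λ‖z‖₁ subject to x = z, alternating a ridge-type x-update, a soft-thresholding z-update, and a dual ascent step. All names (`admm_lasso`, `lam`, `rho`) and the fixed iteration count are illustrative choices, not taken from the paper.

```python
import numpy as np

def soft_threshold(v, k):
    # Elementwise soft-thresholding: proximal operator of k * ||.||_1.
    return np.sign(v) * np.maximum(np.abs(v) - k, 0.0)

def admm_lasso(A, b, lam, rho=1.0, n_iter=500):
    """Two-block ADMM (scaled dual form) for
        min 0.5*||A x - b||^2 + lam*||z||_1  s.t.  x = z.
    Illustrative sketch only; rho is the dual stepsize / penalty.
    """
    n = A.shape[1]
    x = np.zeros(n)
    z = np.zeros(n)
    u = np.zeros(n)  # scaled dual variable
    # Cache the Cholesky factor reused by every x-update.
    L = np.linalg.cholesky(A.T @ A + rho * np.eye(n))
    Atb = A.T @ b
    for _ in range(n_iter):
        # x-update: solve (A^T A + rho I) x = A^T b + rho (z - u).
        x = np.linalg.solve(L.T, np.linalg.solve(L, Atb + rho * (z - u)))
        # z-update: proximal step on the l1 term.
        z = soft_threshold(x + u, lam / rho)
        # Dual ascent on the constraint x - z = 0.
        u = u + x - z
    return x, z
```

On well-conditioned instances the primal residual ‖x − z‖ shrinks geometrically, which is the kind of linear-rate behavior the paper establishes (under its error bound condition) without strong convexity of the objective.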
Pages: 165–199 (34 pages)