ANDERSON ACCELERATED DOUGLAS-RACHFORD SPLITTING

Cited by: 34
Authors
Fu, Anqi [1 ]
Zhang, Junzi [2 ]
Boyd, Stephen [1 ]
Affiliations
[1] Stanford Univ, Dept Elect Engn, Stanford, CA 94305 USA
[2] Stanford Univ, ICME, Palo Alto, CA 94304 USA
Source
SIAM JOURNAL ON SCIENTIFIC COMPUTING | 2020, Vol. 42, No. 6
Keywords
Anderson acceleration; nonsmooth convex optimization; parallel and distributed optimization; proximal oracles; stabilization; global convergence; pathological settings; ALTERNATING DIRECTION METHOD; CONVERGENCE; OPTIMIZATION; ALGORITHM; EQUATIONS;
DOI
10.1137/19M1290097
Chinese Library Classification
O29 [Applied Mathematics];
Discipline Code
070104;
Abstract
We consider the problem of nonsmooth convex optimization with linear equality constraints, where the objective function is only accessible through its proximal operator. This problem arises in many different fields such as statistical learning, computational imaging, telecommunications, and optimal control. To solve it, we propose an Anderson accelerated Douglas-Rachford splitting (A2DR) algorithm, which we show either globally converges or provides a certificate of infeasibility/unboundedness under very mild conditions. Applied to a block separable objective, A2DR partially decouples so that its steps may be carried out in parallel, yielding an algorithm that is fast and scalable to multiple processors. We describe an open-source implementation and demonstrate its performance on a wide range of examples.
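The abstract describes accelerating the Douglas-Rachford splitting (DRS) fixed-point iteration with Anderson acceleration. The following is a minimal sketch of that idea on a toy problem (minimize 0.5*||x - a||^2 subject to x >= 0, accessed only through proximal operators), not the authors' A2DR implementation, which additionally uses regularization and safeguarding to obtain global convergence; all function names and the example problem here are illustrative assumptions.

```python
import numpy as np

def prox_f(v, a):
    # Proximal operator of f(x) = 0.5*||x - a||^2 with unit step: (v + a) / 2
    return (v + a) / 2.0

def prox_g(v):
    # Proximal operator of the indicator of {x >= 0}: projection
    return np.maximum(v, 0.0)

def drs_map(z, a):
    # One Douglas-Rachford pass, viewed as a fixed-point map F(z)
    x = prox_f(z, a)
    y = prox_g(2.0 * x - z)
    return z + y - x

def aa2_drs(a, m=5, iters=100):
    # Type-II Anderson acceleration of the DRS fixed-point iteration
    # (illustrative sketch; no regularization or safeguards).
    z = np.zeros_like(a)
    Zs, Fs = [], []  # short histories of iterates and map values
    for _ in range(iters):
        Fz = drs_map(z, a)
        Zs.append(z)
        Fs.append(Fz)
        Zs, Fs = Zs[-(m + 1):], Fs[-(m + 1):]
        # Residuals g_i = F(z_i) - z_i
        G = np.array([F - Z for Z, F in zip(Zs, Fs)])
        if len(Zs) < 2:
            z = Fz
            continue
        # Minimize ||sum_i alpha_i g_i|| s.t. sum_i alpha_i = 1, eliminating
        # the constraint: least squares in residual differences.
        dG = np.diff(G, axis=0)  # rows: g_{i+1} - g_i
        gamma, *_ = np.linalg.lstsq(dG.T, G[-1], rcond=None)
        alpha = np.zeros(len(Zs))
        alpha[-1] = 1.0
        alpha[:-1] += gamma
        alpha[1:] -= gamma
        # Extrapolated iterate: weighted combination of past map values
        z = sum(w * F for w, F in zip(alpha, Fs))
    return prox_f(z, a)  # recover the primal estimate x = prox_f(z)
```

On this toy problem the solution is the projection max(a, 0), so `aa2_drs(np.array([3.0, -2.0, 1.0]))` should converge to `[3, 0, 1]`. The block-separable parallelism mentioned in the abstract would enter through the proximal evaluations, which decouple across blocks.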
Pages: A3560 - A3583
Page count: 24