Tight global linear convergence rate bounds for Douglas-Rachford splitting

Cited by: 43
Authors
Giselsson, Pontus [1 ]
Affiliation
[1] Lund Univ, Dept Automat Control, Box 118, SE-22100 Lund, Sweden
Keywords
Douglas-Rachford splitting; Linear convergence; Monotone operators; Fixed-point iterations; ALTERNATING DIRECTION METHOD; PROJECTIONS; MULTIPLIERS; ALGORITHMS; ADMM;
DOI
10.1007/s11784-017-0417-1
CLC number
O29 [Applied Mathematics];
Subject classification code
070104;
Abstract
Recently, several authors have shown local and global convergence rate results for Douglas-Rachford splitting under strong monotonicity, Lipschitz continuity, and cocoercivity assumptions. Most of these focus on the convex optimization setting. In the more general monotone inclusion setting, Lions and Mercier showed a linear convergence rate bound under the assumption that one of the two operators is strongly monotone and Lipschitz continuous. We show that this bound is not tight, meaning that no problem from the considered class converges exactly with that rate. In this paper, we present tight global linear convergence rate bounds for that class of problems. We also provide tight linear convergence rate bounds under the assumptions that one of the operators is strongly monotone and cocoercive, and that one of the operators is strongly monotone and the other is cocoercive. All our linear convergence results are obtained by proving the stronger property that the Douglas-Rachford operator is contractive.
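For concreteness, Douglas-Rachford splitting finds a zero of a sum A + B of maximally monotone operators by alternating the two resolvents (proximal operators in the convex-optimization case) inside a fixed-point iteration. The minimal Python sketch below is not from the paper: the toy problem min_x 0.5*||x - a||^2 + lam*||x||_1, the step size gamma, and the helper names prox_f, prox_g, and douglas_rachford are illustrative assumptions. Because the gradient of the quadratic term is both strongly monotone and cocoercive, this instance lies in one of the classes for which the paper shows the Douglas-Rachford operator is contractive, so the iterates converge linearly.

import numpy as np

def prox_f(z, gamma, a):
    # Resolvent J_{gamma A} for A = grad f, with f(x) = 0.5*||x - a||^2:
    # argmin_x gamma*0.5*||x - a||^2 + 0.5*||x - z||^2 = (z + gamma*a) / (1 + gamma)
    return (z + gamma * a) / (1.0 + gamma)

def prox_g(z, gamma, lam):
    # Resolvent J_{gamma B} for B = subdifferential of g, with g(x) = lam*||x||_1
    # (componentwise soft-thresholding at level gamma*lam)
    return np.sign(z) * np.maximum(np.abs(z) - gamma * lam, 0.0)

def douglas_rachford(a, lam, gamma=1.0, iters=200):
    # Douglas-Rachford fixed-point iteration:
    # z_{k+1} = z_k + J_{gamma A}(2*J_{gamma B} z_k - z_k) - J_{gamma B} z_k
    z = np.zeros_like(a)
    for _ in range(iters):
        x = prox_g(z, gamma, lam)
        y = prox_f(2.0 * x - z, gamma, a)
        z = z + y - x
    return x

a = np.array([3.0, -0.5, 0.2])
print(douglas_rachford(a, lam=1.0))  # approx [2. 0. 0.], the soft-threshold of a by lam

The minimizer of the toy problem is recovered from the fixed point z* as x* = prox_g(z*, gamma, lam), which is why the sketch returns x rather than z.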
Pages: 2241 - 2270
Page count: 30
Related papers
50 records in total
  • [21] CONVERGENCE ANALYSIS OF THE RELAXED DOUGLAS-RACHFORD ALGORITHM
    Luke, D. Russell
    Martins, Anna-Lena
    SIAM JOURNAL ON OPTIMIZATION, 2020, 30 (01) : 542 - 584
  • [22] On the convergence rate of Douglas-Rachford operator splitting method
    He, Bingsheng
    Yuan, Xiaoming
    MATHEMATICAL PROGRAMMING, 2015, 153 : 715 - 722
  • [23] Diagonal Scaling in Douglas-Rachford Splitting and ADMM
    Giselsson, Pontus
    Boyd, Stephen
    2014 IEEE 53RD ANNUAL CONFERENCE ON DECISION AND CONTROL (CDC), 2014, : 5033 - 5039
  • [24] Removing Multiplicative Noise by Douglas-Rachford Splitting Methods
    Steidl, G.
    Teuber, T.
    JOURNAL OF MATHEMATICAL IMAGING AND VISION, 2010, 36 (02) : 168 - 184
  • [26] A customized Douglas-Rachford splitting algorithm for separable convex minimization with linear constraints
    Han, Deren
    He, Hongjin
    Yang, Hai
    Yuan, Xiaoming
    NUMERISCHE MATHEMATIK, 2014, 127 (01) : 167 - 200
  • [27] ON THE O(1/n) CONVERGENCE RATE OF THE DOUGLAS-RACHFORD ALTERNATING DIRECTION METHOD
    He, Bingsheng
    Yuan, Xiaoming
    SIAM JOURNAL ON NUMERICAL ANALYSIS, 2012, 50 (02) : 700 - 709
  • [28] Douglas-Rachford splitting and ADMM for pathological convex optimization
    Ryu, Ernest K.
    Liu, Yanli
    Yin, Wotao
    COMPUTATIONAL OPTIMIZATION AND APPLICATIONS, 2019, 74 (03) : 747 - 778
  • [29] Accelerated ADMM based on Accelerated Douglas-Rachford Splitting
    Pejcic, Ivan
    Jones, Colin N.
    2016 EUROPEAN CONTROL CONFERENCE (ECC), 2016, : 1952 - 1957
  • [30] A parameterized Douglas-Rachford splitting algorithm for nonconvex optimization
    Bian, Fengmiao
    Zhang, Xiaoqun
    APPLIED MATHEMATICS AND COMPUTATION, 2021, 410