Accelerated ADMM based on Accelerated Douglas-Rachford Splitting

Cited by: 0
Authors
Pejcic, Ivan [1 ]
Jones, Colin N. [1 ]
Affiliation
[1] Ecole Polytech Fed Lausanne, Lab Automat, CH-1015 Lausanne, Switzerland
Funding
European Research Council;
Keywords
DOI
Not available
Chinese Library Classification (CLC) number
TP [automation technology, computer technology];
Discipline classification code
0812;
Abstract
The alternating direction method of multipliers (ADMM) is a form of augmented Lagrangian optimisation algorithm that has found many new applications in recent years. This paper explores the possibility of upgrading ADMM with extrapolation-based acceleration, a technique that has long been used successfully in the accelerated gradient method. The development applies a recently proposed accelerated Douglas-Rachford splitting to the Fenchel dual problem, yielding a method that replaces the classical proximal-point convergence mechanism of ADMM with that of the accelerated gradient method. The resulting method requires that the second function in the cost be strongly convex quadratic and that the penalty parameter satisfy an upper bound. A heuristic modification of the derived method is described, and numerical experiments are performed by solving a randomly generated quadratic programming (QP) problem.
Pages: 1952-1957 (6 pages)
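
To make the extrapolation-based acceleration described in the abstract more concrete, the following is a minimal sketch of an extrapolated ADMM iteration on a box-constrained QP, in the spirit of known "fast ADMM" variants rather than the exact scheme derived in the paper. The problem split, the Nesterov-type extrapolation rule, the penalty value rho, and the function name accelerated_admm_qp are illustrative assumptions.

```python
# Hedged sketch: extrapolated ADMM for the box-constrained QP
#     minimize 0.5 x'Px + q'x   subject to  lb <= x <= ub,
# split as f(x) + g(z) with x = z, where f is the quadratic part and g is
# the indicator of the box. The Nesterov-type extrapolation of (z, u) is an
# illustrative choice, not the scheme derived in the paper.
import numpy as np

def accelerated_admm_qp(P, q, lb, ub, rho=1.0, iters=500, tol=1e-8):
    n = q.size
    # Factorise P + rho*I once; it is reused in every x-update.
    L = np.linalg.cholesky(P + rho * np.eye(n))
    z = z_hat = np.zeros(n)
    u = u_hat = np.zeros(n)
    a = 1.0
    for _ in range(iters):
        # x-update: minimise the augmented quadratic (one linear solve).
        rhs = rho * (z_hat - u_hat) - q
        x = np.linalg.solve(L.T, np.linalg.solve(L, rhs))
        # z-update: projection onto the box constraints.
        z_new = np.clip(x + u_hat, lb, ub)
        # Scaled dual (multiplier) update.
        u_new = u_hat + x - z_new
        # Nesterov-type extrapolation of the z and u iterates.
        a_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * a * a))
        beta = (a - 1.0) / a_new
        z_hat = z_new + beta * (z_new - z)
        u_hat = u_new + beta * (u_new - u)
        z, u, a = z_new, u_new, a_new
        if np.linalg.norm(x - z) < tol:   # primal residual check
            break
    return z

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 50
    A = rng.standard_normal((n, n))
    P = A @ A.T + n * np.eye(n)        # strongly convex quadratic term
    q = rng.standard_normal(n)
    x_star = accelerated_admm_qp(P, q, lb=-np.ones(n), ub=np.ones(n))
    print("solution norm:", np.linalg.norm(x_star))
```

For a general nonsmooth second function, such extrapolated variants are usually paired with a restart test on the residuals; that safeguard is omitted here to keep the sketch short.
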
Related papers (50 in total)
  • [1] Anderson Accelerated Douglas-Rachford Splitting
    Fu, Anqi
    Zhang, Junzi
    Boyd, Stephen
    [J]. SIAM JOURNAL ON SCIENTIFIC COMPUTING, 2020, 42 (06): : A3560 - A3583
  • [2] Douglas-Rachford Splitting: Complexity Estimates and Accelerated Variants
    Patrinos, Panagiotis
    Stella, Lorenzo
    Bemporad, Alberto
    [J]. 2014 IEEE 53RD ANNUAL CONFERENCE ON DECISION AND CONTROL (CDC), 2014, : 4234 - 4239
  • [3] Douglas-Rachford splitting and ADMM for nonconvex optimization: accelerated and Newton-type linesearch algorithms
    Themelis, Andreas
    Stella, Lorenzo
    Patrinos, Panagiotis
    [J]. COMPUTATIONAL OPTIMIZATION AND APPLICATIONS, 2022, 82 (02) : 395 - 440
  • [4] Diagonal Scaling in Douglas-Rachford Splitting and ADMM
    Giselsson, Pontus
    Boyd, Stephen
    [J]. 2014 IEEE 53RD ANNUAL CONFERENCE ON DECISION AND CONTROL (CDC), 2014, : 5033 - 5039
  • [5] An accelerated variance reducing stochastic method with Douglas-Rachford splitting
    Liu, Jingchang
    Xu, Linli
    Shen, Shuheng
    Ling, Qing
    [J]. MACHINE LEARNING, 2019, 108 (05) : 859 - 878
  • [6] Anderson Acceleration for Nonconvex ADMM Based on Douglas-Rachford Splitting
    Ouyang, Wenqing
    Peng, Yue
    Yao, Yuxin
    Zhang, Juyong
    Deng, Bailin
    [J]. COMPUTER GRAPHICS FORUM, 2020, 39 (05) : 221 - 239
  • [7] Douglas-Rachford splitting and ADMM for pathological convex optimization
    Ryu, Ernest K.
    Liu, Yanli
    Yin, Wotao
    [J]. COMPUTATIONAL OPTIMIZATION AND APPLICATIONS, 2019, 74 (03) : 747 - 778
  • [8] Linear Convergence and Metric Selection for Douglas-Rachford Splitting and ADMM
    Giselsson, Pontus
    Boyd, Stephen
    [J]. IEEE TRANSACTIONS ON AUTOMATIC CONTROL, 2017, 62 (02) : 532 - 544
  • [9] Douglas-Rachford Splitting and ADMM for Nonconvex Optimization: Tight Convergence Results
    Themelis, Andreas
    Patrinos, Panagiotis
    [J]. SIAM JOURNAL ON OPTIMIZATION, 2020, 30 (01) : 149 - 181