Randomized Primal-Dual Proximal Block Coordinate Updates

Cited by: 14
Authors
Gao, Xiang [1]
Xu, Yang-Yang [2]
Zhang, Shu-Zhong [1]
Affiliations
[1] Univ Minnesota, Dept Ind & Syst Engn, Minneapolis, MN USA
[2] Rensselaer Polytech Inst, Dept Math Sci, Troy, NY 12180 USA
Funding
National Science Foundation (USA)
Keywords
Primal-dual method; Alternating direction method of multipliers (ADMM); Randomized algorithm; Iteration complexity; First-order stochastic approximation;
DOI
10.1007/s40305-018-0232-4
CLC classification
C93 [Management]; O22 [Operations Research]
Subject classification codes
070105 ; 12 ; 1201 ; 1202 ; 120202 ;
Abstract
In this paper, we propose a randomized primal-dual proximal block coordinate updating framework for a general multi-block convex optimization model with a coupled objective function and linear constraints. Assuming mere convexity, we establish its O(1/t) convergence rate in terms of the objective value and feasibility measure. The framework includes several existing algorithms as special cases, such as a primal-dual method for bilinear saddle-point problems (PD-S), the proximal Jacobian alternating direction method of multipliers (Prox-JADMM), and a randomized variant of the ADMM for multi-block convex optimization. Our analysis recovers and/or strengthens the convergence properties of several existing algorithms. For example, for PD-S our result leads to the same order of convergence rate without the previously assumed boundedness condition on the constraint sets, and for Prox-JADMM the new result provides a convergence rate in terms of the objective value and the feasibility violation. It is well known that the original ADMM may fail to converge when the number of blocks exceeds two. Our result shows that if an appropriate randomization procedure is invoked to select the updating blocks, then a sublinear rate of convergence in expectation can be guaranteed for multi-block ADMM, without assuming any strong convexity. The new approach is also extended to solve problems where only a stochastic approximation of a subgradient of the objective is available, and we establish an O(1/√t) convergence rate of the extended approach for solving stochastic programming problems.
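For intuition, the following is a minimal sketch (not the authors' exact scheme) of a randomized block-coordinate, ADMM-style primal-dual update applied to a toy linearly constrained quadratic program. The penalty rho, proximal weight tau, dual step size, uniform block sampling rule, and problem data are all illustrative assumptions chosen so the example runs end to end.

```python
# Illustrative sketch only: randomized block-coordinate ADMM-style updates for
#     min_x  sum_i 0.5 * ||x_i - c_i||^2   s.t.  sum_i A_i x_i = b.
# Each iteration draws one block i uniformly at random, solves its augmented-
# Lagrangian subproblem (plus a proximal term) in closed form, and then updates
# the multiplier lam with a conservative (assumed) step size.
import numpy as np

rng = np.random.default_rng(0)
m, n_blocks, d = 5, 4, 3                      # constraint rows, blocks, block size
A = [rng.standard_normal((m, d)) for _ in range(n_blocks)]
c = [rng.standard_normal(d) for _ in range(n_blocks)]
b = sum(Ai @ rng.standard_normal(d) for Ai in A)   # makes the constraint feasible

rho, tau = 1.0, 1.0                           # penalty and proximal weights (assumed)
sigma = rho / n_blocks                        # damped dual step size (assumed)
x = [np.zeros(d) for _ in range(n_blocks)]
lam = np.zeros(m)

for t in range(5000):
    i = rng.integers(n_blocks)                # randomized block selection
    r = sum(A[j] @ x[j] for j in range(n_blocks) if j != i) - b
    # argmin_xi 0.5||xi - c_i||^2 + lam^T(A_i xi + r) + (rho/2)||A_i xi + r||^2
    #           + (tau/2)||xi - x_i^old||^2   (closed form since f_i is quadratic)
    H = (1.0 + tau) * np.eye(d) + rho * A[i].T @ A[i]
    g = c[i] + tau * x[i] - A[i].T @ (lam + rho * r)
    x[i] = np.linalg.solve(H, g)
    lam = lam + sigma * (sum(A[j] @ x[j] for j in range(n_blocks)) - b)   # dual ascent

print("feasibility violation:",
      np.linalg.norm(sum(A[j] @ x[j] for j in range(n_blocks)) - b))
```

The damped dual step and the per-block proximal term mirror the kind of safeguards that let single-block randomized updates remain stable without strong convexity; the specific constants above are not taken from the paper.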
Pages: 205-250 (46 pages)