A fully distributed dual gradient method with linear convergence for large-scale separable convex problems

Cited: 0
Authors
Necoara, Ion [1 ]
Nedich, Angelia [2 ]
Affiliations
[1] Univ Politehn Bucuresti, Automat Control & Syst Engn Dept, Bucharest, Romania
[2] Univ Illinois, Ind & Enterprise Syst Engn Dept, Urbana, IL 61801 USA
Keywords
MODEL-PREDICTIVE CONTROL; DECOMPOSITION; OPTIMIZATION; COMMUNICATION; ALGORITHM; SYSTEMS
DOI
Not available
CLC Classification
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
In this paper we propose a distributed dual gradient algorithm for minimizing linearly constrained separable convex problems and analyze its rate of convergence. In particular, we show that under the assumption that the Hessian of the primal objective function is bounded, the dual problem satisfies a global error bound property. Using this error bound property, we devise a fully distributed dual gradient scheme for which we derive a global linear rate of convergence. The proposed dual gradient method is fully distributed, requiring only local information, since it is based on a weighted stepsize. Our method can be applied in many applications, e.g. distributed model predictive control, network utility maximization, or optimal power flow.
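To illustrate the dual gradient idea behind the abstract, here is a minimal sketch for the problem class min Σᵢ fᵢ(xᵢ) s.t. Σᵢ Aᵢxᵢ = b, with illustrative strongly convex quadratic fᵢ. Each "agent" minimizes its local Lagrangian term, and the multiplier is updated by gradient ascent on the dual. Note this is a centralized-loop toy example with a simple 1/L stepsize; the paper's weighted stepsize and fully distributed implementation are not reproduced here, and all problem data below are made up.

```python
import numpy as np

# Toy instance of  min sum_i 0.5 x_i^T Q_i x_i  s.t.  sum_i A_i x_i = b.
# Q_i, A_i, b are random illustrative data, not from the paper.
rng = np.random.default_rng(0)
n, m, N = 3, 2, 4                                  # block size, constraints, agents
Q = [np.eye(n) * (i + 1) for i in range(N)]        # local Hessians (bounded, as assumed)
A = [rng.standard_normal((m, n)) for _ in range(N)]
b = rng.standard_normal(m)

# Dual function gradient is the primal residual; its Lipschitz constant L
# is the largest eigenvalue of sum_i A_i Q_i^{-1} A_i^T, giving a 1/L stepsize.
H = sum(A[i] @ np.linalg.solve(Q[i], A[i].T) for i in range(N))
alpha = 1.0 / np.linalg.eigvalsh(H).max()

lam = np.zeros(m)                                  # dual multiplier
for _ in range(2000):
    # each agent solves its local subproblem:
    # x_i(lam) = argmin_x 0.5 x^T Q_i x + lam^T A_i x = -Q_i^{-1} A_i^T lam
    x = [-np.linalg.solve(Q[i], A[i].T @ lam) for i in range(N)]
    residual = sum(A[i] @ x[i] for i in range(N)) - b
    lam = lam + alpha * residual                   # dual gradient ascent step

print(np.linalg.norm(residual))                    # primal feasibility gap
```

With strongly convex local objectives the dual is smooth and (here) strongly concave, so the residual shrinks at a linear rate, which is the kind of behavior the paper establishes under its error bound property.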
Pages: 304 - 309
Page count: 6
Related Papers
50 items in total
  • [1] On linear convergence of a distributed dual gradient algorithm for linearly constrained separable convex problems
    Necoara, Ion
    Nedelcu, Valentin
    [J]. AUTOMATICA, 2015, 55 : 209 - 216
  • [2] A dual gradient-projection method for large-scale strictly convex quadratic problems
    Gould, Nicholas I. M.
    Robinson, Daniel P.
    [J]. COMPUTATIONAL OPTIMIZATION AND APPLICATIONS, 2017, 67 (01) : 1 - 38
  • [4] A Conjugate Gradient Method with Global Convergence for Large-Scale Unconstrained Optimization Problems
    Yao, Shengwei
    Lu, Xiwen
    Wei, Zengxin
    [J]. JOURNAL OF APPLIED MATHEMATICS, 2013,
  • [5] A DOUBLE INCREMENTAL AGGREGATED GRADIENT METHOD WITH LINEAR CONVERGENCE RATE FOR LARGE-SCALE OPTIMIZATION
    Mokhtari, Aryan
    Gurbuzbalaban, Mert
    Ribeiro, Alejandro
    [J]. 2017 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2017, : 4696 - 4700
  • [6] Randomized dual proximal gradient for large-scale distributed optimization
    Notarnicola, Ivano
    Notarstefano, Giuseppe
    [J]. 2015 54TH IEEE CONFERENCE ON DECISION AND CONTROL (CDC), 2015, : 712 - 717
  • [7] Practical Incremental Gradient Method for Large-Scale Problems
    Huang, Junchu
    Zhou, Zhiheng
    Wang, Yifan
    Yang, Zhiwei
    [J]. PROCEEDINGS OF TENCON 2018 - 2018 IEEE REGION 10 CONFERENCE, 2018, : 1845 - 1848
  • [8] On the Distributed Method of Multipliers for Separable Convex Optimization Problems
    Sherson, Thomas
    Heusdens, Richard
    Kleijn, W. Bastiaan
    [J]. IEEE TRANSACTIONS ON SIGNAL AND INFORMATION PROCESSING OVER NETWORKS, 2019, 5 (03): : 495 - 510
  • [9] On the Linear Convergence of a Proximal Gradient Method for a Class of Nonsmooth Convex Minimization Problems
    Zhang, Haibin
    Jiang, Jiaojiao
    Luo, Zhi-Quan
    [J]. JOURNAL OF THE OPERATIONS RESEARCH SOCIETY OF CHINA, 2013, 1 (02) : 163 - 186
  • [10] Solving Large-Scale Linear Circuit Problems via Convex Optimization
    Lavaei, Javad
    Babakhani, Aydin
    Hajimiri, Ali
    Doyle, John C.
    [J]. PROCEEDINGS OF THE 48TH IEEE CONFERENCE ON DECISION AND CONTROL, 2009 HELD JOINTLY WITH THE 2009 28TH CHINESE CONTROL CONFERENCE (CDC/CCC 2009), 2009, : 4977 - 4984