Fast Augmented Lagrangian Method in the convex regime with convergence guarantees for the iterates

Citations: 0
Authors
Radu Ioan Boţ
Ernö Robert Csetnek
Dang-Khoa Nguyen
Affiliations
University of Vienna, Faculty of Mathematics
Source
Mathematical Programming, 2023, Vol. 200
Keywords
Augmented Lagrangian Method; Primal-dual numerical algorithm; Nesterov’s fast gradient method; Convergence rates; Convergence of iterates
MSC codes: 49M29; 65K05; 68Q25; 90C25; 65B99
DOI
Not available
Abstract
This work aims to minimize a continuously differentiable convex function with Lipschitz continuous gradient under linear equality constraints. The proposed inertial algorithm results from the discretization of the second-order primal-dual dynamical system with asymptotically vanishing damping term addressed by Boţ and Nguyen (J. Differential Equations 303:369–406, 2021), and it is formulated in terms of the Augmented Lagrangian associated with the minimization problem. The general setting we consider for the inertial parameters covers the three classical rules by Nesterov, Chambolle–Dossal and Attouch–Cabot used in the literature to formulate fast gradient methods. For these rules, we obtain in the convex regime convergence rates of order $\mathcal{O}(1/k^{2})$ for the primal-dual gap, the feasibility measure, and the objective function value. In addition, we prove that the generated sequence of primal-dual iterates converges to a primal-dual solution in a general setting that covers the latter two rules. This is the first result that establishes convergence of the sequence of iterates generated by a fast algorithm for linearly constrained convex optimization problems without additional assumptions such as strong convexity. We also emphasize that all convergence results of this paper are compatible with the ones obtained in Boţ and Nguyen (J. Differential Equations 303:369–406, 2021) in the continuous setting.
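To make the ingredients described in the abstract concrete, the sketch below is a minimal Python illustration, not the authors' exact scheme, of an inertial augmented Lagrangian iteration for minimizing f(x) subject to Ax = b: the primal-dual pair is extrapolated with a Nesterov-type inertial coefficient, the primal variable takes a gradient step on the Augmented Lagrangian at the extrapolated point, and the dual variable takes an ascent step on the feasibility residual. The quadratic objective, the random data, the step size and the inertial rule (k-1)/(k+2) are illustrative assumptions only.

import numpy as np

# Illustrative sketch of an inertial (accelerated) augmented Lagrangian scheme
# for  min f(x)  subject to  A x = b.  This is NOT the paper's exact algorithm;
# it only mimics its main ingredients: Nesterov-type extrapolation of the
# primal-dual pair and gradient-type updates on the Augmented Lagrangian
#   L_beta(x, lam) = f(x) + <lam, A x - b> + (beta/2) * ||A x - b||^2.

rng = np.random.default_rng(0)
n, m = 50, 10
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)
c = rng.standard_normal(n)

f = lambda x: 0.5 * np.dot(x - c, x - c)         # smooth convex objective (assumed)
grad_f = lambda x: x - c                         # Lipschitz continuous gradient (L = 1)

beta = 1.0                                       # augmentation parameter (assumed)
L_aug = 1.0 + beta * np.linalg.norm(A, 2) ** 2   # Lipschitz constant of grad_x L_beta
step = 1.0 / L_aug

x, x_prev = np.zeros(n), np.zeros(n)
lam, lam_prev = np.zeros(m), np.zeros(m)

for k in range(1, 5000):
    t = (k - 1) / (k + 2)                        # Nesterov/Chambolle-Dossal-type inertia
    y = x + t * (x - x_prev)                     # extrapolated primal point
    mu = lam + t * (lam - lam_prev)              # extrapolated dual point

    x_prev, lam_prev = x, lam
    residual = A @ y - b
    # primal step: gradient descent on x -> L_beta(x, mu) at the extrapolated point
    x = y - step * (grad_f(y) + A.T @ (mu + beta * residual))
    # dual step: ascent on the feasibility residual of the new primal iterate
    lam = mu + beta * (A @ x - b)

print("objective value    :", f(x))
print("feasibility measure:", np.linalg.norm(A @ x - b))

Under the paper's assumptions one would expect both the objective gap and the feasibility measure to decay at rate O(1/k^2) for a scheme of this type; this sketch is only meant to show the shape of the iteration, not to reproduce the analyzed algorithm.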
Pages: 147–197
Number of pages: 50
Related Papers
50 records in total
  • [3] On the convergence of inexact augmented Lagrangian methods for problems with convex constraints
    Galvan, Giulio
    Lapucci, Matteo
    Operations Research Letters, 2019, 47(3): 185–189
  • [4] Convergence rate of the augmented Lagrangian SQP method
    Kleis, D.
    Sachs, E. W.
    Journal of Optimization Theory and Applications, 1997, 95(1): 49–74
  • [5] Improving ultimate convergence of an augmented Lagrangian method
    Birgin, E. G.
    Martinez, J. M.
    Optimization Methods & Software, 2008, 23(2): 177–195
  • [7] On scaled stopping criteria for a safeguarded augmented Lagrangian method with theoretical guarantees
    Andreani, R.
    Haeser, G.
    Schuverdt, M. L.
    Secchin, L. D.
    Silva, P. J. S.
    Mathematical Programming Computation, 2022, 14: 121–146
  • [8] On the Convergence of a Distributed Augmented Lagrangian Method for Nonconvex Optimization
    Chatzipanagiotis, Nikolaos
    Zavlanos, Michael M.
    IEEE Transactions on Automatic Control, 2017, 62(9): 4405–4420
  • [9] New Convergence Properties of the Primal Augmented Lagrangian Method
    Zhou, Jinchuan
    Zhu, Xunzhi
    Pan, Lili
    Zhao, Wenling
    Abstract and Applied Analysis, 2011
  • [10] Adaptive inexact fast augmented Lagrangian methods for constrained convex optimization
    Patrascu, Andrei
    Necoara, Ion
    Tran-Dinh, Quoc
    Optimization Letters, 2017, 11: 609–626