AN ASYMPTOTICALLY SUPERLINEARLY CONVERGENT SEMISMOOTH NEWTON AUGMENTED LAGRANGIAN METHOD FOR LINEAR PROGRAMMING

Cited by: 20
Authors
Li, Xudong [1 ,2 ]
Sun, Defeng [3 ]
Toh, Kim-Chuan [4 ,5 ]
Affiliations
[1] Fudan Univ, Sch Data Sci, Shanghai, Peoples R China
[2] Fudan Univ, Shanghai Ctr Math Sci, Shanghai, Peoples R China
[3] Hong Kong Polytech Univ, Dept Appl Math, Hung Hom, Hong Kong, Peoples R China
[4] Natl Univ Singapore, Dept Math, Singapore, Singapore
[5] Natl Univ Singapore, Inst Operat Res & Analyt, Singapore, Singapore
Funding
National Natural Science Foundation of China
Keywords
linear programming; semismooth Newton method; augmented Lagrangian method; INTERIOR-POINT METHODS; PRECONDITIONING INDEFINITE SYSTEMS; ALGORITHM; MATRIX;
DOI
10.1137/19M1251795
CLC classification
O29 [Applied Mathematics]
Subject classification code
070104
Abstract
Powerful interior-point method (IPM) based commercial solvers, such as Gurobi and Mosek, have been hugely successful in solving large-scale linear programming (LP) problems. The high efficiency of these solvers depends critically on the sparsity of the problem data and on advanced matrix factorization techniques. For a large-scale LP problem whose data matrix A is dense (possibly structured), or whose normal matrix AA^T has a dense Cholesky factor (even with reordering), these solvers may incur excessive computational cost and/or extremely heavy memory usage in each interior-point iteration. Unfortunately, the natural remedy, i.e., IPM solvers based on iterative linear-system solvers, although able to avoid the explicit computation of the coefficient matrix and its factorization, is often not practically viable due to the inherent extreme ill-conditioning of the large-scale normal equation arising in each interior-point iteration. While recent progress has been made to alleviate this ill-conditioning via sophisticated preconditioning techniques, the difficulty remains challenging. To provide a better alternative for solving large-scale LPs with dense data or with an expensive normal-equation factorization, we propose a semismooth Newton based inexact proximal augmented Lagrangian (SNIPAL) method. Unlike classical IPMs, each iteration of SNIPAL requires solving a simpler yet better-conditioned semismooth Newton linear system, for which iterative methods can be used efficiently. Moreover, SNIPAL not only enjoys fast asymptotic superlinear convergence but is also proven to have a finite termination property. Numerical comparisons with Gurobi demonstrate the encouraging potential of SNIPAL for handling large-scale LP problems where the constraint matrix A has a dense representation, or where AA^T has a dense factorization even with an appropriate reordering.
For a few large LP instances arising from correlation clustering, our algorithm can be 20-100 times faster than the barrier method implemented in Gurobi for solving the problems to an accuracy of 10^{-8} in the relative KKT residual. However, when tested on some large sparse LP problems available in the public domain, our algorithm is not yet practically competitive against the barrier method in Gurobi, especially when the latter can cheaply compute the Schur complement matrix and its sparse Cholesky factorization in each iteration.
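The abstract's core idea — apply a proximal augmented Lagrangian method (ALM) to the LP and solve each subproblem with a semismooth Newton iteration on a better-conditioned linear system — can be sketched concretely. The NumPy toy below is an illustrative reconstruction for a standard-form LP, not the authors' implementation: SNIPAL uses inexact inner solves with iterative linear solvers and careful stopping rules, whereas this sketch takes full Newton steps and factors the small Newton system directly. The function name, parameter choices, and stopping criteria are all hypothetical.

```python
import numpy as np

def snipal_sketch(A, b, c, sigma=10.0, outer_iters=50, inner_iters=100, tol=1e-12):
    """Toy proximal ALM with a semismooth Newton inner solver for the
    standard-form LP  min c^T x  s.t.  Ax = b, x >= 0.

    The ALM is applied to the dual (max b^T y s.t. A^T y <= c), so x plays
    the role of the multiplier; Pi_+ denotes projection onto the
    nonnegative orthant, i.e. componentwise max(., 0).
    """
    m, n = A.shape
    x, y = np.zeros(n), np.zeros(m)
    for _ in range(outer_iters):
        y_prev = y.copy()
        # Inner loop: semismooth Newton on the proximal ALM subproblem in y.
        for _ in range(inner_iters):
            z = x + sigma * (A.T @ y - c)
            # Gradient of the subproblem objective, including the proximal term.
            grad = A @ np.maximum(z, 0.0) - b + (y - y_prev) / sigma
            if np.linalg.norm(grad) < tol:
                break
            d = (z > 0).astype(float)            # generalized Jacobian of Pi_+
            # Newton matrix sigma * A diag(d) A^T + I/sigma: positive definite,
            # and typically better conditioned than an IPM normal equation.
            H = sigma * (A * d) @ A.T + np.eye(m) / sigma
            y = y - np.linalg.solve(H, grad)     # full semismooth Newton step
        # Multiplier (primal) update: x <- Pi_+(x + sigma * (A^T y - c)).
        x = np.maximum(x + sigma * (A.T @ y - c), 0.0)
    return x, y
```

On the tiny LP min x1 + 2*x2 subject to x1 + x2 = 1, x >= 0 (optimal x* = (1, 0), dual y* = 1), the sketch recovers the solution to high accuracy; note that `d` makes the Newton matrix depend only on the active components of `z`, which is the source of the finite-termination behavior the paper proves for the full method.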
Pages: 2410-2440 (31 pages)
Related papers (50 total)
  • [31] An Augmented Lagrangian Primal-Dual Semismooth Newton Method for Multi-Block Composite Optimization
    Zhanwang Deng
    Kangkang Deng
    Jiang Hu
    Zaiwen Wen
    Journal of Scientific Computing, 2025, 102 (3)
  • [32] A Semismooth Newton Method for Fast, Generic Convex Programming
    Ali, Alnur
    Wong, Eric
    Kolter, J. Zico
    International Conference on Machine Learning, PMLR 70, 2017
  • [33] A semismooth Newton method for nonlinear symmetric cone programming
    Kong, Lingchen
    Meng, Qingmin
    MATHEMATICAL METHODS OF OPERATIONS RESEARCH, 2012, 76 (02) : 129 - 145
  • [35] Augmented Lagrangian method for large-scale linear programming problems
    Evtushenko, YG
    Golikov, AI
    Mollaverdy, N
    OPTIMIZATION METHODS & SOFTWARE, 2005, 20 (4-5): : 515 - 524
  • [37] A Semismooth Newton-based Augmented Lagrangian Algorithm for Density Matrix Least Squares Problems
    Liu, Yong-Jin
    Yu, Jing
    JOURNAL OF OPTIMIZATION THEORY AND APPLICATIONS, 2022, 195 (03) : 749 - 779
  • [38] A superlinearly convergent nonmonotone quasi-Newton method for unconstrained multiobjective optimization
    Mahdavi-Amiri, N.
    Sadaghiani, F.
    OPTIMIZATION METHODS & SOFTWARE, 2020, 35 (06): : 1223 - 1247
  • [39] Some superlinearly convergent inexact generalized Newton method for solving nonsmooth equations
    Smietanski, Marek J.
    OPTIMIZATION METHODS & SOFTWARE, 2012, 27 (03): : 405 - 417
  • [40] AUGMENTED LAGRANGIAN ALGORITHMS FOR LINEAR-PROGRAMMING
    GULER, O
    JOURNAL OF OPTIMIZATION THEORY AND APPLICATIONS, 1992, 75 (03) : 445 - 470