Composite convex optimization with global and local inexact oracles

Cited: 4
Authors
Sun, Tianxiao [1 ]
Necoara, Ion [2 ]
Quoc Tran-Dinh [1 ]
Affiliations
[1] Univ North Carolina Chapel Hill, Dept Stat & Operat Res, 333 Hanes Hall,CB 3260, Chapel Hill, NC 27599 USA
[2] Univ Politehn Bucuresti, Dept Automat Control & Syst Engn, Spl Independentei 313, Bucharest 060042, Romania
Funding
U.S. National Science Foundation;
Keywords
Self-concordant functions; Composite convex minimization; Local and global inexact oracles; Inexact proximal Newton-type method; Primal-dual second-order method;
DOI
10.1007/s10589-020-00174-2
Chinese Library Classification
C93 [Management]; O22 [Operations Research];
Discipline Codes
070105 ; 12 ; 1201 ; 1202 ; 120202 ;
Abstract
We introduce new global and local inexact oracle concepts for a wide class of convex functions in composite convex minimization. Such inexact oracles arise naturally in many situations, including primal-dual frameworks, barrier smoothing, and inexact evaluations of gradients and Hessians. We also provide examples showing that the class of convex functions equipped with these new inexact oracles is larger than the standard self-concordant and Lipschitz-gradient function classes. Further, we investigate several properties of convex and/or self-concordant functions under our inexact oracles that are useful for algorithmic development. Next, we apply our theory to develop inexact proximal Newton-type schemes for solving general composite convex optimization problems equipped with such inexact oracles. Our theoretical results consist of new optimization algorithms accompanied by global convergence guarantees for a wide class of composite convex optimization problems. When the first objective term is additionally self-concordant, we establish different local convergence results for our method. In particular, we prove that, depending on the choice of accuracy levels of the inexact second-order oracles, we obtain local convergence rates ranging from linear and superlinear to quadratic. In special cases, where convergence bounds are known, our theory recovers the best known rates. We also apply our settings to derive a new primal-dual method for composite convex minimization problems involving linear operators. Finally, we present some representative numerical examples to illustrate the benefit of the new algorithms.
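To make the abstract's setting concrete, the sketch below shows the general shape of an inexact proximal Newton step for a composite problem min f(x) + g(x), where the scaled proximal subproblem is solved only approximately (one simple instance of an "inexact second-order oracle"). All function names, the fixed inner iteration count, and the choice g = lam*||x||_1 are illustrative assumptions; the paper's actual method additionally handles self-concordant f and explicitly controlled oracle accuracies, which this toy does not.

```python
import numpy as np

def soft_threshold(z, t):
    """Proximal operator of t * ||.||_1 (used here for g = lam * ||x||_1)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def inexact_prox_newton(A, b, lam, x0, outer_iters=20, inner_iters=30):
    """Minimize 0.5 * ||A x - b||^2 + lam * ||x||_1 (illustrative sketch).

    Each outer iteration forms the second-order model of f at x and solves
    the scaled proximal subproblem *inexactly*, via a fixed number of
    proximal-gradient steps on the model -- a crude stand-in for an
    inexact second-order oracle with a fixed accuracy level.
    """
    x = x0.copy()
    H = A.T @ A                       # Hessian of f (could itself be inexact)
    L = np.linalg.norm(H, 2)          # Lipschitz constant of the model gradient
    for _ in range(outer_iters):
        grad = A.T @ (A @ x - b)
        # Inexactly solve: min_d grad @ d + 0.5 * d @ H @ d + g(x + d)
        z = x.copy()
        for _ in range(inner_iters):
            model_grad = grad + H @ (z - x)
            z = soft_threshold(z - model_grad / L, lam / L)
        x = z                         # full step; damping/line search omitted
    return x
```

A damped step size (rather than the full step taken here) is what the self-concordant analysis would dictate far from the solution; near the solution, the accuracy of the inner solve is what governs whether the local rate is linear, superlinear, or quadratic.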
Pages: 69-124 (56 pages)