Stochastic first-order methods for convex and nonconvex functional constrained optimization

Cited by: 19
Authors
Boob, Digvijay [1 ]
Deng, Qi [2 ]
Lan, Guanghui [1 ]
Affiliations
[1] Georgia Inst Technol, Ind & Syst Engn, Atlanta, GA 30332 USA
[2] Shanghai Univ Finance & Econ, Sch Informat Management & Engn, Shanghai, Peoples R China
Keywords
Functional constrained optimization; Stochastic algorithms; Convex and nonconvex optimization; Acceleration; ALGORITHMS;
DOI
10.1007/s10107-021-01742-y
Chinese Library Classification (CLC)
TP31 [Computer Software];
Discipline codes
081202; 0835;
Abstract
Functional constrained optimization is becoming more and more important in machine learning and operations research. Such problems have potential applications in risk-averse machine learning, semi-supervised learning, and robust optimization, among others. In this paper, we first present a novel Constraint Extrapolation (ConEx) method for solving convex functional constrained problems, which utilizes linear approximations of the constraint functions to define the extrapolation (or acceleration) step. We show that this method is a unified algorithm that achieves the best-known rate of convergence for solving different functional constrained convex composite problems, including convex or strongly convex, and smooth or nonsmooth problems with stochastic objective and/or stochastic constraints. Many of these rates of convergence were in fact obtained for the first time in the literature. In addition, ConEx is a single-loop algorithm that does not involve any penalty subproblems. Contrary to existing primal-dual methods, it does not require the projection of Lagrangian multipliers onto a (possibly unknown) bounded set. Second, for nonconvex functional constrained problems, we introduce a new proximal point method which transforms the initial nonconvex problem into a sequence of convex problems by adding quadratic terms to both the objective and the constraints. Under a certain MFCQ-type assumption, we establish the convergence and rate of convergence of this method to KKT points when the convex subproblems are solved exactly or inexactly. For large-scale and stochastic problems, we present a more practical proximal point method in which the approximate solutions of the subproblems are computed by the aforementioned ConEx method.
Under a strong feasibility assumption, we establish the total iteration complexity of ConEx required by this inexact proximal point method for a variety of problem settings, including nonconvex smooth or nonsmooth problems with stochastic objective and/or stochastic constraints. To the best of our knowledge, most of these convergence and complexity results of the proximal point method for nonconvex problems also seem to be new in the literature.
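To make the abstract's description concrete, the following is a minimal illustrative sketch (not the authors' code) of a single-loop primal-dual update in the spirit of constraint extrapolation for min f(x) s.t. g(x) ≤ 0: the dual step uses an extrapolated constraint value, and no penalty subproblem or bounded multiplier set is needed. The function name `conex_sketch`, the toy problem, and all step sizes are assumptions for illustration, not the paper's actual algorithm or parameters.

```python
def conex_sketch(f_grad, g, g_grad, x0, steps=2000, tau=0.01, sigma=0.01, theta=1.0):
    """Illustrative single-loop primal-dual sketch with constraint extrapolation
    for min f(x) s.t. g(x) <= 0 (scalar x, scalar constraint).

    tau, sigma: primal/dual step sizes; theta: extrapolation weight.
    """
    x = float(x0)
    lam = 0.0                # Lagrange multiplier for the single constraint
    ell_prev = g(x)          # previous constraint value, used for extrapolation
    for _ in range(steps):
        ell = g(x)
        s = ell + theta * (ell - ell_prev)   # constraint extrapolation step
        lam = max(0.0, lam + sigma * s)      # dual ascent, projected onto R_+
        # primal gradient step on the Lagrangian f(x) + lam * g(x)
        x = x - tau * (f_grad(x) + lam * g_grad(x))
        ell_prev = ell
    return x, lam
```

As a usage example under these assumptions, minimizing f(x) = x² subject to 1 − x ≤ 0 (i.e., x ≥ 1) has KKT solution (x*, λ*) = (1, 2); `conex_sketch(lambda x: 2*x, lambda x: 1 - x, lambda x: -1.0, 0.0)` returns values close to that pair.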
Pages: 215-279
Number of pages: 65
Related papers
50 records
  • [1] Stochastic first-order methods for convex and nonconvex functional constrained optimization
    Digvijay Boob
    Qi Deng
    Guanghui Lan
    [J]. Mathematical Programming, 2023, 197 : 215 - 279
  • [2] Convergence of First-Order Methods for Constrained Nonconvex Optimization with Dependent Data
    Alacaoglu, Ahmet
    Lyu, Hanbaek
    [J]. INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 202, 2023, 202 : 458 - 489
  • [3] First-Order Methods for Convex Optimization
    Dvurechensky, Pavel
    Shtern, Shimrit
    Staudigl, Mathias
    [J]. EURO JOURNAL ON COMPUTATIONAL OPTIMIZATION, 2021, 9
  • [4] Faster First-Order Methods for Stochastic Non-Convex Optimization on Riemannian Manifolds
    Zhou, Pan
    Yuan, Xiao-Tong
    Yan, Shuicheng
    Feng, Jiashi
    [J]. IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2021, 43 (02) : 459 - 472
  • [5] Faster First-Order Methods for Stochastic Non-Convex Optimization on Riemannian Manifolds
    Zhou, Pan
    Yuan, Xiao-Tong
    Feng, Jiashi
    [J]. 22ND INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 89, 2019, 89 : 138 - 147
  • [6] A Random Walk Approach to First-Order Stochastic Convex Optimization
    Vakili, Sattar
    Zhao, Qing
    [J]. 2019 IEEE INTERNATIONAL SYMPOSIUM ON INFORMATION THEORY (ISIT), 2019, : 395 - 399
  • [7] A new Lagrangian-based first-order method for nonconvex constrained optimization
    Kim, Jong Gwang
    [J]. OPERATIONS RESEARCH LETTERS, 2023, 51 (03) : 357 - 363
  • [8] Stochastic First-Order Algorithms for Constrained Distributionally Robust Optimization
    Im, Hyungki
    Grigas, Paul
    [J]. INFORMS JOURNAL ON COMPUTING, 2024,
  • [9] SLM: A Smoothed First-Order Lagrangian Method for Structured Constrained Nonconvex Optimization
    Lu, Songtao
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023,
  • [10] A first-order multigrid method for bound-constrained convex optimization
    Kocvara, Michal
    Mohammed, Sudaba
    [J]. OPTIMIZATION METHODS & SOFTWARE, 2016, 31 (03): : 622 - 644