Convergence rates of subgradient methods for quasi-convex optimization problems

Cited: 0
Authors
Yaohua Hu
Jiawen Li
Carisa Kwok Wai Yu
Affiliations
[1] Shenzhen University,Shenzhen Key Laboratory of Advanced Machine Learning and Applications, College of Mathematics and Statistics
[2] Shenzhen University,College of Mathematics and Statistics
[3] The Hang Seng University of Hong Kong,Department of Mathematics, Statistics and Insurance
Keywords
Quasi-convex programming; Subgradient method; Iteration complexity; Convergence rates; Primary: 65K05; 90C26; Secondary: 49M37;
DOI
Not available
Abstract
Quasi-convex optimization plays a pivotal role in many fields, including economics and finance; the subgradient method is an effective iterative algorithm for solving large-scale quasi-convex optimization problems. In this paper, we investigate the quantitative convergence theory, including the iteration complexity and convergence rates, of various subgradient methods for solving quasi-convex optimization problems in a unified framework. In particular, we consider a sequence satisfying a general (inexact) basic inequality and establish a global convergence theorem and the iteration complexity under the constant, diminishing, or dynamic stepsize rules. More importantly, we establish linear (or sublinear) convergence rates of the sequence under the additional assumptions of weak sharp minima of Hölderian order and upper-bounded noise. These convergence theorems are applied to establish the iteration complexity and convergence rates of several subgradient methods, including the standard, inexact, and conditional subgradient methods, for solving quasi-convex optimization problems under the Hölder condition and/or the weak sharp minima of Hölderian order.
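To illustrate the kind of algorithm the paper analyzes, the following is a minimal sketch of the standard quasi-subgradient method with a diminishing stepsize rule: at each iteration, the iterate moves opposite a *unit* quasi-subgradient (a unit normal to the current sublevel set). The objective f(x) = sqrt(||x||_1) used here is a simple quasi-convex but non-convex test function; it, and all names below, are illustrative choices, not taken from the paper.

```python
import numpy as np

def f(x):
    # A quasi-convex (but non-convex) test objective with minimizer 0.
    return np.sqrt(np.abs(x).sum())

def unit_quasi_subgradient(x):
    # sign(x) is normal to the l1-ball sublevel set {y : ||y||_1 <= ||x||_1};
    # normalize it to obtain a unit quasi-subgradient.
    g = np.sign(x).astype(float)
    n = np.linalg.norm(g)
    return g / n if n > 0 else g

def quasi_subgradient_method(x0, num_iters=500):
    # x_{k+1} = x_k - v_k * g_k with diminishing stepsize v_k = 1/(k+1).
    x = np.asarray(x0, dtype=float)
    best_x, best_f = x.copy(), f(x)
    for k in range(num_iters):
        v_k = 1.0 / (k + 1)
        x = x - v_k * unit_quasi_subgradient(x)
        if f(x) < best_f:          # track the best iterate so far
            best_x, best_f = x.copy(), f(x)
    return best_x, best_f

best_x, best_f = quasi_subgradient_method([3.0, -2.0])
```

Because the stepsizes are diminishing but non-summable, the best recorded objective value converges toward the minimum; the linear/sublinear rates studied in the paper quantify how fast this happens under weak sharp minima of Hölderian order.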
Pages: 183 - 212
Page count: 29
Related papers
50 results
  • [1] Convergence rates of subgradient methods for quasi-convex optimization problems
    Hu, Yaohua
    Li, Jiawen
    Yu, Carisa Kwok Wai
    COMPUTATIONAL OPTIMIZATION AND APPLICATIONS, 2020, 77 (01) : 183 - 212
  • [2] Quasi-convex feasibility problems: Subgradient methods and convergence rates
    Hu, Yaohua
    Li, Gongnong
    Yu, Carisa Kwok Wai
    Yip, Tsz Leung
    EUROPEAN JOURNAL OF OPERATIONAL RESEARCH, 2022, 298 (01) : 45 - 58
  • [3] Inexact subgradient methods for quasi-convex optimization problems
    Hu, Yaohua
    Yang, Xiaoqi
    Sim, Chee-Khian
    EUROPEAN JOURNAL OF OPERATIONAL RESEARCH, 2015, 240 (02) : 315 - 327
  • [4] CONDITIONAL SUBGRADIENT METHODS FOR CONSTRAINED QUASI-CONVEX OPTIMIZATION PROBLEMS
    Hu, Yaohua
    Yu, Carisa Kwok Wai
    Li, Chong
    Yang, Xiaoqi
    JOURNAL OF NONLINEAR AND CONVEX ANALYSIS, 2016, 17 (10) : 2143 - 2158
  • [5] STOCHASTIC SUBGRADIENT METHOD FOR QUASI-CONVEX OPTIMIZATION PROBLEMS
    Hu, Yaohua
    Yu, Carisa Kwok Wai
    Li, Chong
    JOURNAL OF NONLINEAR AND CONVEX ANALYSIS, 2016, 17 (04) : 711 - 724
  • [6] QUASI-SUBGRADIENT METHODS WITH BREGMAN DISTANCE FOR QUASI-CONVEX FEASIBILITY PROBLEMS
    Hu, Yaohua
    Li, Jingchao
    Liu, Yanyan
    Yu, Carisa Kwok Wai
    JOURNAL OF NONLINEAR AND VARIATIONAL ANALYSIS, 2024, 8 (03): : 381 - 395
  • [7] Adaptive subgradient methods for mathematical programming problems with quasi-convex functions
    Ablaev, S. S.
    Stonyakin, F. S.
    Alkousa, M. S.
    Gasnikov, A. V.
    TRUDY INSTITUTA MATEMATIKI I MEKHANIKI URO RAN, 2023, 29 (03): : 7 - 25
  • [8] MULTIPLE-SETS SPLIT QUASI-CONVEX FEASIBILITY PROBLEMS: ADAPTIVE SUBGRADIENT METHODS WITH CONVERGENCE GUARANTEE
    Hu, Yaohua
    Li, Gang
    Li, Minghua
    Yu, Carisa Kwok Wai
    JOURNAL OF NONLINEAR AND VARIATIONAL ANALYSIS, 2022, 6 (02): : 15 - 33
  • [9] Abstract convergence theorem for quasi-convex optimization problems with applications
    Yu, Carisa Kwok Wai
    Hu, Yaohua
    Yang, Xiaoqi
    Choy, Siu Kai
    OPTIMIZATION, 2019, 68 (07) : 1289 - 1304
  • [10] STOCHASTIC QUASI-SUBGRADIENT METHOD FOR STOCHASTIC QUASI-CONVEX FEASIBILITY PROBLEMS
    Li, Gang
    Li, Minghua
    Hu, Yaohua
    DISCRETE AND CONTINUOUS DYNAMICAL SYSTEMS-SERIES S, 2022, 15 (04): : 713 - 725