A feasible smoothing accelerated projected gradient method for nonsmooth convex optimization

Cited by: 0
Authors
Nishioka, Akatsuki [1 ]
Kanno, Yoshihiro [1 ,2 ]
Affiliations
[1] Univ Tokyo, Dept Math Informat, Hongo 7-3-1, Bunkyo-ku, Tokyo 1138656, Japan
[2] Univ Tokyo, Math & Informat Ctr, Hongo 7-3-1, Bunkyo-ku, Tokyo 1138656, Japan
Keywords
Smoothing method; Accelerated gradient method; Convergence rate; Structural optimization; Eigenvalue optimization;
DOI
10.1016/j.orl.2024.107181
CLC classification: C93 [Management]; O22 [Operations Research]
Subject classification codes: 070105; 12; 1201; 1202; 120202
Abstract
Smoothing accelerated gradient methods achieve faster convergence rates than the subgradient method for some nonsmooth convex optimization problems. However, Nesterov's extrapolation may require gradients at infeasible points, so these methods cannot be applied to some structural optimization problems. We introduce a variant of the smoothing accelerated projected gradient method in which every iterate is feasible. An O(k^{-1} log k) convergence rate is obtained via a Lyapunov function argument. We conduct a numerical experiment on the robust compliance optimization of a truss structure. (c) 2024 The Author(s). Published by Elsevier B.V. This is an open access article under the CC BY license (http://creativecommons.org/licenses/by/4.0/).
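The idea in the abstract can be illustrated on a toy problem: minimize a nonsmooth l1-type objective over a box, replacing the absolute value with a Huber smoothing whose parameter mu_k shrinks as 1/k, and projecting the extrapolated point onto the feasible set before any gradient is evaluated, so that every point queried is feasible. This is a minimal hypothetical sketch, not the authors' exact scheme: the function names, step-size rule, and momentum weight below are our own illustrative choices.

```python
import numpy as np

def huber_grad(z, mu):
    # Gradient of the Huber smoothing of |.| with parameter mu:
    # z/mu where |z| <= mu, and sign(z) outside that region.
    return np.clip(z / mu, -1.0, 1.0)

def feasible_smoothing_apg(c, n_iter=500, lo=0.0, hi=1.0):
    """Minimize f(x) = sum_i |x_i - c_i| over the box [lo, hi]^n.

    Illustrative variant in which the extrapolated point is projected
    back onto the feasible set before the gradient is taken, so every
    gradient evaluation happens at a feasible point.
    """
    proj = lambda z: np.clip(z, lo, hi)   # projection onto the box
    x = proj(np.zeros_like(c))
    x_prev = x.copy()
    for k in range(1, n_iter + 1):
        mu = 1.0 / k                  # shrinking smoothing parameter
        step = mu                     # smoothed gradient is (1/mu)-Lipschitz
        beta = (k - 1) / (k + 2)      # Nesterov-style momentum weight
        y = proj(x + beta * (x - x_prev))   # project, THEN take gradient
        g = huber_grad(y - c, mu)
        x_prev, x = x, proj(y - step * g)
    return x

# The box-constrained minimizer of sum |x_i - c_i| is simply clip(c, lo, hi).
c = np.array([0.3, -0.5, 1.7])
print(feasible_smoothing_apg(c))  # close to [0.3, 0.0, 1.0]
```

Projecting the extrapolated point `y` (rather than only the gradient step) is the key difference from the standard accelerated scheme, and is what keeps every gradient query feasible, mirroring the property the paper establishes for its method.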
Pages: 5