Accelerated Zeroth-order Method for Non-Smooth Stochastic Convex Optimization Problem with Infinite Variance

Cited by: 0
Authors
Kornilov, Nikita [1 ,2 ]
Shamir, Ohad [3 ]
Lobanov, Aleksandr [1 ,4 ]
Dvinskikh, Darina [4 ,5 ]
Gasnikov, Alexander [1 ,2 ,4 ]
Shibaev, Innokentiy [1 ,6 ]
Gorbunov, Eduard [7 ]
Horvath, Samuel [7 ]
Affiliations
[1] MIPT, Dolgoprudnyi, Russia
[2] SkolTech, Moscow, Russia
[3] Weizmann Inst Sci, Rehovot, Israel
[4] RAS, ISP, Moscow, Russia
[5] HSE Univ, Moscow, Russia
[6] RAS, IITP, Moscow, Russia
[7] MBZUAI, Abu Dhabi, U Arab Emirates
DOI: Not available
CLC Classification: TP18 [Theory of Artificial Intelligence]
Subject Classification Codes: 081104; 0812; 0835; 1405
Abstract
In this paper, we consider non-smooth stochastic convex optimization with two function evaluations per round under infinite noise variance. In the classical setting, when the noise has finite variance, an optimal algorithm built upon the batched accelerated gradient method was proposed in [17]. This optimality is defined in terms of iteration and oracle complexity, as well as the maximal admissible level of adversarial noise. However, the finite-variance assumption is restrictive and may not hold in many practical scenarios. To address this, we show how to adapt a refined clipped version of the accelerated gradient method (Stochastic Similar Triangles) from [35] to a two-point zeroth-order oracle. This adaptation requires extending the batching technique to accommodate infinite variance, a non-trivial task that stands as a distinct contribution of this paper.
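To make the setting concrete, the sketch below illustrates the basic building blocks the abstract describes: a two-point zeroth-order gradient estimator (two function evaluations sharing one noise realization), batching, and norm clipping to keep heavy-tailed samples from dominating the update. This is a minimal illustration only, not the paper's accelerated Stochastic Similar Triangles method; the function names, step sizes, clipping level, toy objective, and the heavy-tailed noise model (Student's t with 1.5 degrees of freedom, which has zero mean but infinite variance) are all illustrative assumptions.

```python
import numpy as np

def clipped_two_point_gradient(oracle, x, batch_size=16, tau=1e-3,
                               clip_level=10.0, rng=None):
    """Batched two-point zeroth-order gradient estimate with norm clipping.

    `oracle(x, xi)` returns a noisy evaluation of f at x under noise
    realization xi. Each batch element draws one random direction e
    (uniform on the unit sphere) and one noise sample xi shared by both
    evaluations, i.e. a two-point oracle. The batch average is clipped
    in Euclidean norm so a single heavy-tailed sample cannot dominate.
    """
    rng = np.random.default_rng() if rng is None else rng
    d = x.size
    g = np.zeros(d)
    for _ in range(batch_size):
        e = rng.standard_normal(d)
        e /= np.linalg.norm(e)            # direction on the unit sphere
        xi = rng.standard_t(1.5)          # infinite variance, zero mean
        g += d * (oracle(x + tau * e, xi)
                  - oracle(x - tau * e, xi)) / (2.0 * tau) * e
    g /= batch_size
    norm = np.linalg.norm(g)
    return g if norm <= clip_level else g * (clip_level / norm)

if __name__ == "__main__":
    # Toy non-smooth convex objective f(x) = sum_i |x_i - 1| observed
    # through multiplicative heavy-tailed noise (a hypothetical model).
    def oracle(x, xi):
        return (1.0 + 0.5 * xi) * np.sum(np.abs(x - 1.0))

    rng = np.random.default_rng(0)
    x = np.zeros(5)
    for _ in range(300):
        x -= 0.1 * clipped_two_point_gradient(oracle, x, rng=rng)
    print(np.round(x, 2))  # should drift toward the minimizer (1, ..., 1)
```

In the paper itself, such clipped batched estimates feed an accelerated (Similar Triangles) iteration; the plain subgradient-style loop above is only the simplest consumer of the estimator.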
Pages: 20
Related Papers (50 in total)
  • [1] Zeroth-Order Random Subspace Algorithm for Non-smooth Convex Optimization
    Nozawa, Ryota
    Poirion, Pierre-Louis
    Takeda, Akiko
    JOURNAL OF OPTIMIZATION THEORY AND APPLICATIONS, 2025, 204 (03)
  • [2] Accelerated Zeroth-Order Algorithm for Stochastic Distributed Non-Convex Optimization
    Zhang, Shengjun
    Bailey, Colleen P.
    2022 AMERICAN CONTROL CONFERENCE (ACC), 2022: 4274-4279
  • [3] A ZEROTH-ORDER PROXIMAL STOCHASTIC GRADIENT METHOD FOR WEAKLY CONVEX STOCHASTIC OPTIMIZATION
    Pougkakiotis, Spyridon
    Kalogerias, Dionysis
    SIAM JOURNAL ON SCIENTIFIC COMPUTING, 2023, 45 (05): A2679-A2702
  • [4] Zeroth-Order Stochastic Variance Reduction for Nonconvex Optimization
    Liu, Sijia
    Kailkhura, Bhavya
    Chen, Pin-Yu
    Ting, Paishun
    Chang, Shiyu
    Amini, Lisa
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 31 (NIPS 2018), 2018, 31
  • [5] Relatively accelerated stochastic gradient algorithm for a class of non-smooth convex optimization problem
    Zhang, Wenjuan
    Feng, Xiangchu
    Xiao, Feng
    Huang, Shujuan
    Li, Huan
    Xi'an Dianzi Keji Daxue Xuebao/Journal of Xidian University, 2024, 51 (03): 147-157
  • [6] A Generic Approach for Accelerating Stochastic Zeroth-Order Convex Optimization
    Yu, Xiaotian
    King, Irwin
    Lyu, Michael R.
    Yang, Tianbao
    PROCEEDINGS OF THE TWENTY-SEVENTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2018: 3040-3046
  • [7] Zeroth-order (Non)-Convex Stochastic Optimization via Conditional Gradient and Gradient Updates
    Balasubramanian, Krishnakumar
    Ghadimi, Saeed
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 31 (NIPS 2018), 2018, 31
  • [8] A zeroth order method for stochastic weakly convex optimization
    Kungurtsev, V
    Rinaldi, F.
    COMPUTATIONAL OPTIMIZATION AND APPLICATIONS, 2021, 80 (03): 731-753
  • [9] Stochastic Zeroth-order Optimization in High Dimensions
    Wang, Yining
    Du, Simon S.
    Balakrishnan, Sivaraman
    Singh, Aarti
    INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 84, 2018