Estimate Sequences for Stochastic Composite Optimization: Variance Reduction, Acceleration, and Robustness to Noise

Cited by: 0
Authors
Kulunchakov, Andrei [1 ]
Mairal, Julien [1 ]
Affiliations
[1] Univ Grenoble Alpes, INRIA, Grenoble INP, CNRS,LJK, F-38000 Grenoble, France
Funding
European Research Council
Keywords
convex optimization; variance reduction; stochastic optimization; approximation algorithms; gradient methods
DOI
None available
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology]
Subject Classification Code
0812
Abstract
In this paper, we propose a unified view of gradient-based algorithms for stochastic convex composite optimization by extending the concept of estimate sequence introduced by Nesterov. More precisely, we interpret a large class of stochastic optimization methods as procedures that iteratively minimize a surrogate of the objective, which covers the stochastic gradient descent method and variants of the incremental approaches SAGA, SVRG, and MISO/Finito/SDCA. This point of view has several advantages: (i) we provide a simple generic proof of convergence for all of the aforementioned methods; (ii) we naturally obtain new algorithms with the same guarantees; (iii) we derive generic strategies to make these algorithms robust to stochastic noise, which is useful when data is corrupted by small random perturbations. Finally, we propose a new accelerated stochastic gradient descent algorithm and a new accelerated SVRG algorithm that is robust to stochastic noise.
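Among the methods the abstract covers, SVRG is the canonical example of variance reduction: each epoch anchors a full gradient at a reference point and uses it to correct the stochastic gradients. The sketch below is a minimal illustration of that variance-reduction idea on a toy least-squares problem; it is not the paper's estimate-sequence construction, and all names, the step size, and the problem data are illustrative assumptions.

```python
import numpy as np

# Toy problem: minimize f(x) = (1/n) * sum_i 0.5 * (a_i^T x - b_i)^2,
# with b generated so that x_true is the exact minimizer (illustrative setup).
rng = np.random.default_rng(0)
n, d = 200, 5
A = rng.standard_normal((n, d))
x_true = rng.standard_normal(d)
b = A @ x_true

def grad_i(x, i):
    """Gradient of the i-th component f_i(x) = 0.5 * (a_i^T x - b_i)^2."""
    return A[i] * (A[i] @ x - b[i])

def full_grad(x):
    """Full gradient of f, averaged over all n components."""
    return A.T @ (A @ x - b) / n

def svrg(x0, step=0.01, epochs=30, inner=None):
    """Plain SVRG: per epoch, anchor a full gradient at a reference point
    x_ref, then take `inner` stochastic steps with the variance-reduced
    estimate g_i(x) - g_i(x_ref) + full_grad(x_ref)."""
    inner = n if inner is None else inner
    x_ref = x0.copy()
    for _ in range(epochs):
        g_ref = full_grad(x_ref)  # computed once per epoch
        x = x_ref.copy()
        for _ in range(inner):
            i = rng.integers(n)
            # Unbiased gradient estimate whose variance vanishes
            # as x and x_ref approach the optimum.
            g = grad_i(x, i) - grad_i(x_ref, i) + g_ref
            x -= step * g
        x_ref = x
    return x_ref

x_hat = svrg(np.zeros(d))
print(np.linalg.norm(x_hat - x_true))  # small: SVRG converges linearly here
```

In the paper's estimate-sequence view, each such update would be read as minimizing an iteratively refined surrogate of the objective; the correction term `- grad_i(x_ref, i) + g_ref` is what distinguishes SVRG from plain stochastic gradient descent.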
Pages: 52
Related Papers
50 records total (first 10 shown; one duplicate entry merged below)
  • [1] Estimate sequences for stochastic composite optimization: Variance reduction, acceleration, and robustness to noise
    Kulunchakov, Andrei
    Mairal, Julien
    Journal of Machine Learning Research, 2020, 21
  • [2] Estimate Sequences for Variance-Reduced Stochastic Composite Optimization
    Kulunchakov, Andrei
    Mairal, Julien
    International Conference on Machine Learning, Vol 97, 2019, 97
  • [3] Variance Reduction-Boosted Byzantine Robustness in Decentralized Stochastic Optimization
    Peng, Jie
    Li, Weiyu
    Ling, Qing
    2022 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2022: 4283-4287
  • [4] On variance reduction for stochastic smooth convex optimization with multiplicative noise
    Jofré, Alejandro
    Thompson, Philip
    Mathematical Programming, 2019, 174 (1-2): 253-292
  • [5] Multilevel Composite Stochastic Optimization via Nested Variance Reduction
    Zhang, Junyu
    Xiao, Lin
    SIAM Journal on Optimization, 2021, 31 (02): 1131-1157
  • [6] Stochastic Variance Reduction for Nonconvex Optimization
    Reddi, Sashank J.
    Hefny, Ahmed
    Sra, Suvrit
    Poczos, Barnabas
    Smola, Alex
    International Conference on Machine Learning, Vol 48, 2016, 48
  • [7] Stochastic Nested Variance Reduction for Nonconvex Optimization
    Zhou, Dongruo
    Xu, Pan
    Gu, Quanquan
    Journal of Machine Learning Research, 2020, 21
  • [8] Stochastic Nested Variance Reduction for Nonconvex Optimization
    Zhou, Dongruo
    Xu, Pan
    Gu, Quanquan
    Advances in Neural Information Processing Systems 31 (NIPS 2018), 2018, 31
  • [9] A Generic Acceleration Framework for Stochastic Composite Optimization
    Kulunchakov, Andrei
    Mairal, Julien
    Advances in Neural Information Processing Systems 32 (NIPS 2019), 2019, 32