Estimate Sequences for Stochastic Composite Optimization: Variance Reduction, Acceleration, and Robustness to Noise

Cited: 0
Authors
Kulunchakov, Andrei [1]
Mairal, Julien [1]
Affiliations
[1] Univ Grenoble Alpes, INRIA, Grenoble INP, CNRS, LJK, F-38000 Grenoble, France
Funding
European Research Council
Keywords
convex optimization; variance reduction; stochastic optimization; approximation algorithms; gradient methods
DOI
None available
Chinese Library Classification
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
In this paper, we propose a unified view of gradient-based algorithms for stochastic convex composite optimization by extending the concept of estimate sequence introduced by Nesterov. More precisely, we interpret a large class of stochastic optimization methods as procedures that iteratively minimize a surrogate of the objective, which covers the stochastic gradient descent method and variants of the incremental approaches SAGA, SVRG, and MISO/Finito/SDCA. This point of view has several advantages: (i) we provide a simple generic proof of convergence for all of the aforementioned methods; (ii) we naturally obtain new algorithms with the same guarantees; (iii) we derive generic strategies to make these algorithms robust to stochastic noise, which is useful when data is corrupted by small random perturbations. Finally, we propose a new accelerated stochastic gradient descent algorithm and a new accelerated SVRG algorithm that is robust to stochastic noise.
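To make the variance-reduction idea in the abstract concrete, the sketch below implements a standard prox-SVRG loop for a composite objective (least squares plus an l1 penalty): a full gradient is computed at a periodic anchor point, each inner step uses the variance-reduced estimate ∇f_i(x) − ∇f_i(x̃) + ∇f(x̃), and the nonsmooth term is handled by its proximal operator. This is a minimal illustration of the general technique, not the paper's accelerated or noise-robust algorithms; all function names and parameter values are illustrative.

```python
import numpy as np

def soft_threshold(x, t):
    # Proximal operator of t * ||.||_1 (handles the nonsmooth composite term).
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def prox_svrg(A, b, lam=0.01, step=0.05, epochs=30, seed=0):
    # Minimize (1/n) sum_i 0.5*(a_i^T x - b_i)^2 + lam*||x||_1 with prox-SVRG.
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x_tilde = np.zeros(d)
    for _ in range(epochs):
        # Full gradient at the anchor point: the variance-reduction pivot.
        mu = A.T @ (A @ x_tilde - b) / n
        x = x_tilde.copy()
        for _ in range(n):
            i = rng.integers(n)
            # Variance-reduced gradient estimate: unbiased, and its variance
            # vanishes as x and x_tilde approach the optimum.
            g = A[i] * (A[i] @ x - b[i]) - A[i] * (A[i] @ x_tilde - b[i]) + mu
            # Proximal gradient step on the composite objective.
            x = soft_threshold(x - step * g, step * lam)
        x_tilde = x
    return x_tilde

# Usage: recover a sparse vector from noiseless linear measurements.
rng = np.random.default_rng(1)
A = rng.standard_normal((100, 5))
x_true = np.array([1.0, -2.0, 0.0, 0.0, 3.0])
b = A @ x_true
x_hat = prox_svrg(A, b)
```

Because the variance of the gradient estimate shrinks near the optimum, a constant step size suffices for linear convergence on strongly convex problems, in contrast to plain SGD, which needs decaying steps; this is the property the paper's estimate-sequence framework recovers in a unified way for SVRG, SAGA, and MISO-type methods.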
Pages: 52