Estimate Sequences for Stochastic Composite Optimization: Variance Reduction, Acceleration, and Robustness to Noise

Cited by: 0
Authors
Kulunchakov, Andrei [1 ]
Mairal, Julien [1 ]
Affiliations
[1] Univ Grenoble Alpes, INRIA, Grenoble INP, CNRS, LJK, F-38000 Grenoble, France
Funding
European Research Council
Keywords
convex optimization; variance reduction; stochastic optimization; approximation algorithms; gradient methods
DOI
Not available
Chinese Library Classification
TP [Automation Technology, Computer Technology]
Subject Classification Code
0812
Abstract
In this paper, we propose a unified view of gradient-based algorithms for stochastic convex composite optimization by extending the concept of estimate sequence introduced by Nesterov. More precisely, we interpret a large class of stochastic optimization methods as procedures that iteratively minimize a surrogate of the objective, which covers the stochastic gradient descent method and variants of the incremental approaches SAGA, SVRG, and MISO/Finito/SDCA. This point of view has several advantages: (i) we provide a simple generic proof of convergence for all of the aforementioned methods; (ii) we naturally obtain new algorithms with the same guarantees; (iii) we derive generic strategies to make these algorithms robust to stochastic noise, which is useful when data is corrupted by small random perturbations. Finally, we propose a new accelerated stochastic gradient descent algorithm and a new accelerated SVRG algorithm that is robust to stochastic noise.
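For context, the "estimate sequence" the abstract refers to is Nesterov's classical construction, which the paper extends to the stochastic composite setting. The following is a minimal LaTeX sketch of the standard textbook definition and the convergence guarantee it yields; the symbols $f$, $\phi_k$, $\lambda_k$, and $x_k$ follow Nesterov's conventions and are assumptions of this sketch, not taken from this record. A pair of sequences $(\{\phi_k\}_{k\ge 0}, \{\lambda_k\}_{k\ge 0})$ with $\lambda_k \ge 0$ is an estimate sequence of $f$ if $\lambda_k \to 0$ and, for all $x$,
\[
  \phi_k(x) \;\le\; (1-\lambda_k)\, f(x) \;+\; \lambda_k\, \phi_0(x).
\]
If the iterates additionally satisfy $f(x_k) \le \min_x \phi_k(x)$, then
\[
  f(x_k) - f^\star \;\le\; \lambda_k \bigl(\phi_0(x^\star) - f^\star\bigr) \;\longrightarrow\; 0,
\]
so each surrogate $\phi_k$ certifies a convergence rate governed by how fast $\lambda_k$ vanishes. The paper's unified view builds such surrogates from stochastic gradient estimates, recovering SGD, SAGA, SVRG, and MISO/Finito/SDCA-type updates as special cases.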
Pages: 52