Accelerated Stochastic Gradient Descent for Minimizing Finite Sums

Cited by: 0
Authors
Nitanda, Atsushi [1 ,2 ]
Affiliations
[1] Tokyo Inst Technol, Tokyo, Japan
[2] NTT DATA Math Syst Inc, Tokyo, Japan
Keywords
Dual coordinate ascent; Minimization
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
We propose an optimization method for minimizing finite sums of smooth convex functions. Our method combines accelerated gradient descent (AGD) with the stochastic variance reduced gradient (SVRG) in a mini-batch setting. An important feature of the method is that it can be directly applied to general convex problems and to optimal strongly convex problems, where optimal strong convexity is a weaker condition than strong convexity. We show that our method achieves a better overall complexity for general convex problems and linear convergence for optimal strongly convex problems. Moreover, we prove a fast iteration complexity for our method. Our experiments demonstrate its effectiveness.
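As a rough illustration of the kind of update the abstract describes, below is a minimal Python sketch of a mini-batch SVRG loop with a Nesterov-style extrapolation step. The function name accelerated_minibatch_svrg, the step-size and momentum parameters, and the update schedule are illustrative assumptions; this sketch does not reproduce the paper's exact algorithm or its convergence guarantees.

import numpy as np

def accelerated_minibatch_svrg(grad_i, x0, n, lr=0.1, momentum=0.9,
                               epochs=20, batch_size=16, rng=None):
    """Hypothetical sketch: mini-batch SVRG combined with a Nesterov-style
    momentum step (not the paper's exact method or parameter choices).

    grad_i(x, i) -- gradient of the i-th component function at x
    n            -- number of component functions in the finite sum
    """
    rng = rng or np.random.default_rng(0)
    x = x0.copy()
    y = x0.copy()                      # extrapolation (momentum) point
    for _ in range(epochs):
        snapshot = x.copy()            # reference point for variance reduction
        full_grad = np.mean([grad_i(snapshot, i) for i in range(n)], axis=0)
        for _ in range(max(1, n // batch_size)):
            batch = rng.integers(0, n, size=batch_size)
            # variance-reduced mini-batch gradient evaluated at the extrapolated point
            g = np.mean([grad_i(y, i) - grad_i(snapshot, i) for i in batch],
                        axis=0) + full_grad
            x_new = y - lr * g
            y = x_new + momentum * (x_new - x)   # Nesterov-style extrapolation
            x = x_new
    return x

# Example use on a least-squares finite sum with hypothetical data:
# A, b = np.random.randn(100, 5), np.random.randn(100)
# grad_i = lambda x, i: (A[i] @ x - b[i]) * A[i]
# x_opt = accelerated_minibatch_svrg(grad_i, np.zeros(5), n=100)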
Pages: 195-203
Number of pages: 9