50 entries in total
- [1] Minimizing finite sums with the stochastic average gradient [J]. Mathematical Programming, 2017, 162: 83-112
- [3] Conditional Accelerated Lazy Stochastic Gradient Descent [J]. International Conference on Machine Learning, Vol 70, 2017, 70
- [4] Accelerated and Unaccelerated Stochastic Gradient Descent in Model Generality [J]. Mathematical Notes, 2020, 108: 511-522
- [5] Scheduled Restart Momentum for Accelerated Stochastic Gradient Descent [J]. SIAM Journal on Imaging Sciences, 2022, 15(02): 738-761
- [7] Stochastic Gradient Descent with Finite Samples Sizes [J]. 2016 IEEE 26th International Workshop on Machine Learning for Signal Processing (MLSP), 2016
- [8] Online and Stochastic Universal Gradient Methods for Minimizing Regularized Hölder Continuous Finite Sums in Machine Learning [J]. Advances in Knowledge Discovery and Data Mining, Part I, 2015, 9077: 369-379
- [9] An Accelerated Decentralized Stochastic Proximal Algorithm for Finite Sums [J]. Advances in Neural Information Processing Systems 32 (NIPS 2019), 2019, 32
- [10] Lightweight Stochastic Optimization for Minimizing Finite Sums with Infinite Data [J]. International Conference on Machine Learning, Vol 80, 2018, 80