Fast Stochastic Variance Reduced ADMM for Stochastic Composition Optimization

Cited by: 0
Authors
Yu, Yue [1 ]
Huang, Longbo [1 ]
Affiliation
[1] Tsinghua Univ, Inst Interdisciplinary Informat Sci, Beijing, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
DOI
Not available
CLC Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104; 0812; 0835; 1405
Abstract
We consider the stochastic composition optimization problem proposed in [Wang et al., 2016a], which has applications ranging from estimation to statistical and machine learning. We propose the first ADMM-based algorithm, named com-SVR-ADMM, and show that com-SVR-ADMM converges linearly for strongly convex and Lipschitz smooth objectives. When the objective is convex and Lipschitz smooth, it achieves a convergence rate of O(log S / S), which improves upon the O(S^{-4/9}) rate in [Wang et al., 2016b]. Moreover, com-SVR-ADMM attains a rate of O(1/√S) when the objective is convex but not Lipschitz smooth. We also conduct experiments showing that it outperforms existing algorithms.
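For context, a minimal sketch of the problem class follows, assuming the standard two-level stochastic composition formulation of [Wang et al., 2016a] with a generic linearly constrained ADMM splitting; the regularizer r, the constraint data A, B, c, and the reading of S as the iteration counter are illustrative assumptions of this note, not the paper's exact setup:

    minimize_{x, y}   E_v [ f_v( E_w [ g_w(x) ] ) ] + r(y)
    subject to        A x + B y = c

Here f_v are the outer stochastic component functions, g_w the inner stochastic component maps, and r a convex regularizer handled by the ADMM splitting; the rates quoted in the abstract are stated in terms of S, understood here as the iteration count.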
Pages: 3364 - 3370
Number of pages: 7
Related Papers
50 records in total
  • [1] Accelerated Variance Reduced Stochastic ADMM
    Liu, Yuanyuan
    Shang, Fanhua
    Cheng, James
    [J]. THIRTY-FIRST AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2017, : 2287 - 2293
  • [2] Stochastic Variance Reduced Primal Dual Algorithms for Empirical Composition Optimization
    Devraj, Adithya M.
    Chen, Jianshu
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 32 (NIPS 2019), 2019, 32
  • [3] Variance Reduced Stochastic Optimization for PCA and PLS
    Min, Erxue
    Cui, Jianjing
    Long, Jun
    [J]. 2017 10TH INTERNATIONAL SYMPOSIUM ON COMPUTATIONAL INTELLIGENCE AND DESIGN (ISCID), VOL. 1, 2017, : 383 - 388
  • [4] A stochastic variance reduced gradient method with adaptive step for stochastic optimization
    Li, Jing
    Xue, Dan
    Liu, Lei
    Qi, Rulei
    [J]. OPTIMAL CONTROL APPLICATIONS & METHODS, 2024, 45 (03): : 1327 - 1342
  • [5] Loopless Variance Reduced Stochastic ADMM for Equality Constrained Problems in IoT Applications
    Liu, Yuanyuan
    Geng, Jiacheng
    Shang, Fanhua
    An, Weixin
    Liu, Hongying
    Zhu, Qi
    [J]. IEEE INTERNET OF THINGS JOURNAL, 2022, 9 (03): : 2293 - 2303
  • [6] Stochastic Variance Reduced Optimization for Nonconvex Sparse Learning
    Li, Xingguo
    Zhao, Tuo
    Arora, Raman
    Liu, Han
    Haupt, Jarvis
    [J]. INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 48, 2016, 48
  • [7] A Simple Stochastic Variance Reduced Algorithm with Fast Convergence Rates
    Zhou, Kaiwen
    Shang, Fanhua
    Cheng, James
    [J]. INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 80, 2018, 80
  • [8] Variance-Reduced Decentralized Stochastic Optimization With Accelerated Convergence
    Xin, Ran
    Khan, Usman A.
    Kar, Soummya
    [J]. IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2020, 68 : 6255 - 6271
  • [9] Guaranteed Sufficient Decrease for Stochastic Variance Reduced Gradient Optimization
    Shang, Fanhua
    Liu, Yuanyuan
    Zhou, Kaiwen
    Cheng, James
    Ng, Kelvin K. W.
    Yoshida, Yuichi
    [J]. INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 84, 2018, 84
  • [10] Variance-Reduced and Projection-Free Stochastic Optimization
    Hazan, Elad
    Luo, Haipeng
    [J]. INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 48, 2016, 48