Stochastic Optimization with Importance Sampling for Regularized Loss Minimization

Times Cited: 0
Authors
Zhao, Peilin [1 ,2 ,3 ]
Zhang, Tong [2 ,3 ]
Affiliations
[1] ASTAR, Inst Infocomm Res, Data Analyt Dept, Singapore, Singapore
[2] Rutgers State Univ, Dept Stat & Biostat, Piscataway, NJ 08854 USA
[3] Baidu Res, Big Data Lab, Beijing, Peoples R China
Source
INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 37 | 2015 / Vol. 37
Funding
U.S. National Science Foundation;
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104; 0812; 0835; 1405;
Abstract
Uniform sampling of training data has been commonly used in traditional stochastic optimization algorithms such as Proximal Stochastic Mirror Descent (prox-SMD) and Proximal Stochastic Dual Coordinate Ascent (prox-SDCA). Although uniform sampling can guarantee that the sampled stochastic quantity is an unbiased estimate of the corresponding true quantity, the resulting estimator may have a rather high variance, which negatively affects the convergence of the underlying optimization procedure. In this paper we study stochastic optimization, including prox-SMD and prox-SDCA, with importance sampling, which improves the convergence rate by reducing the stochastic variance. We theoretically analyze the algorithms and empirically validate their effectiveness.
Pages: 1-9
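
To make the idea in the abstract concrete, below is a minimal sketch of importance sampling in plain SGD on an L2-regularized least-squares problem. It is not the paper's prox-SMD or prox-SDCA update: the objective, the sampling distribution (p_i proportional to ||x_i||^2, a per-example smoothness proxy), and every name in the snippet (X, y, lam, eta) are assumptions made for illustration. Each sampled gradient is rescaled by 1/(n*p_i), which keeps the estimate of the average-loss gradient unbiased while drawing the "harder" examples more often.

```python
import numpy as np

# Hypothetical illustration: SGD with importance sampling on an
# L2-regularized least-squares objective
#   F(w) = (1/n) * sum_i 0.5*(x_i.w - y_i)^2 + (lam/2)*||w||^2.
# This is NOT the paper's prox-SMD/prox-SDCA update; it only shows the
# sampling-and-reweighting mechanism.

rng = np.random.default_rng(0)
n, d = 1000, 20
scales = rng.uniform(0.1, 5.0, size=(n, 1))   # make example norms very uneven
X = scales * rng.normal(size=(n, d))
y = X @ rng.normal(size=d) + 0.1 * rng.normal(size=n)
lam = 0.01                                    # regularization strength (assumed)

row_sq = np.sum(X**2, axis=1)                 # per-example smoothness proxy L_i
p = row_sq / row_sq.sum()                     # importance distribution, p_i prop. to L_i

w = np.zeros(d)
eta = 0.5 / row_sq.mean()                     # step size tied to the *average* L_i
for _ in range(20000):
    i = rng.choice(n, p=p)
    # Gradient of f_i, rescaled by 1/(n*p_i) so E[g] = (1/n) * sum_j grad f_j(w)
    g = (X[i] @ w - y[i]) * X[i] / (n * p[i])
    w -= eta * (g + lam * w)                  # regularizer handled as a plain gradient term
```

With p_i proportional to the per-example smoothness constants, every rescaled gradient ends up with the same effective Lipschitz constant (the average one, rather than the worst-case constant that governs uniform sampling), which is roughly the variance-reduction mechanism the paper formalizes for prox-SMD and prox-SDCA.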