Distributed Primal-Dual Proximal Method for Regularized Empirical Risk Minimization

Cited by: 0
Author
Khuzani, Masoud Badiei [1]
Affiliation
[1] Harvard University, John A. Paulson School of Engineering and Applied Sciences, Cambridge, MA 02138, USA
Keywords
Empirical risk; Distributed optimization; Primal-dual method; Optimization
DOI
10.1109/ICMLA.2018.00152
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Most high-dimensional estimation and classification methods minimize a loss function (the empirical risk) that is the sum of losses associated with each observed data point. We consider the special case of binary classification, where each loss is a function of the inner product of a feature vector and a weight vector. For this class of classification tasks, the empirical risk minimization problem can be recast as a minimax optimization problem that has a unique saddle point when the losses are smooth functions. We propose a distributed proximal primal-dual method to solve this minimax problem and prove its convergence to the unique saddle point. The convergence analysis relies on a novel treatment of the consensus terms that accounts for the non-Euclidean geometry of the parameter space. We also numerically verify the convergence of the proposed algorithm for logistic regression on Erdős-Rényi random graphs and lattices.
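The record does not spell out the minimax recast mentioned in the abstract. As a hedged illustration of the standard construction, assuming a linear predictor, a smooth closed convex per-sample loss \phi, and a convex regularizer r (generic placeholders, not necessarily the paper's notation), the Fenchel-conjugate reformulation of the regularized empirical risk reads:

\[
\min_{w \in \mathbb{R}^d} \; \frac{1}{n} \sum_{i=1}^{n} \phi\bigl(y_i \, x_i^{\top} w\bigr) + \lambda \, r(w)
\;=\;
\min_{w \in \mathbb{R}^d} \; \max_{\alpha \in \mathbb{R}^n} \; \frac{1}{n} \sum_{i=1}^{n} \Bigl( \alpha_i \, y_i \, x_i^{\top} w - \phi^{*}(\alpha_i) \Bigr) + \lambda \, r(w),
\qquad
\phi^{*}(\alpha) := \sup_{z \in \mathbb{R}} \bigl\{ \alpha z - \phi(z) \bigr\}.
\]

The equality follows from biconjugation of the closed convex loss, and smoothness of \phi makes its conjugate \phi^{*} strongly convex on its domain, so the inner maximizer is unique; this is consistent with the abstract's claim that the saddle point is unique when the losses are smooth.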
Pages: 938-945 (8 pages)
Related Papers
Showing 10 of 50 entries
  • [1] Zhang, Yuchen; Xiao, Lin. Stochastic Primal-Dual Coordinate Method for Regularized Empirical Risk Minimization. International Conference on Machine Learning, Vol. 37, 2015: 353-361.
  • [2] Zhang, Yuchen; Xiao, Lin. Stochastic Primal-Dual Coordinate Method for Regularized Empirical Risk Minimization. Journal of Machine Learning Research, 2017, 18.
  • [3] Badiei, Masoud; Li, Na. Distributed Regularized Primal-Dual Method. 2016 IEEE Global Conference on Signal and Information Processing (GlobalSIP), 2016: 540-544.
  • [4] Yuan, Deming; Ho, Daniel W. C.; Xu, Shengyuan. Regularized Primal-Dual Subgradient Method for Distributed Constrained Optimization. IEEE Transactions on Cybernetics, 2016, 46(9): 2109-2118.
  • [5] Kouri, Drew P.; Surowiec, Thomas M. A primal-dual algorithm for risk minimization. Mathematical Programming, 2022, 193(1): 337-363.
  • [6] Lei, Qi; Yen, Ian E. H.; Wu, Chao-yuan; Dhillon, Inderjit S.; Ravikumar, Pradeep. Doubly Greedy Primal-Dual Coordinate Descent for Sparse Empirical Risk Minimization. International Conference on Machine Learning, Vol. 70, 2017.
  • [7] Qiao, Linbo; Lin, Tianyi; Jiang, Yu-Gang; Yang, Fan; Liu, Wei; Lu, Xicheng. On Stochastic Primal-Dual Hybrid Gradient Approach for Compositely Regularized Minimization. ECAI 2016: 22nd European Conference on Artificial Intelligence, 2016, 285: 167-174.
  • [8] Latafat, Puya; Stella, Lorenzo; Patrinos, Panagiotis. New Primal-Dual Proximal Algorithm for Distributed Optimization. 2016 IEEE 55th Conference on Decision and Control (CDC), 2016: 1959-1964.
  • [9] Tan, Conghui; Zhang, Tong; Ma, Shiqian; Liu, Ji. Stochastic Primal-Dual Method for Empirical Risk Minimization with O(1) Per-Iteration Complexity. Advances in Neural Information Processing Systems 31 (NIPS 2018), 2018, 31.
  • [10] Lin, Tianyi; Qiao, Linbo; Zhang, Teng; Feng, Jiashi; Zhang, Bofeng. Stochastic Primal-Dual Proximal ExtraGradient descent for compositely regularized optimization. Neurocomputing, 2018, 273: 516-525.