An asynchronous subgradient-proximal method for solving additive convex optimization problems

Cited by: 0
Authors
Tipsuda Arunrat
Sakrapee Namsak
Nimit Nimana
Affiliations
[1] Khon Kaen University,Department of Mathematics, Faculty of Science
[2] Vidyasirimedhi Institute of Science and Technology (VISTEC),School of Information Science and Technology
Keywords
Convex optimization; Subgradient-proximal method; Delay; Convergence analysis
Mathematics Subject Classification: 65K05; 65K10; 90C06; 90C25
DOI
Not available
Abstract
In this paper, we consider additive convex optimization problems in which the objective function is the sum of a large number of convex nondifferentiable cost functions. We assume that each cost function is itself the sum of two convex nondifferentiable functions, one of which is well suited to the subgradient method while the other is not. To this end, we propose a distributed optimization algorithm based on the subgradient and proximal methods. The proposed method also incorporates an asynchronous feature that allows time-varying delays when computing the subgradients. We prove convergence of the function values of the iterates to the optimal value. To demonstrate the efficiency of the presented theoretical result, we investigate the binary classification problem via support vector machine learning.
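To illustrate the kind of splitting the abstract describes, the following is a minimal sketch (not the authors' exact asynchronous update) of a synchronous subgradient-proximal iteration for the SVM-style application mentioned above: a subgradient step on a nondifferentiable hinge loss, followed by a proximal step on a nondifferentiable ℓ1 regularizer, whose proximal operator has a closed form. All function names and the choice of step size are illustrative assumptions.

```python
import numpy as np

def subgrad_hinge(w, X, y):
    # A subgradient of the (nondifferentiable) hinge loss
    # sum_i max(0, 1 - y_i * x_i^T w): active points contribute -y_i x_i.
    margins = 1.0 - y * (X @ w)
    active = (margins > 0).astype(float)
    return -(X.T @ (active * y))

def prox_l1(v, t):
    # Proximal operator of t * ||.||_1 (soft-thresholding), known in closed form.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def subgradient_proximal(X, y, lam=0.1, n_iter=200):
    # Illustrative variant: forward subgradient step on the hinge-loss part,
    # then a proximal (backward) step on the lam * ||w||_1 part,
    # with a diminishing step size as is typical for subgradient methods.
    w = np.zeros(X.shape[1])
    for k in range(1, n_iter + 1):
        alpha = 1.0 / (k + 1)
        w = w - alpha * subgrad_hinge(w, X, y)
        w = prox_l1(w, alpha * lam)
    return w
```

The paper's method additionally distributes the per-function updates and tolerates time-varying delays in the subgradient evaluations; the sketch above omits both features for clarity.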
Pages: 3911–3936 (25 pages)