Hybrid Random/Deterministic Parallel Algorithms for Convex and Nonconvex Big Data Optimization

Cited by: 48
Authors
Daneshmand, Amir [1 ]
Facchinei, Francisco [2 ]
Kungurtsev, Vyacheslav [3 ]
Scutari, Gesualdo [4 ]
Affiliations
[1] SUNY Buffalo, Dept Elect Engn, Buffalo, NY 14228 USA
[2] Univ Roma La Sapienza, Dept Comp Control & Management Engineering, I-00185 Rome, Italy
[3] Czech Tech Univ, Fac Elect Engn, Dept Comp Sci, Agent Technol Ctr, Prague 16636, Czech Republic
[4] Purdue Univ, Sch Ind Engn, W Lafayette, IN 47907 USA
Funding
National Science Foundation (USA)
Keywords
Jacobi method; nonconvex problems; parallel and distributed methods; random selections; sparse solution; COORDINATE DESCENT ALGORITHM; MINIMIZATION; CONVERGENCE; SHRINKAGE;
DOI
10.1109/TSP.2015.2436357
Chinese Library Classification
TM [Electrical Engineering]; TN [Electronics & Communication Technology]
Subject Classification Codes
0808; 0809
Abstract
We propose a decomposition framework for the parallel optimization of the sum of a differentiable (possibly nonconvex) function and a nonsmooth (possibly nonseparable), convex one. The latter term is usually employed to enforce structure in the solution, typically sparsity. The main contribution of this work is a novel parallel, hybrid random/deterministic decomposition scheme wherein, at each iteration, a subset of (block) variables is updated at the same time by minimizing a convex surrogate of the original nonconvex function. To tackle huge-scale problems, the (block) variables to be updated are chosen according to a mixed random and deterministic procedure, which captures the advantages of both pure deterministic and random update-based schemes. Almost sure convergence of the proposed scheme is established. Numerical results show that on huge-scale problems the proposed hybrid random/deterministic algorithm compares favorably to random and deterministic schemes on both convex and nonconvex problems.
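The abstract's core idea — at each iteration, sample a subset of blocks at random, then deterministically keep the most promising ones and update them in parallel via a convex surrogate — can be sketched for a concrete instance. The sketch below is not the paper's exact algorithm; it assumes a LASSO-type objective (smooth least squares plus an l1 sparsity term), uses a proximal quadratic upper bound as the convex surrogate, and ranks the sampled blocks by the norm of their prox-gradient residual as the deterministic criterion. The names `hybrid_block_lasso` and `soft_threshold` are illustrative, not from the paper.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t*||.||_1, i.e. the block update induced by
    # the l1 sparsity term in the composite objective.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def hybrid_block_lasso(A, b, lam, n_blocks=20, sample_frac=0.2,
                       greedy_frac=0.5, iters=200, seed=0):
    """Illustrative hybrid random/deterministic block scheme for
    min_x 0.5*||A x - b||^2 + lam*||x||_1.

    Each iteration: (1) random phase - sample a subset of blocks;
    (2) deterministic phase - keep the sampled blocks with the largest
    prox-gradient residual; (3) update the kept blocks simultaneously
    (Jacobi-style) by minimizing a separable convex surrogate, which
    here reduces to a proximal-gradient step per block."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1/L, L = Lipschitz const of grad f
    blocks = np.array_split(np.arange(n), n_blocks)
    x = np.zeros(n)
    for _ in range(iters):
        grad = A.T @ (A @ x - b)
        # Random phase: sample a fraction of the blocks.
        cand = rng.choice(n_blocks, size=max(1, int(sample_frac * n_blocks)),
                          replace=False)
        # Deterministic phase: score each sampled block by how much the
        # proximal step would move it, and keep the top fraction.
        scores = []
        for k in cand:
            idx = blocks[k]
            r = soft_threshold(x[idx] - step * grad[idx], step * lam) - x[idx]
            scores.append(np.linalg.norm(r))
        keep = cand[np.argsort(scores)[::-1][:max(1, int(greedy_frac * len(cand)))]]
        # Parallel (Jacobi) update of the selected blocks; all use the
        # gradient evaluated at the current iterate. Serial loop here,
        # but each block update is independent and could run in parallel.
        for k in keep:
            idx = blocks[k]
            x[idx] = soft_threshold(x[idx] - step * grad[idx], step * lam)
    return x
```

With the conservative 1/L step, each block update minimizes a separable majorant of the objective, so the selected-block updates are safe to apply simultaneously; the mixed selection lets the deterministic criterion focus effort while the random sampling keeps its per-iteration cost low, which is the trade-off the abstract highlights.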
Pages: 3914 - 3929
Page count: 16
Related Papers (50 total)
  • [31] Efficient random coordinate descent algorithms for large-scale structured nonconvex optimization
    Andrei Patrascu
    Ion Necoara
    Journal of Global Optimization, 2015, 61 : 19 - 46
  • [32] Parallel knowledge acquisition algorithms for big data using MapReduce
    Jin Qian
    Min Xia
    Xiaodong Yue
    International Journal of Machine Learning and Cybernetics, 2018, 9 : 1007 - 1021
  • [33] Parallel Computing Algorithms for Big Data Frequent Pattern Mining
Shaik, Subhani
    Devarakonda, Nagaraju
    Nagamani, Ch.
    PROCEEDINGS OF INTERNATIONAL CONFERENCE ON COMPUTATIONAL INTELLIGENCE AND DATA ENGINEERING, 2018, 9 : 113 - 123
  • [35] Parallel coordinate descent methods for big data optimization
    Peter Richtárik
    Martin Takáč
    Mathematical Programming, 2016, 156 : 433 - 484
  • [37] Convex and Nonconvex Optimization Are Both Minimax-Optimal for Noisy Blind Deconvolution Under Random Designs
    Chen, Yuxin
    Fan, Jianqing
    Wang, Bingyan
    Yan, Yuling
    JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION, 2023, 118 (542) : 858 - 868
  • [38] A hybrid approach for MRF optimization problems: Combination of stochastic sampling and deterministic algorithms
    Kim, Wonsik
    Lee, Kyoung Mu
    COMPUTER VISION AND IMAGE UNDERSTANDING, 2011, 115 (12) : 1623 - 1637
  • [39] Big data regression with parallel enhanced and convex incremental extreme learning machines
    Kokkinos, Yiannis
    Margaritis, Konstantinos G.
    COMPUTATIONAL INTELLIGENCE, 2018, 34 (03) : 875 - 894
  • [40] Parallel and Distributed Machine Learning Algorithms for Scalable Big Data Analytics
    Bal, Henri
    Pal, Arindam
    FUTURE GENERATION COMPUTER SYSTEMS-THE INTERNATIONAL JOURNAL OF ESCIENCE, 2020, 108 : 1159 - 1161