Distributed Parallel Sparse Multinomial Logistic Regression

Cited by: 2
Authors
Lei, Dajiang [1 ]
Du, Meng [2 ]
Chen, Hao [1 ]
Li, Zhixing [1 ]
Wu, Yu [3 ]
Affiliations
[1] Chongqing Univ Posts & Telecommun, Coll Comp, Chongqing 400065, Peoples R China
[2] Chongqing Univ Posts & Telecommun, Coll Software Engn, Chongqing 400065, Peoples R China
[3] Chongqing Univ Posts & Telecommun, Inst Web Intelligence, Chongqing 400065, Peoples R China
Keywords
Alternating Direction Method of Multipliers; big data; distributed parallel; sparse multinomial logistic regression; TASK GRAPHS; ALGORITHM;
DOI
10.1109/ACCESS.2019.2913280
Chinese Library Classification (CLC): TP [Automation & Computer Technology]
Discipline code: 0812
Abstract
Sparse Multinomial Logistic Regression (SMLR) is widely used in fields such as image classification and multi-class object recognition, because it embeds feature selection in the classification process. However, it cannot meet the time and memory requirements of processing large-scale data. We reinvestigate the classification accuracy and running efficiency of solving SMLR problems with the Alternating Direction Method of Multipliers (ADMM), which we call the fast SMLR (FSMLR) algorithm in this paper. By reformulating the FSMLR optimization problem, we transform the serial convex optimization problem into distributed convex optimization problems, namely the global consensus problem and the sharing problem. Based on these distributed formulations, we propose two distributed parallel SMLR algorithms, sample-partitioning-based distributed SMLR (SP-SMLR) and feature-partitioning-based distributed SMLR (FP-SMLR), for large-sample and large-feature datasets in big data scenarios, respectively. The experimental results show that the FSMLR algorithm achieves higher accuracy than the original SMLR algorithm, and the big data experiments show that our distributed parallel SMLR algorithms scale to massive samples and large numbers of features with high precision. In summary, the proposed serial and distributed SMLR algorithms outperform state-of-the-art algorithms.
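The sample-partitioned consensus formulation described in the abstract can be sketched as follows. This is a minimal, single-process illustration under stated assumptions, not the authors' implementation: the function name `consensus_smlr` and its parameters are hypothetical, the local subproblems are solved approximately by gradient steps (the paper's exact inner solver is not specified here), and the workers' x-updates are simulated sequentially rather than run on a cluster.

```python
import numpy as np

def soft_threshold(v, k):
    # Proximal operator of the l1 norm; this is what produces sparsity in SMLR.
    return np.sign(v) * np.maximum(np.abs(v) - k, 0.0)

def local_grad(W, X, Y):
    # Gradient of the multinomial logistic loss on one data block
    # (Y is one-hot encoded, shape (n_samples, n_classes)).
    Z = X @ W
    Z -= Z.max(axis=1, keepdims=True)              # stabilize softmax
    P = np.exp(Z)
    P /= P.sum(axis=1, keepdims=True)
    return X.T @ (P - Y) / len(X)

def consensus_smlr(blocks, n_feat, n_cls, lam=0.01, rho=1.0,
                   admm_iters=40, inner_iters=25, lr=0.2):
    """Sketch of sample-partitioned consensus ADMM for sparse multinomial
    logistic regression (an SP-SMLR-style scheme; names are illustrative)."""
    Ws = [np.zeros((n_feat, n_cls)) for _ in blocks]   # local weights
    Us = [np.zeros((n_feat, n_cls)) for _ in blocks]   # scaled dual variables
    Z = np.zeros((n_feat, n_cls))                      # global consensus weights
    for _ in range(admm_iters):
        # x-update: each block minimizes its local loss plus the proximal
        # term (rho/2)||W_i - Z + U_i||^2; in a real deployment this loop
        # runs in parallel on the workers.
        for i, (X, Y) in enumerate(blocks):
            for _ in range(inner_iters):
                g = local_grad(Ws[i], X, Y) + rho * (Ws[i] - Z + Us[i])
                Ws[i] -= lr * g
        # z-update: averaging plus soft-thresholding enforces a single
        # sparse solution shared by all blocks.
        Z = soft_threshold(np.mean([W + U for W, U in zip(Ws, Us)], axis=0),
                           lam / (rho * len(blocks)))
        # u-update: scaled dual ascent on the consensus constraint W_i = Z.
        for i in range(len(blocks)):
            Us[i] += Ws[i] - Z
    return Z
```

The feature-partitioned variant (FP-SMLR) would instead follow ADMM's sharing form, where each worker holds a column slice of the feature matrix and the coordination step aggregates the partial linear predictors rather than the weights.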
Pages: 55496-55508 (13 pages)