Greedy Projected Gradient-Newton Method for Sparse Logistic Regression

Cited by: 26
Authors
Wang, Rui [1]
Xiu, Naihua [1]
Zhang, Chao [1]
Affiliation
[1] Beijing Jiaotong Univ, Dept Appl Math, Beijing 100044, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Convergence analysis; greedy projected gradient-Newton (GPGN) algorithm; model analysis; numerical experiment; sparse logistic regression (SLR); VARIABLE SELECTION; OPTIMALITY CONDITIONS; GENE SELECTION; ALGORITHM; FRAMEWORK; PURSUIT; REGULARIZATION; APPROXIMATION; MODELS;
DOI
10.1109/TNNLS.2019.2905261
CLC classification number
TP18 [Artificial Intelligence Theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Sparse logistic regression (SLR) is the classical logistic regression model with sparsity constraints; it is widely used for classification and feature selection in many fields, such as neural networks, deep learning, and bioinformatics. In this paper, we analyze the existence and uniqueness of the solution to the SLR, and we propose a greedy projected gradient-Newton (GPGN) method for solving it. The GPGN method combines the projected gradient method with the Newton method. The following characteristics show that the GPGN method achieves not only elegant theoretical results but also remarkable numerical performance in solving the SLR: 1) the full iterative sequence generated by the GPGN method converges to a global/local minimizer of the SLR under weaker conditions; 2) the GPGN method has the properties of finite identification of an optimal support set and local quadratic convergence; and 3) the GPGN method achieves higher accuracy and speed than a number of state-of-the-art solvers in numerical experiments.
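The abstract describes the GPGN iteration only at a high level. As a rough illustration of the general pattern it refers to (a hard-thresholding projected gradient step that selects a support of prescribed size, followed by a Newton refinement restricted to that support), the NumPy sketch below may help; it is not the authors' implementation, and the sparsity level s, step size eta, damping term ridge, and all function names are illustrative assumptions rather than quantities taken from the paper.

```python
# Minimal sketch (not the paper's algorithm) of a projected gradient step plus a
# restricted Newton refinement for sparsity-constrained logistic regression:
#     min_w (1/n) * sum_i log(1 + exp(-y_i * x_i^T w))   s.t.  ||w||_0 <= s.
import numpy as np

def sigmoid(t):
    return 1.0 / (1.0 + np.exp(-t))

def logistic_grad_hess(X, y, w):
    """Gradient and Hessian of the averaged logistic loss, labels y in {-1, +1}."""
    n = X.shape[0]
    p = sigmoid(-y * (X @ w))          # p_i = sigma(-y_i * x_i^T w)
    grad = -(X.T @ (y * p)) / n
    weights = p * (1.0 - p)            # per-sample curvature weights
    hess = (X.T * weights) @ X / n
    return grad, hess

def gpgn_sketch(X, y, s, eta=0.5, ridge=1e-8, max_iter=100, tol=1e-8):
    """Hard-thresholded gradient step onto {w : ||w||_0 <= s}, then a damped
    Newton step on the selected support (illustrative parameter choices)."""
    dim = X.shape[1]
    w = np.zeros(dim)
    for _ in range(max_iter):
        grad, _ = logistic_grad_hess(X, y, w)
        # Gradient step, then keep only the s largest-magnitude entries.
        v = w - eta * grad
        support = np.argsort(np.abs(v))[-s:]
        w_new = np.zeros(dim)
        w_new[support] = v[support]
        # Newton refinement restricted to the selected support.
        grad_s, hess = logistic_grad_hess(X, y, w_new)
        H_ss = hess[np.ix_(support, support)] + ridge * np.eye(s)
        w_new[support] -= np.linalg.solve(H_ss, grad_s[support])
        if np.linalg.norm(w_new - w) <= tol:
            return w_new
        w = w_new
    return w

# Example use on synthetic data with a 5-sparse ground truth.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 50))
w_true = np.zeros(50)
w_true[:5] = 2.0
y = np.sign(X @ w_true + 0.1 * rng.standard_normal(200))
w_hat = gpgn_sketch(X, y, s=5)
```

In this sketch the support is re-selected from the full gradient step at every iteration; once the selected support stops changing, the update reduces to Newton's method on a fixed low-dimensional subproblem, which is consistent with the finite support identification and local quadratic convergence described in the abstract.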
Pages: 527-538
Page count: 12