Feature screening strategy for non-convex sparse logistic regression with log sum penalty

Cited: 6
|
Authors
Yuan, Min [1 ]
Xu, Yitian [2 ]
Affiliations
[1] China Agr Univ, Coll Informat & Elect Engn, Beijing 100083, Peoples R China
[2] China Agr Univ, Coll Sci, Beijing 100083, Peoples R China
Funding
National Natural Science Foundation of China; Beijing Natural Science Foundation;
Keywords
Non-convex; Majorization minimization; Log sum penalty; Strong global concavity bound; VARIABLE SELECTION;
DOI
10.1016/j.ins.2022.12.105
CLC classification
TP [Automation Technology, Computer Technology];
Discipline code
0812;
Abstract
L1 logistic regression is an efficient classification algorithm. Leveraging the convexity of the model and the sparsity induced by the L1 norm, techniques known as safe screening rules can accelerate solvers on high-dimensional data. Recently, the Log Sum Penalty (LSP) has been shown to provide better theoretical guarantees for identifying relevant variables, and sparse logistic regression with LSP has been proposed. However, because this model is non-convex, the training process is time-consuming, and existing safe screening rules cannot be directly applied to accelerate it. To address this issue, we construct a novel method, based on the iterative majorization minimization (MM) principle, that effectively saves training time. We first design a feature screening strategy for the inner solver and then build another rule to propagate screened features between MM iterations. We then introduce a modified feature screening strategy that further accelerates computation: by reconstructing the strong global concavity bound, it obtains a smaller safe region. Moreover, our rules can be applied to other non-convex cases. Experiments on nine benchmark datasets verify the effectiveness and safety of our algorithm. © 2022 Elsevier Inc. All rights reserved.
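The MM scheme the abstract describes can be illustrated with a minimal sketch (this is not the authors' implementation, and all function names, the step-size choice, and parameter values below are illustrative assumptions): the LSP term lam * sum(log(1 + |w_j|/eps)) is majorized at the current iterate by its tangent line, so each outer MM step reduces to a weighted-L1 logistic regression — the convex inner problem to which safe screening rules can then be applied.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def weighted_l1_logreg(X, y, weights, w0, step, n_iter=500):
    """Proximal gradient for the convex MM surrogate:
    logistic loss + sum_j weights[j] * |w_j| (per-feature L1)."""
    w = w0.copy()
    for _ in range(n_iter):
        grad = X.T @ (sigmoid(X @ w) - y) / len(y)
        z = w - step * grad
        # Soft-thresholding: proximal operator of the weighted L1 norm.
        w = np.sign(z) * np.maximum(np.abs(z) - step * weights, 0.0)
    return w

def lsp_logreg_mm(X, y, lam=0.1, eps=0.5, outer=10):
    """MM for LSP sparse logistic regression: the concave penalty
    lam * log(1 + |w_j|/eps) is majorized at w^k by its tangent,
    yielding per-feature L1 weights lam / (eps + |w_j^k|)."""
    n, d = X.shape
    w = np.zeros(d)
    # Step = 1/L, with L = ||X||_2^2 / (4n) the Lipschitz constant
    # of the logistic-loss gradient.
    step = 4.0 * n / (np.linalg.norm(X, 2) ** 2)
    for _ in range(outer):
        weights = lam / (eps + np.abs(w))  # reweighting from the tangent
        w = weighted_l1_logreg(X, y, weights, w, step)
    return w
```

Because the weights shrink for large coefficients and stay at lam/eps for zero ones, successive surrogates penalize already-active features less, mimicking the LSP's behavior; a safe screening rule would discard features inside the inner weighted-L1 solve, which is what makes the paper's propagation between MM iterations non-trivial.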
Pages: 732-747
Number of pages: 16
Related papers
50 in total
  • [31] Dual-graph with non-convex sparse regularization for multi-label feature selection
    Sun, Zhenzhen
    Xie, Hao
    Liu, Jinghua
    Gou, Jin
    Yu, Yuanlong
    APPLIED INTELLIGENCE, 2023, 53 (18) : 21227 - 21247
  • [32] Zero-Norm ELM with Non-convex Quadratic Loss Function for Sparse and Robust Regression
    Xiaoxue Wang
    Kuaini Wang
    Yanhong She
    Jinde Cao
    Neural Processing Letters, 2023, 55 : 12367 - 12399
  • [34] Rolling bearing fault feature extraction using non-convex periodic group sparse method
    Hai, Bin
    Jiang, Hongkai
    Yao, Pei
    Wang, Kaibo
    Yao, Renhe
    MEASUREMENT SCIENCE AND TECHNOLOGY, 2021, 32 (10)
  • [36] A penalty-based aggregation operator for non-convex intervals
    Beliakov, Gleb
    James, Simon
    KNOWLEDGE-BASED SYSTEMS, 2014, 70 : 335 - 344
  • [37] Clustering by Orthogonal NMF Model and Non-Convex Penalty Optimization
    Wang, Shuai
    Chang, Tsung-Hui
    Cui, Ying
    Pang, Jong-Shi
    IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2021, 69 : 5273 - 5288
  • [38] Beam Orientation Optimization with Non-Convex Group Sparsity Penalty
    O'Connor, D.
    Nguyen, D.
    Ruan, D.
    Yu, V.
    Sheng, K.
    MEDICAL PHYSICS, 2017, 44 (06) : 3225 - 3225
  • [39] Sparse recovery by non-convex optimization - instance optimality
    Saab, Rayan
    Yilmaz, Oezguer
    APPLIED AND COMPUTATIONAL HARMONIC ANALYSIS, 2010, 29 (01) : 30 - 48
  • [40] Fast Sparse Recovery via Non-Convex Optimization
    Chen, Laming
    Gu, Yuantao
    2015 IEEE GLOBAL CONFERENCE ON SIGNAL AND INFORMATION PROCESSING (GLOBALSIP), 2015, : 1275 - 1279