Tuning parameter calibration for l1-regularized logistic regression

Cited by: 10
Authors
Li, Wei [1 ]
Lederer, Johannes [2 ,3 ]
Affiliations
[1] Peking Univ, Sch Math Sci, Beijing, Peoples R China
[2] Univ Washington, Dept Stat, Seattle, WA 98195 USA
[3] Univ Washington, Dept Biostat, Seattle, WA 98195 USA
Keywords
Feature selection; Penalized logistic regression; Tuning parameter calibration; VARIABLE SELECTION; MODEL SELECTION; LASSO; CLASSIFICATION; PREDICTION;
DOI
10.1016/j.jspi.2019.01.006
CLC classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics];
Subject classification codes
020208 ; 070103 ; 0714 ;
Abstract
Feature selection is a standard approach to understanding and modeling high-dimensional classification data, but the corresponding statistical methods hinge on tuning parameters that are difficult to calibrate. In particular, existing calibration schemes in the logistic regression framework lack any finite sample guarantees. In this paper, we introduce a novel calibration scheme for l1-penalized logistic regression. It is based on simple tests along the tuning parameter path and is equipped with optimal guarantees for feature selection. It is also amenable to easy and efficient implementations, and it rivals or outmatches existing methods in simulations and real data applications. (C) 2019 Elsevier B.V. All rights reserved.
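To make the setting concrete: the abstract refers to estimates computed along the tuning parameter path of l1-penalized logistic regression. The sketch below (numpy only, illustrative; it implements a generic proximal-gradient/ISTA path solver, not the paper's specific calibration tests, and all variable names are assumptions) shows how sparsity evolves as the tuning parameter decreases along such a path.

```python
import numpy as np

def soft_threshold(z, t):
    """Proximal operator of the l1 norm: shrink toward zero by t."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def l1_logistic_path(X, y, lambdas, n_iter=500):
    """ISTA solver for l1-penalized logistic regression, warm-started
    along a decreasing grid of tuning parameters."""
    n, p = X.shape
    # Step size = 1/L, where L = ||X||_2^2 / (4n) bounds the Lipschitz
    # constant of the gradient of the mean logistic loss.
    step = 4.0 * n / (np.linalg.norm(X, 2) ** 2)
    beta = np.zeros(p)
    path = []
    for lam in lambdas:
        for _ in range(n_iter):
            mu = 1.0 / (1.0 + np.exp(-X @ beta))   # fitted probabilities
            grad = X.T @ (mu - y) / n              # gradient of mean log-loss
            beta = soft_threshold(beta - step * grad, step * lam)
        path.append(beta.copy())
    return np.array(path)

# Toy data: 3 informative features out of 20.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 20))
beta_true = np.zeros(20)
beta_true[:3] = 2.0
y = (rng.random(200) < 1.0 / (1.0 + np.exp(-X @ beta_true))).astype(float)

lambdas = np.geomspace(0.5, 0.01, 10)  # decreasing grid, largest first
path = l1_logistic_path(X, y, lambdas)
print([int((b != 0).sum()) for b in path])  # selected-feature counts per lambda
```

A calibration scheme such as the one in the paper would then pick one point on this path; the code above only produces the path itself, on which any such selection rule operates.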
Pages: 80-98 (19 pages)
Related papers (50 total)
  • [21] GTB-PPI: Predict Protein-protein Interactions Based on L1-regularized Logistic Regression and Gradient Tree Boosting
    Yu, Bin
    Chen, Cheng
    Zhou, Hongyan
    Liu, Bingqiang
    Ma, Qin
    GENOMICS PROTEOMICS & BIOINFORMATICS, 2020, 18 (05) : 582 - 592
  • [22] A pseudo-heuristic parameter selection rule for l1-regularized minimization problems
    Li, Chong-Jun
    Zhong, Yi-Jun
    JOURNAL OF COMPUTATIONAL AND APPLIED MATHEMATICS, 2018, 333 : 1 - 19
  • [23] Logistic regression for parameter tuning on an evolutionary algorithm
    Ramos, ICO
    Goldbarg, MC
    Goldbarg, EG
    Neto, ADD
    2005 IEEE CONGRESS ON EVOLUTIONARY COMPUTATION, VOLS 1-3, PROCEEDINGS, 2005, : 1061 - 1068
  • [24] A NEW SPECTRAL METHOD FOR l1-REGULARIZED MINIMIZATION
    Wu, Lei
    Sun, Zhe
    INVERSE PROBLEMS AND IMAGING, 2015, 9 (01) : 257 - 272
  • [25] Sparse identification of dynamical systems by reweighted l1-regularized least absolute deviation regression
    He, Xin
    Sun, Zhongkui
    COMMUNICATIONS IN NONLINEAR SCIENCE AND NUMERICAL SIMULATION, 2024, 131
  • [26] Ising model selection using l1-regularized linear regression: a statistical mechanics analysis
    Meng, Xiangming
    Obuchi, Tomoyuki
    Kabashima, Yoshiyuki
    JOURNAL OF STATISTICAL MECHANICS-THEORY AND EXPERIMENT, 2022, 2022 (11):
  • [27] L1-REGULARIZED RECONSTRUCTION FOR TRACTION FORCE MICROSCOPY
    Sune-Aunon, Alejandro
    Jorge-Penas, Alvaro
    Van Oosterwyck, Hans
    Munoz-Barrutia, Arrate
    2016 IEEE 13TH INTERNATIONAL SYMPOSIUM ON BIOMEDICAL IMAGING (ISBI), 2016, : 140 - 144
  • [28] L1-Regularized Reconstruction Error as Alpha Matte
    Johnson, Jubin
    Cholakkal, Hisham
    Rajan, Deepu
    IEEE SIGNAL PROCESSING LETTERS, 2017, 24 (04) : 407 - 411
  • [29] Ising Model Selection Using l1-Regularized Linear Regression: A Statistical Mechanics Analysis
    Meng, Xiangming
    Obuchi, Tomoyuki
    Kabashima, Yoshiyuki
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34
  • [30] Occlusion Handling with l1-Regularized Sparse Reconstruction
    Li, Wei
    Li, Bing
    Zhang, Xiaoqin
    Hu, Weiming
    Wang, Hanzi
    Luo, Guan
    COMPUTER VISION - ACCV 2010, PT IV, 2011, 6495 : 630 - +