Tuning parameter calibration for l1-regularized logistic regression

Cited by: 10
Authors
Li, Wei [1 ]
Lederer, Johannes [2 ,3 ]
Affiliations
[1] Peking Univ, Sch Math Sci, Beijing, Peoples R China
[2] Univ Washington, Dept Stat, Seattle, WA 98195 USA
[3] Univ Washington, Dept Biostat, Seattle, WA 98195 USA
Keywords
Feature selection; Penalized logistic regression; Tuning parameter calibration; VARIABLE SELECTION; MODEL SELECTION; LASSO; CLASSIFICATION; PREDICTION;
DOI
10.1016/j.jspi.2019.01.006
Chinese Library Classification (CLC)
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Subject classification codes
020208; 070103; 0714
Abstract
Feature selection is a standard approach to understanding and modeling high-dimensional classification data, but the corresponding statistical methods hinge on tuning parameters that are difficult to calibrate. In particular, existing calibration schemes in the logistic regression framework lack any finite sample guarantees. In this paper, we introduce a novel calibration scheme for l1-penalized logistic regression. It is based on simple tests along the tuning parameter path and is equipped with optimal guarantees for feature selection. It is also amenable to easy and efficient implementations, and it rivals or outmatches existing methods in simulations and real data applications. (C) 2019 Elsevier B.V. All rights reserved.
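To make the path-based idea concrete, the following Python sketch fits an l1-penalized logistic regression over a decreasing grid of tuning parameters and stops just before the first estimate that drifts too far, in sup-norm, from the estimates at larger parameters. It is a hypothetical illustration only: the solver (scikit-learn's LogisticRegression with the mapping C = 1/(n*lambda)), the pairwise threshold c*(lambda_i + lambda_j), the constant c, and the lambda grid are assumptions made for demonstration, not the exact test statistic or the guarantees established in the paper.

    # Hypothetical sketch: calibrate the l1 tuning parameter for logistic
    # regression with simple pairwise tests along a decreasing lambda grid.
    # The test below (sup-norm distance vs. c * (lam_i + lam_j)) is a
    # placeholder, not the statistic proposed in Li and Lederer (2019).
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    def l1_logistic_path(X, y, lambdas):
        """Fit l1-penalized logistic regression for each tuning parameter."""
        n = X.shape[0]
        coefs = []
        for lam in lambdas:
            # scikit-learn's objective uses C ~ 1 / (n * lambda) for a
            # per-observation penalty level lambda (assumed mapping).
            clf = LogisticRegression(penalty="l1", C=1.0 / (n * lam),
                                     solver="liblinear", max_iter=1000)
            clf.fit(X, y)
            coefs.append(clf.coef_.ravel())
        return np.asarray(coefs)

    def calibrate(X, y, lambdas, c=1.0):
        """Walk down the (decreasing) lambda grid; stop just before the first
        estimate that is too far, in sup-norm, from any estimate at a larger
        lambda, and return the previous tuning parameter and estimate."""
        coefs = l1_logistic_path(X, y, lambdas)
        for i in range(1, len(lambdas)):
            for j in range(i):
                if np.max(np.abs(coefs[i] - coefs[j])) > c * (lambdas[i] + lambdas[j]):
                    return lambdas[i - 1], coefs[i - 1]
        return lambdas[-1], coefs[-1]

    # Small synthetic example: 3 active features out of 50.
    rng = np.random.default_rng(0)
    X = rng.standard_normal((200, 50))
    beta = np.zeros(50)
    beta[:3] = 2.0
    y = (rng.random(200) < 1.0 / (1.0 + np.exp(-X @ beta))).astype(int)
    lam_hat, beta_hat = calibrate(X, y, np.geomspace(0.5, 0.01, 20))
    print("selected lambda:", lam_hat, "| selected features:", int(np.sum(beta_hat != 0)))

The design intent of such a rule is to pick the smallest tuning parameter whose estimate is still consistent with all estimates at larger parameters, which is the general flavor of tests along the tuning parameter path.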
Pages: 80-98
Page count: 19
Related papers (50 in total)
  • [1] Yuan, Guo-Xun; Ho, Chia-Hua; Lin, Chih-Jen. An Improved GLMNET for L1-regularized Logistic Regression. Journal of Machine Learning Research, 2012, 13:1999-2030.
  • [2] Trofimov, Ilya; Genkin, Alexander. Distributed Coordinate Descent for L1-regularized Logistic Regression. Analysis of Images, Social Networks and Texts (AIST 2015), 2015, 542:243-254.
  • [3] Sha, Fei; Park, Y. Albert; Saul, Lawrence K. Multiplicative updates for L1-regularized linear and logistic regression. Advances in Intelligent Data Analysis VII, Proceedings, 2007, 4723:13+.
  • [4] Pan, Xianli; Xu, Yitian. A Safe Feature Elimination Rule for L1-Regularized Logistic Regression. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2022, 44(9):4544-4554.
  • [5] Malago, Luigi; Matteucci, Matteo; Valentini, Gabriele. Introducing l1-regularized Logistic Regression in Markov Networks based EDAs. 2011 IEEE Congress on Evolutionary Computation (CEC), 2011, 1581-1588.
  • [6] Luo, Si-Shu; Weng, Yang; Wang, Wei-Wei; Hong, Wen-Xing. L1-regularized Logistic Regression for Event-driven Stock Market Prediction. 2017 12th International Conference on Computer Science and Education (ICCSE 2017), 2017, 536-541.
  • [7] Shi, Jianing; Yin, Wotao; Osher, Stanley; Sajda, Paul. A Fast Hybrid Algorithm for Large-Scale l1-Regularized Logistic Regression. Journal of Machine Learning Research, 2010, 11:713-741.
  • [8] Ravikumar, Pradeep; Wainwright, Martin J.; Lafferty, John D. High-Dimensional Ising Model Selection Using l1-Regularized Logistic Regression. Annals of Statistics, 2010, 38(3):1287-1319.
  • [9] Koh, Kwangmoo; Kim, Seung-Jean; Boyd, Stephen. An interior-point method for large-scale l1-regularized logistic regression. Journal of Machine Learning Research, 2007, 8:1519-1555.
  • [10] Escobar, Carlos A.; Morales-Menendez, Ruben. Process-Monitoring-for-Quality - A Model Selection Criterion for l1-Regularized Logistic Regression. 47th SME North American Manufacturing Research Conference (NAMRC 47), 2019, 34:832-839.