The Impact of Regularization on High-dimensional Logistic Regression

Cited by: 0
Authors
Salehi, Fariborz [1]
Abbasi, Ehsan [1]
Hassibi, Babak [1]
Affiliations
[1] CALTECH, Dept Elect Engn, Pasadena, CA 91125 USA
Funding
US National Science Foundation;
Keywords
GENERALIZED LINEAR-MODELS; SELECTION;
DOI
None available
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Logistic regression is commonly used for modeling dichotomous outcomes. In the classical setting, where the number of observations is much larger than the number of parameters, the properties of the maximum likelihood estimator in logistic regression are well understood. Recently, Sur and Candes [26] studied logistic regression in the high-dimensional regime, where the number of observations and the number of parameters are comparable, and showed, among other things, that the maximum likelihood estimator is biased. In the high-dimensional regime the underlying parameter vector is often structured (sparse, block-sparse, finite-alphabet, etc.), so in this paper we study regularized logistic regression (RLR), in which a convex regularizer that encourages the desired structure is added to the negative log-likelihood function. An advantage of RLR is that it allows parameter recovery even in instances where the (unconstrained) maximum likelihood estimate does not exist. We provide a precise analysis of the performance of RLR via the solution of a system of six nonlinear equations, through which any performance metric of interest (mean, mean-squared error, probability of support recovery, etc.) can be explicitly computed. Our results generalize those of Sur and Candes, and we provide a detailed study of the cases of ℓ2²-RLR and sparse (ℓ1-regularized) logistic regression. In both cases we obtain explicit expressions for various performance metrics and can find the values of the regularizer parameter that optimize the desired performance. The theory is validated by extensive numerical simulations across a range of parameter values and problem instances.
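To make the RLR objective in the abstract concrete, the following is a minimal NumPy sketch of ℓ2²-regularized logistic regression fit by plain gradient descent on the penalized negative log-likelihood. It is purely illustrative: the solver, step size, and simulated data are assumptions of this sketch, not the paper's method (the paper analyzes the high-dimensional asymptotics of such estimators, not a particular algorithm).

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_rlr(X, y, lam, lr=0.1, n_iter=2000):
    """Minimize (1/n) * negative log-likelihood + (lam/2) * ||beta||_2^2
    by gradient descent (an illustrative solver, not the paper's analysis)."""
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(n_iter):
        # Gradient of the averaged logistic loss plus the ridge penalty.
        grad = X.T @ (sigmoid(X @ beta) - y) / n + lam * beta
        beta -= lr * grad
    return beta

# Simulated data with a sparse ground-truth parameter vector.
rng = np.random.default_rng(0)
n, p = 200, 20
beta_true = np.zeros(p)
beta_true[:5] = 2.0
X = rng.standard_normal((n, p))
y = (rng.random(n) < sigmoid(X @ beta_true)).astype(float)

beta_weak = fit_rlr(X, y, lam=0.01)   # light regularization
beta_strong = fit_rlr(X, y, lam=1.0)  # heavy regularization
# A stronger regularizer shrinks the estimate toward zero.
print(np.linalg.norm(beta_strong), np.linalg.norm(beta_weak))
```

Sweeping `lam` and recording a metric such as mean-squared error against `beta_true` is the empirical analogue of the paper's search for the regularizer parameter that optimizes a desired performance measure.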
Pages: 11
Related Papers
50 items total
  • [21] Minimax Sparse Logistic Regression for Very High-Dimensional Feature Selection
    Tan, Mingkui
    Tsang, Ivor W.
    Wang, Li
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2013, 24 (10) : 1609 - 1622
  • [22] STATISTICAL INFERENCE FOR GENETIC RELATEDNESS BASED ON HIGH-DIMENSIONAL LOGISTIC REGRESSION
    Ma, Rong
    Guo, Zijian
    Cai, T. Tony
    Li, Hongzhe
    STATISTICA SINICA, 2024, 34 (02) : 1023 - 1043
  • [23] Robust and sparse estimation methods for high-dimensional linear and logistic regression
    Kurnaz, Fatma Sevinc
    Hoffmann, Irene
    Filzmoser, Peter
    CHEMOMETRICS AND INTELLIGENT LABORATORY SYSTEMS, 2018, 172 : 211 - 222
  • [24] Debiased inference for heterogeneous subpopulations in a high-dimensional logistic regression model
    Kim, Hyunjin
    Lee, Eun Ryung
    Park, Seyoung
    SCIENTIFIC REPORTS, 2023, 13 (01)
  • [25] Robust Variable Selection with Optimality Guarantees for High-Dimensional Logistic Regression
    Insolia, Luca
    Kenney, Ana
    Calovi, Martina
    Chiaromonte, Francesca
    STATS, 2021, 4 (03): : 665 - 681
  • [26] SLOE: A Faster Method for Statistical Inference in High-Dimensional Logistic Regression
    Yadlowsky, Steve
    Yun, Taedong
    McLean, Cory
    D'Amour, Alexander
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021
  • [27] Regularization Methods for High-Dimensional Instrumental Variables Regression With an Application to Genetical Genomics
    Lin, Wei
    Feng, Rui
    Li, Hongzhe
    JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION, 2015, 110 (509) : 270 - 288
  • [28] Improving Penalized Logistic Regression Model with Missing Values in High-Dimensional Data
    Alharthi, Aiedh Mrisi
    Lee, Muhammad Hisyam
    Algamal, Zakariya Yahya
    INTERNATIONAL JOURNAL OF ONLINE AND BIOMEDICAL ENGINEERING, 2022, 18 (02) : 40 - 54
  • [29] Semi-Supervised Factored Logistic Regression for High-Dimensional Neuroimaging Data
    Bzdok, Danilo
    Eickenberg, Michael
    Grisel, Olivier
    Thirion, Bertrand
    Varoquaux, Gael
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 28 (NIPS 2015), 2015, 28
  • [30] FDR control and power analysis for high-dimensional logistic regression via StabKoff
    Yuan, Panxu
    Kong, Yinfei
    Li, Gaorong
    STATISTICAL PAPERS, 2024, 65 (05) : 2719 - 2749