Model Selection for High-Dimensional Quadratic Regression via Regularization

Cited by: 61
Authors
Hao, Ning [1 ]
Feng, Yang [2 ]
Zhang, Hao Helen [1 ]
Affiliations
[1] Univ Arizona, Dept Math, Tucson, AZ 85721 USA
[2] Columbia Univ, Dept Stat, New York, NY 10027 USA
Keywords
Generalized quadratic regression; Interaction selection; LASSO; Marginality principle; Variable selection; HIERARCHICAL VARIABLE SELECTION; GENERALIZED LINEAR-MODELS; PENALIZED LIKELIHOOD; PATH ALGORITHM; DESIGNED EXPERIMENTS; COORDINATE DESCENT; LASSO;
DOI
10.1080/01621459.2016.1264956
Chinese Library Classification (CLC)
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Subject classification codes
020208; 070103; 0714
Abstract
Quadratic regression (QR) models naturally extend linear models by considering interaction effects between the covariates. To conduct model selection in QR, it is important to maintain the hierarchical model structure between main effects and interaction effects. Existing regularization methods generally achieve this goal by solving complex optimization problems, which usually demand high computational cost and hence are not feasible for high-dimensional data. This article focuses on scalable regularization methods for model selection in high-dimensional QR. We first consider two-stage regularization methods and establish the theoretical properties of the two-stage LASSO. Then, a new regularization method, called the regularization algorithm under marginality principle (RAMP), is proposed to compute a hierarchy-preserving regularization solution path efficiently. Both methods are further extended to solve generalized QR models. Numerical results are presented to demonstrate the performance of the methods.
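To make the two-stage idea concrete, the sketch below (in Python) fits a LASSO on the main effects first, then refits a LASSO on the main effects augmented with the pairwise interactions and squares generated by the selected variables, which is one way to respect the marginality principle. This is a minimal illustration under assumptions, not the authors' RAMP path algorithm: the function name two_stage_lasso_qr, the use of scikit-learn's LassoCV for tuning, and the synthetic example are all hypothetical choices.

# Minimal sketch of a two-stage LASSO for quadratic regression under the
# marginality principle. Illustrative only; not the RAMP path algorithm.
import numpy as np
from itertools import combinations_with_replacement
from sklearn.linear_model import LassoCV

def two_stage_lasso_qr(X, y, random_state=0):
    n, p = X.shape

    # Stage 1: LASSO on main effects only.
    stage1 = LassoCV(cv=5, random_state=random_state).fit(X, y)
    selected = np.flatnonzero(stage1.coef_)  # indices of retained main effects

    # Stage 2: build interactions (and squares) among the selected main
    # effects, then refit LASSO on the augmented design matrix.
    inter_pairs = list(combinations_with_replacement(selected, 2))
    if inter_pairs:
        Z = np.column_stack([X[:, j] * X[:, k] for j, k in inter_pairs])
        X_aug = np.hstack([X, Z])
    else:
        X_aug = X
    stage2 = LassoCV(cv=5, random_state=random_state).fit(X_aug, y)
    return selected, inter_pairs, stage2

# Usage on synthetic data with a hierarchical signal:
# y = 2*x0 + 1.5*x1 + 3*x0*x1 + noise
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 50))
y = 2 * X[:, 0] + 1.5 * X[:, 1] + 3 * X[:, 0] * X[:, 1] + rng.standard_normal(200)
main_idx, pairs, fit = two_stage_lasso_qr(X, y)
print("selected main effects:", main_idx)

Because interactions are formed only among the variables kept in stage 1, the candidate set in stage 2 stays small even when the number of covariates is large, which is the source of the scalability discussed in the abstract.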
Pages: 615-625
Number of pages: 11