Advanced algorithms for penalized quantile and composite quantile regression

Cited by: 20
Authors
Pietrosanu, Matthew [1 ]
Gao, Jueyu [1 ]
Kong, Linglong [1 ]
Jiang, Bei [1 ]
Niu, Di [2 ]
Affiliations
[1] Univ Alberta, Dept Math & Stat Sci, Edmonton, AB, Canada
[2] Univ Alberta, Dept Elect & Comp Engn, Edmonton, AB, Canada
Funding
Natural Sciences and Engineering Research Council of Canada;
Keywords
Adaptive lasso; Alternating direction method of multipliers; Coordinate descent; Interior point; Majorize minimization; COORDINATE DESCENT METHOD; VARIABLE SELECTION; CONVERGENCE; LIKELIHOOD; LASSO;
DOI
10.1007/s00180-020-01010-1
Chinese Library Classification (CLC)
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics];
Subject Classification Codes
020208 ; 070103 ; 0714 ;
Abstract
In this paper, we discuss a family of robust, high-dimensional regression models for quantile and composite quantile regression, both with and without an adaptive lasso penalty for variable selection. We reformulate these quantile regression problems and obtain estimators by applying the alternating direction method of multipliers (ADMM), majorize-minimization (MM), and coordinate descent (CD) algorithms. Our new approaches address the lack of publicly available methods for (composite) quantile regression, especially for high-dimensional data, both with and without regularization. Through simulation studies, we demonstrate the need for different algorithms suited to a variety of data settings and implement them in the cqrReg package for R. For comparison, we also introduce the widely used interior point (IP) formulation and test our methods against the IP algorithms in the existing quantreg package. Our simulation studies show that each of our methods excels in a different setting, MM with large data sets and CD with high-dimensional data in particular, and outperforms the methods currently implemented in quantreg. The ADMM approach is particularly promising for future development because it is amenable to parallelization and scales well.
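As a point of reference for the comparison described above, the short R sketch below fits a median (tau = 0.5) quantile regression with the interior-point (Frisch-Newton) solver from quantreg, the baseline against which the ADMM, MM, and CD implementations in cqrReg are tested. The simulated design, sample sizes, and variable names are illustrative assumptions, not the paper's actual simulation settings.

## Minimal sketch: median quantile regression with quantreg's interior-point solver.
## The simulated data below are an illustrative assumption, not the paper's design.
library(quantreg)

set.seed(1)
n <- 200; p <- 10
X <- matrix(rnorm(n * p), n, p)
beta <- c(2, -1.5, rep(0, p - 2))        # sparse true coefficient vector
y <- drop(X %*% beta) + rt(n, df = 3)    # heavy-tailed errors motivate the quantile loss

## method = "fn" selects the Frisch-Newton interior-point algorithm in quantreg,
## i.e. the IP baseline that the ADMM, MM, and CD solvers are compared against.
fit_ip <- rq(y ~ X, tau = 0.5, method = "fn")
coef(fit_ip)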
Pages: 333-346
Number of pages: 14