DISTRIBUTED SPARSE COMPOSITE QUANTILE REGRESSION IN ULTRAHIGH DIMENSIONS

Cited by: 3
Authors
Chen, Canyi [1 ]
Gu, Yuwen [2 ]
Zou, Hui [3 ]
Zhu, Liping [4 ,5 ]
Affiliations
[1] Renmin Univ China, Inst Stat & Big Data, Beijing 100872, Peoples R China
[2] Univ Connecticut, Dept Stat, Storrs, CT 06269 USA
[3] Univ Minnesota, Sch Stat, Minneapolis, MN 55455 USA
[4] Renmin Univ China, Inst Stat & Big Data, Beijing 100872, Peoples R China
[5] Renmin Univ China, Ctr Appl Stat, Beijing 100872, Peoples R China
Funding
National Natural Science Foundation of China; National Science Foundation (USA);
Keywords
Composite quantile regression; distributed estimation; efficiency; heavy-tailed noise; support recovery; VARIABLE SELECTION; FRAMEWORK; EFFICIENT;
DOI
10.5705/ss.202022.0095
Chinese Library Classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics];
Subject Classification Codes
020208; 070103; 0714;
Abstract
We examine distributed estimation and support recovery for ultrahigh-dimensional linear regression models under a potentially arbitrary noise distribution. Composite quantile regression is an efficient alternative to the least squares method: it is robust against heavy-tailed noise while maintaining reasonable efficiency under light-tailed noise. The highly nonsmooth nature of the composite quantile regression loss poses challenges for both theory and computation in an ultrahigh-dimensional distributed estimation setting. We therefore cast composite quantile regression into the least squares framework and propose a distributed algorithm based on an approximate Newton method. The algorithm is efficient in both computation and communication, requiring only gradient information to be exchanged between machines. We show that the resulting distributed estimator attains a near-oracle rate after a constant number of communication rounds, and we provide theoretical guarantees for its estimation and support recovery accuracy. Extensive experiments demonstrate the competitive empirical performance of our algorithm.
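To make the scheme concrete, below is a minimal Python sketch of one communication round of a gradient-only approximate Newton update for sparse composite quantile regression, in the style of surrogate-likelihood (CSL-type) methods. This is an illustration under stated assumptions, not the authors' implementation: the nonsmooth check loss rho_tau(u) = u(tau - 1{u < 0}) is replaced by a Gaussian-smoothed version so its gradient is well defined, the l1-penalized surrogate subproblem is solved by plain proximal gradient descent, and the paper's exact least-squares reformulation is not reproduced. All function names, the bandwidth h, and the tuning constants are hypothetical.

```python
# Sketch only: a CSL-style distributed approximate Newton round for sparse
# composite quantile regression (CQR). Assumptions, not the paper's code:
# Gaussian-smoothed check loss, proximal gradient solver, illustrative tuning.
import numpy as np
from scipy.stats import norm


def cqr_grad(theta, X, y, taus, h=0.5):
    """Gradient of the Gaussian-smoothed CQR loss at theta = (b_1..b_K, beta)."""
    K = len(taus)
    n, _ = X.shape
    b, beta = theta[:K], theta[K:]
    u = (y - X @ beta)[:, None] - b[None, :]             # n x K residual matrix
    psi = np.asarray(taus)[None, :] - norm.cdf(-u / h)   # smoothed check-loss derivative
    g_b = -psi.sum(axis=0) / (n * K)                     # d/db_k
    g_beta = -(X.T @ psi).sum(axis=1) / (n * K)          # d/dbeta
    return np.concatenate([g_b, g_beta])


def soft_threshold(z, t):
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)


def csl_round(theta, shards, taus, lam, step=0.5, iters=300, h=0.5):
    """One round: every machine ships its local gradient; the first machine
    minimizes its local loss plus a gradient-correction term, with an l1
    penalty on the slopes, via proximal gradient descent."""
    K = len(taus)
    grads = [cqr_grad(theta, X, y, taus, h) for X, y in shards]
    shift = grads[0] - np.mean(grads, axis=0)   # local minus averaged gradient
    X1, y1 = shards[0]
    th = theta.copy()
    for _ in range(iters):
        g = cqr_grad(th, X1, y1, taus, h) - shift
        th = th - step * g
        th[K:] = soft_threshold(th[K:], step * lam)   # penalize slopes only
    return th


if __name__ == "__main__":
    # Toy run: 4 machines, sparse truth, heavy-tailed t_2 noise.
    rng = np.random.default_rng(0)
    p, n_per, m = 50, 200, 4
    beta_true = np.zeros(p)
    beta_true[:3] = [2.0, -1.5, 1.0]
    shards = []
    for _ in range(m):
        X = rng.standard_normal((n_per, p))
        shards.append((X, X @ beta_true + rng.standard_t(df=2, size=n_per)))
    taus = [(k + 1) / 6 for k in range(5)]       # K = 5 quantile levels
    theta = np.zeros(p + len(taus))
    for _ in range(5):                           # a constant number of rounds
        theta = csl_round(theta, shards, taus, lam=0.05)
    print("estimated support:", np.flatnonzero(np.abs(theta[len(taus):]) > 1e-3))
```

Only the (p + K)-dimensional gradient vector travels over the network in each round; the penalized optimization runs entirely on the first machine, which is what makes schemes of this kind communication-efficient.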
Pages: 1143 - 1167
Page count: 25
Related Papers
50 items in total (items [21]-[30] shown)
  • [21] Orthogonal Matching Pursuit for Sparse Quantile Regression
    Aravkin, Aleksandr
    Lozano, Aurelie
    Luss, Ronny
    Kambadur, Prabhanjan
    2014 IEEE INTERNATIONAL CONFERENCE ON DATA MINING (ICDM), 2014, : 11 - 19
  • [22] An efficient algorithm for structured sparse quantile regression
    Nassiri, Vahid
    Loris, Ignace
    COMPUTATIONAL STATISTICS, 2014, 29 : 1321 - 1343
  • [23] Advanced algorithms for penalized quantile and composite quantile regression
    Pietrosanu, Matthew
    Gao, Jueyu
    Kong, Linglong
    Jiang, Bei
    Niu, Di
    COMPUTATIONAL STATISTICS, 2021, 36 (01) : 333 - 346
  • [24] Distributed quantile regression in decentralized optimization
    Shen, Lin
    Chao, Yue
    Ma, Xuejun
    INFORMATION SCIENCES, 2023, 643
  • [25] Smoothing quantile regression for a distributed system
    Jiang, Rong
    Yu, Keming
    NEUROCOMPUTING, 2021, 466 : 311 - 326
  • [27] DISTRIBUTED INFERENCE FOR QUANTILE REGRESSION PROCESSES
    Volgushev, Stanislav
    Chao, Shih-Kang
    Cheng, Guang
    ANNALS OF STATISTICS, 2019, 47 (03) : 1634 - 1662
  • [28] Variable screening for ultrahigh dimensional censored quantile regression
    Pan, Jing
    Zhang, Shucong
    Zhou, Yong
    JOURNAL OF STATISTICAL COMPUTATION AND SIMULATION, 2019, 89 (03) : 395 - 413
  • [29] Robust and efficient sparse learning over networks: a decentralized surrogate composite quantile regression approach
    Qiao, Nan
    Chen, Canyi
    Zhu, Zhengtian
    STATISTICS AND COMPUTING, 2025, 35 (01)
  • [30] Efficient sparse portfolios based on composite quantile regression for high-dimensional index tracking
    Li, Ning
    JOURNAL OF STATISTICAL COMPUTATION AND SIMULATION, 2020, 90 (08) : 1466 - 1478