DISTRIBUTED SPARSE COMPOSITE QUANTILE REGRESSION IN ULTRAHIGH DIMENSIONS

Cited by: 3
Authors
Chen, Canyi [1 ]
Gu, Yuwen [2 ]
Zou, Hui [3 ]
Zhu, Liping [4 ,5 ]
Affiliations
[1] Renmin Univ China, Inst Stat & Big Data, Beijing 100872, Peoples R China
[2] Univ Connecticut, Dept Stat, Storrs, CT 06269 USA
[3] Univ Minnesota, Sch Stat, Minneapolis, MN 55455 USA
[4] Renmin Univ China, Inst Stat & Big Data, Beijing 100872, Peoples R China
[5] Renmin Univ China, Ctr Appl Stat, Beijing 100872, Peoples R China
Funding
National Natural Science Foundation of China; National Science Foundation (USA)
Keywords
Composite quantile regression; distributed estimation; efficiency; heavy-tailed noise; support recovery; variable selection; framework; efficient
DOI
10.5705/ss.202022.0095
Chinese Library Classification (CLC)
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Discipline Codes
020208; 070103; 0714
Abstract
We examine distributed estimation and support recovery for ultrahigh dimensional linear regression models under a potentially arbitrary noise distribution. The composite quantile regression is an efficient alternative to the least squares method, and provides robustness against heavy-tailed noise while maintaining reasonable efficiency in the case of light-tailed noise. The highly nonsmooth nature of the composite quantile regression loss poses challenges to both the theoretical and the computational development in an ultrahigh-dimensional distributed estimation setting. Thus, we cast the composite quantile regression into the least squares framework, and propose a distributed algorithm based on an approximate Newton method. This algorithm is efficient in terms of both computation and communication, and requires only gradient information to be communicated between the machines. We show that the resultant distributed estimator attains a near-oracle rate after a constant number of communications, and provide theoretical guarantees for its estimation and support recovery accuracy. Extensive experiments demonstrate the competitive empirical performance of our algorithm.
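For context, composite quantile regression combines the quantile check loss over K levels tau_k = k/(K+1). The display below is a minimal sketch in standard notation, not necessarily the exact formulation used in the paper:

\[
% standard sparse CQR objective with common slope and level-specific intercepts
(\hat b_1,\ldots,\hat b_K,\hat\beta)
  = \operatorname*{arg\,min}_{b_1,\ldots,b_K,\;\beta}\;
    \sum_{k=1}^{K}\sum_{i=1}^{n}
    \rho_{\tau_k}\!\bigl(y_i - b_k - x_i^{\top}\beta\bigr)
    + \lambda\lVert\beta\rVert_1,
\qquad
\rho_\tau(u) = u\bigl\{\tau - I(u<0)\bigr\}.
\]

A generic gradient-only approximate-Newton (CSL-type) round of the kind the abstract alludes to has each of the m machines send only its local gradient \(\nabla L_j(\bar\beta)\) at the current iterate \(\bar\beta\), after which the first machine minimizes a shifted local loss:

\[
% illustrative CSL-type surrogate; the paper's construction, which first
% recasts the CQR loss in a least-squares form, may differ from this
\tilde L(\beta)
  = L_1(\beta)
  - \Bigl\langle \nabla L_1(\bar\beta) - \frac{1}{m}\sum_{j=1}^{m}\nabla L_j(\bar\beta),\; \beta \Bigr\rangle
  + \lambda\lVert\beta\rVert_1 .
\]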
Pages: 1143 - 1167
Number of pages: 25
Related Papers (50 in total)
  • [1] Sparse Composite Quantile Regression in Ultrahigh Dimensions With Tuning Parameter Calibration
    Gu, Yuwen
    Zou, Hui
    IEEE TRANSACTIONS ON INFORMATION THEORY, 2020, 66 (11) : 7132 - 7154
  • [2] SPARSE COMPOSITE QUANTILE REGRESSION WITH ULTRAHIGH-DIMENSIONAL HETEROGENEOUS DATA
    Qu, Lianqiang
    Hao, Meiling
    Sun, Liuquan
    STATISTICA SINICA, 2022, 32 (01) : 459 - 475
  • [3] Communication-efficient sparse composite quantile regression for distributed data
    Yang, Yaohong
    Wang, Lei
    METRIKA, 2023, 86 (03) : 261 - 283
  • [4] Debiased distributed quantile regression in high dimensions
    He, Yiran
    Chen, Canyi
    Xu, Wangli
    STATISTICS AND ITS INTERFACE, 2024, 17 (03) : 337 - 347
  • [5] Distributed Quantile Regression with Non-Convex Sparse Penalties
    Mirzaeifard, Reza
    Gogineni, Vinay Chakravarthi
    Venkategowda, Naveen K. D.
    Werner, Stefan
    2023 IEEE STATISTICAL SIGNAL PROCESSING WORKSHOP, SSP, 2023, : 250 - 254
  • [6] Sparse quantile regression
    Chen, Le-Yu
    Lee, Sokbae
    JOURNAL OF ECONOMETRICS, 2023, 235 (02) : 2195 - 2217
  • [7] Sparse and debiased lasso estimation and inference for high-dimensional composite quantile regression with distributed data
    Hou, Zhaohan
    Ma, Wei
    Wang, Lei
    TEST, 2023, 32 (04) : 1230 - 1250
  • [8] Composite quantile regression for a distributed system with non-randomly distributed data
    Jin, Jun
    Hao, Chenyan
    Chen, Yewen
    STATISTICAL PAPERS, 2025, 66 (01)
  • [9] Multi-round smoothed composite quantile regression for distributed data
    Di, Fengrui
    Wang, Lei
    ANNALS OF THE INSTITUTE OF STATISTICAL MATHEMATICS, 2022, 74 (05) : 869 - 893