DISTRIBUTED SPARSE COMPOSITE QUANTILE REGRESSION IN ULTRAHIGH DIMENSIONS

Cited: 3
Authors
Chen, Canyi [1 ]
Gu, Yuwen [2 ]
Zou, Hui [3 ]
Zhu, Liping [4 ,5 ]
Affiliations
[1] Renmin Univ China, Inst Stat & Big Data, Beijing 100872, Peoples R China
[2] Univ Connecticut, Dept Stat, Storrs, CT 06269 USA
[3] Univ Minnesota, Sch Stat, Minneapolis, MN 55455 USA
[4] Renmin Univ China, Inst Stat & Big Data, Beijing 100872, Peoples R China
[5] Renmin Univ China, Ctr Appl Stat, Beijing 100872, Peoples R China
Funding
National Natural Science Foundation of China; U.S. National Science Foundation
Keywords
Composite quantile regression; distributed estimation; efficiency; heavy-tailed noise; support recovery; VARIABLE SELECTION; FRAMEWORK; EFFICIENT
DOI
10.5705/ss.202022.0095
CLC classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Discipline codes
020208; 070103; 0714
Abstract
We examine distributed estimation and support recovery for ultrahigh dimensional linear regression models under a potentially arbitrary noise distribution. The composite quantile regression is an efficient alternative to the least squares method, and provides robustness against heavy-tailed noise while maintaining reasonable efficiency in the case of light-tailed noise. The highly nonsmooth nature of the composite quantile regression loss poses challenges to both the theoretical and the computational development in an ultrahigh-dimensional distributed estimation setting. Thus, we cast the composite quantile regression into the least squares framework, and propose a distributed algorithm based on an approximate Newton method. This algorithm is efficient in terms of both computation and communication, and requires only gradient information to be communicated between the machines. We show that the resultant distributed estimator attains a near-oracle rate after a constant number of communications, and provide theoretical guarantees for its estimation and support recovery accuracy. Extensive experiments demonstrate the competitive empirical performance of our algorithm.
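The record contains only the abstract, not the algorithm itself, but the communication pattern it describes (only gradients travel between machines) can be illustrated with a minimal sketch: a composite quantile regression (CQR) loss over K quantile levels, minimized by having each machine send its local subgradient to a center that averages them and takes a step. The data, step size, quantile levels, and plain subgradient update below are all illustrative assumptions; the paper's actual method is an approximate Newton update with a sparsity penalty, which this sketch omits.

```python
import numpy as np

def cqr_subgradient(X, y, beta, b, taus):
    """Averaged subgradient of the composite quantile loss
    (1/K) * sum_k mean_i rho_{tau_k}(y_i - b_k - x_i' beta)
    on one machine's local data, where rho_tau(u) = u * (tau - 1{u < 0})."""
    n, p = X.shape
    g_beta, g_b = np.zeros(p), np.zeros(len(taus))
    for k, tau in enumerate(taus):
        r = y - b[k] - X @ beta
        w = (r < 0).astype(float) - tau  # chain rule: d/dbeta flips the sign of drho/dr
        g_beta += X.T @ w / n
        g_b[k] = w.mean()
    return g_beta / len(taus), g_b / len(taus)

rng = np.random.default_rng(0)
p, M, n = 10, 4, 500                         # dimension, machines, per-machine sample size
beta_true = np.zeros(p)
beta_true[:3] = [2.0, -1.5, 1.0]             # sparse ground truth
machines = []
for _ in range(M):
    X = rng.standard_normal((n, p))
    y = X @ beta_true + rng.standard_t(df=3, size=n)  # heavy-tailed noise
    machines.append((X, y))

taus = np.arange(1, 10) / 10                 # K = 9 equally spaced quantile levels
beta, b = np.zeros(p), np.zeros(len(taus))
for _ in range(300):
    # each machine communicates only its local gradient; the center averages
    grads = [cqr_subgradient(X, y, beta, b, taus) for X, y in machines]
    beta -= 1.0 * np.mean([g for g, _ in grads], axis=0)
    b -= 1.0 * np.mean([g for _, g in grads], axis=0)
```

The K intercepts `b` absorb the K noise quantiles, so the slope `beta` is shared across quantile levels; this pooling is what gives CQR its efficiency under heavy-tailed noise relative to a single quantile fit.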
Pages: 1143-1167 (25 pages)