Distributed adaptive Huber regression

Cited: 9
Authors
Luo, Jiyu [1 ]
Sun, Qiang [2 ]
Zhou, Wen-Xin [3 ]
Institutions
[1] Univ Calif San Diego, Herbert Wertheim Sch Publ Hlth & Human Longev Sci, Div Biostat, San Diego, CA 92093 USA
[2] Univ Toronto, Dept Stat Sci, Toronto, ON M5S 3G3, Canada
[3] Univ Calif San Diego, Dept Math, La Jolla, CA 92093 USA
Funding
Natural Sciences and Engineering Research Council of Canada; US National Science Foundation
Keywords
Adaptive Huber regression; Communication efficiency; Distributed inference; Heavy-tailed distribution; Nonasymptotic analysis; Robust regression; Quantile regression; M-estimators; Asymptotic behavior; Linear regression; Parameters
DOI
10.1016/j.csda.2021.107419
Chinese Library Classification
TP39 [Computer Applications]
Subject Classification Codes
081203; 0835
Abstract
Distributed data naturally arise in scenarios involving multiple sources of observations, each stored at a different location. Directly pooling all the data together is often prohibited by limited bandwidth and storage, or by privacy protocols. A new robust distributed algorithm is introduced for fitting linear regressions when data are subject to heavy-tailed and/or asymmetric errors with finite second moments. The algorithm communicates only gradient information at each iteration, and is therefore communication-efficient. The key to achieving the bias-robustness tradeoff is a novel double-robustification approach that applies to both the local and global objective functions. Statistically, the resulting estimator achieves the centralized nonasymptotic error bound as if all the data were pooled together and came from a distribution with sub-Gaussian tails. Under a finite (2+δ)-th moment condition, a Berry-Esseen bound for the distributed estimator is established, based on which robust confidence intervals are constructed. In high dimensions, the proposed doubly-robustified loss function is complemented with ℓ1-penalization for fitting sparse linear models with distributed data. Numerical studies confirm that, compared with extant distributed methods, the proposed methods achieve near-optimal accuracy with low variability and better coverage with tighter confidence intervals. © 2021 Elsevier B.V. All rights reserved.
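To make the gradient-only communication pattern concrete, here is a minimal Python sketch. It is an illustration under stated assumptions, not the paper's algorithm: it runs plain averaged-gradient descent on the global Huber loss, and the plug-in rule for the robustification parameter tau (a crude scale estimate times sqrt(N / (d + log N))) is borrowed from the adaptive Huber literature, whereas the paper's double robustification of the local and global objectives is more refined.

```python
import numpy as np

def huber_score(r, tau):
    """Huber score psi_tau(r): identity on [-tau, tau], clipped beyond."""
    return np.clip(r, -tau, tau)

def distributed_huber_gd(X_parts, y_parts, n_rounds=300, lr=0.5):
    """Fit Huber regression over m machines; each round, every machine
    communicates only its local gradient (one d-vector) to the center.

    The truncation level tau follows a simple plug-in rule
    tau ~ sigma_hat * sqrt(N / (d + log N)); this tuning is an
    illustrative assumption, not the paper's adaptive choice."""
    d = X_parts[0].shape[1]
    N = sum(len(y) for y in y_parts)
    beta = np.zeros(d)
    sigma_hat = np.std(np.concatenate(y_parts))   # crude scale estimate
    tau = sigma_hat * np.sqrt(N / (d + np.log(N)))
    for _ in range(n_rounds):
        grad = np.zeros(d)
        for X, y in zip(X_parts, y_parts):        # computed on each worker
            r = y - X @ beta
            grad -= X.T @ huber_score(r, tau)     # local gradient, sent to center
        beta -= lr * grad / N                     # center aggregates and updates
    return beta

# Demo: heavy-tailed t(2.5) errors, split across 10 machines.
rng = np.random.default_rng(0)
m, n, d = 10, 500, 5
beta_true = np.ones(d)
X_parts, y_parts = [], []
for _ in range(m):
    X = rng.standard_normal((n, d))
    y = X @ beta_true + rng.standard_t(df=2.5, size=n)  # finite (2+delta)-th moment
    X_parts.append(X)
    y_parts.append(y)
beta_hat = distributed_huber_gd(X_parts, y_parts)
print("estimation error:", np.linalg.norm(beta_hat - beta_true))
```

Each round costs one d-dimensional vector per machine in each direction, which is the sense in which such schemes are communication-efficient.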
Pages: 23
Related Papers
50 records in total
  • [31] Diffusion normalized Huber adaptive filtering algorithm
    Li, Zhi
    Guan, Sihai
    JOURNAL OF THE FRANKLIN INSTITUTE-ENGINEERING AND APPLIED MATHEMATICS, 2018, 355 (08) : 3812 - 3825
  • [33] Nonasymptotic analysis of robust regression with modified Huber's loss
    Tong, Hongzhi
    JOURNAL OF COMPLEXITY, 2023, 76
  • [34] Nonlinear regression Huber-based divided difference filtering
    Li, Wei
    Liu, Meihong
    PROCEEDINGS OF THE INSTITUTION OF MECHANICAL ENGINEERS PART G-JOURNAL OF AEROSPACE ENGINEERING, 2017, 231 (05) : 799 - 808
  • [35] Degrees of freedom for regularized regression with Huber loss and linear constraints
    Liu, Yongxin
    Zeng, Peng
    Lin, Lu
    STATISTICAL PAPERS, 2021, 62 : 2383 - 2405
  • [36] On Regularization Based Twin Support Vector Regression with Huber Loss
    Gupta, Umesh
    Gupta, Deepak
    NEURAL PROCESSING LETTERS, 2021, 53 (01) : 459 - 515
  • [37] An efficient dual ADMM for Huber regression with fused lasso penalty
    Shi, Mengjiao
    Xiao, Yunhai
    COMMUNICATIONS IN STATISTICS-SIMULATION AND COMPUTATION, 2025
  • [38] Robust Support Vector Regression in Primal with Asymmetric Huber Loss
    Balasundaram, S.
    Meena, Yogendra
    NEURAL PROCESSING LETTERS, 2019, 49 (03) : 1399 - 1431
  • [40] Online updating Huber robust regression for big data streams
    Tao, Chunbai
    Wang, Shanshan
    STATISTICS, 2024, 58 (05) : 1197 - 1223