Quantile regression with ReLU Networks: Estimators and minimax rates

Cited by: 0
Authors
Padilla, Oscar Hernan Madrid [1]
Tansey, Wesley
Chen, Yanzhen [2]
Affiliations
[1] Department of Statistics, University of California, Los Angeles, 520 Portola Plaza, Los Angeles, CA, USA
[2] Department of Information Systems, Business Statistics & Operations Management, Hong Kong University of Science and Technology, Hong Kong, People's Republic of China
Keywords
Deep networks; robust regression; minimax; sparse networks; NEURAL-NETWORK; FEEDFORWARD NETWORKS; APPROXIMATION; BOUNDS; ERROR;
DOI
N/A
CLC classification
TP [Automation Technology, Computer Technology];
Discipline code
0812 ;
Abstract
Quantile regression is the task of estimating a specified percentile response, such as the median (50th percentile), from a collection of known covariates. We study quantile regression with rectified linear unit (ReLU) neural networks as the chosen model class. We derive an upper bound on the expected mean squared error of a ReLU network used to estimate any quantile conditioning on a set of covariates. This upper bound only depends on the best possible approximation error, the number of layers in the network, and the number of nodes per layer. We further show upper bounds that are tight for two large classes of functions: compositions of Hölder functions and members of a Besov space. These tight bounds imply ReLU networks with quantile regression achieve minimax rates for broad collections of function types. Unlike existing work, the theoretical results hold under minimal assumptions and apply to general error distributions, including heavy-tailed distributions. Empirical simulations on a suite of synthetic response functions demonstrate the theoretical results translate to practical implementations of ReLU networks. Overall, the theoretical and empirical results provide insight into the strong performance of ReLU neural networks for quantile regression across a broad range of function classes and error distributions. All code for this paper is publicly available at https://github.com/tansey/quantile-regression.
Pages: 42
Related papers
50 records
  • [31] A comparison of local constant and local linear regression quantile estimators
    Yu, KM
    Jones, MC
    COMPUTATIONAL STATISTICS & DATA ANALYSIS, 1997, 25 (02) : 159 - 166
  • [32] Convergence rates of deep ReLU networks for multiclass classification
    Bos, Thijs
    Schmidt-Hieber, Johannes
    ELECTRONIC JOURNAL OF STATISTICS, 2022, 16 (01): : 2724 - 2773
  • [33] Deep Huber quantile regression networks
    Tyralis, Hristos
    Papacharalampous, Georgia
    Dogulu, Nilay
    Chun, Kwok P.
    NEURAL NETWORKS, 2025, 187
  • [34] Federated Quantile Regression over Networks
    Huang, Liqi
    Wei, Xin
    Zhu, Peikang
    Gao, Yun
    Chen, Mingkai
    Kang, Bin
    2020 16TH INTERNATIONAL WIRELESS COMMUNICATIONS & MOBILE COMPUTING CONFERENCE, IWCMC, 2020, : 57 - 62
  • [35] Quadratic mixed integer programming models in minimax robust regression estimators
    Zioutas, G
    THEORY AND APPLICATION OF RECENT ROBUST METHODS, 2004, : 387 - 400
  • [36] The Minimax Estimator of Stochastic Regression Coefficients and Parameters in the Class of All Estimators
    Xu, Li Wen
    Wang, Song Gui
    ACTA MATHEMATICA SINICA (ENGLISH SERIES), 2007, 23 (03) : 497 - 506
  • [37] The Minimax Estimator of Stochastic Regression Coefficients and Parameters in the Class of All Estimators
    Li Wen Xu
    Song Gui Wang
    Acta Mathematica Sinica, English Series, 2007, 23 : 497 - 506
  • [38] The minimax estimator of stochastic regression coefficients and parameters in the class of all estimators
    Xu, Li Wen
    Wang, Song Gui
    ACTA MATHEMATICA SINICA-ENGLISH SERIES, 2007, 23 (03) : 497 - 506
  • [39] Robust nonparametric regression based on deep ReLU neural networks
    Chen, Juntong
    JOURNAL OF STATISTICAL PLANNING AND INFERENCE, 2024, 233
  • [40] Rates of strong consistency for nonparametric regression estimators
    Blondin, D
    Massiani, A
    Ribereau, P
    COMPTES RENDUS MATHEMATIQUE, 2005, 340 (07) : 525 - 528