Quantile regression with ReLU Networks: Estimators and minimax rates
Cited by: 0
Authors:
Padilla, Oscar Hernan Madrid [1]
Tansey, Wesley
Chen, Yanzhen [2]
Affiliations:
[1] Dept of Statistics, University of California, Los Angeles, 520 Portola Plaza, Los Angeles, CA, USA
[2] Dept of Information Systems, Business Statistics & Operations Management, Hong Kong University of Science & Technology, Hong Kong, Peoples R China
Keywords: Deep networks; robust regression; minimax; sparse networks; NEURAL-NETWORK; FEEDFORWARD NETWORKS; APPROXIMATION; BOUNDS; ERROR
DOI: not available
Chinese Library Classification: TP [Automation & Computer Technology]
Discipline Code: 0812
Abstract:
Quantile regression is the task of estimating a specified percentile response, such as the median (50th percentile), from a collection of known covariates. We study quantile regression with rectified linear unit (ReLU) neural networks as the chosen model class. We derive an upper bound on the expected mean squared error of a ReLU network used to estimate any quantile conditional on a set of covariates. This upper bound depends only on the best possible approximation error, the number of layers in the network, and the number of nodes per layer. We further show upper bounds that are tight for two large classes of functions: compositions of Hölder functions and members of a Besov space. These tight bounds imply that ReLU networks with quantile regression achieve minimax rates for broad collections of function types. Unlike existing work, the theoretical results hold under minimal assumptions and apply to general error distributions, including heavy-tailed distributions. Empirical simulations on a suite of synthetic response functions demonstrate that the theoretical results translate to practical implementations of ReLU networks. Overall, the theoretical and empirical results provide insight into the strong performance of ReLU neural networks for quantile regression across a broad range of function classes and error distributions. All code for this paper is publicly available at https://github.com/tansey/quantile-regression.
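For readers who want a concrete starting point, the sketch below is a minimal, hypothetical illustration of the setup the abstract describes: a fully connected ReLU network trained with the pinball (check) loss rho_tau(u) = u * (tau - 1{u < 0}), whose population minimizer is the tau-th conditional quantile. It assumes PyTorch; the helper names (pinball_loss, relu_net), the width/depth choices, and the Cauchy-noise example are illustrative, and it is not the authors' implementation (their code is at the GitHub link above).

```python
# A minimal sketch (not the authors' implementation): quantile regression
# with a fully connected ReLU network trained via the pinball (check) loss.
import math
import torch
import torch.nn as nn

def pinball_loss(pred, target, tau):
    """Check loss rho_tau(u) = u * (tau - 1{u < 0}), averaged over samples.
    Its population minimizer is the tau-th conditional quantile."""
    u = target - pred
    return torch.mean(torch.maximum(tau * u, (tau - 1.0) * u))

def relu_net(in_dim, width=64, depth=3):
    """Fully connected ReLU network; depth and width are the two network
    quantities the paper's upper bound depends on (besides approximation error)."""
    layers = [nn.Linear(in_dim, width), nn.ReLU()]
    for _ in range(depth - 1):
        layers += [nn.Linear(width, width), nn.ReLU()]
    layers.append(nn.Linear(width, 1))
    return nn.Sequential(*layers)

# Illustrative data: estimate the conditional median (tau = 0.5) under
# heavy-tailed Cauchy noise, the kind of error distribution the theory covers.
torch.manual_seed(0)
n, d, tau = 2000, 5, 0.5
X = torch.rand(n, d)
noise = torch.distributions.Cauchy(0.0, 1.0).sample((n, 1))
y = torch.sin(2 * math.pi * X[:, :1]) + 0.1 * noise

net = relu_net(d)
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
for _ in range(200):
    opt.zero_grad()
    loss = pinball_loss(net(X), y, tau)
    loss.backward()
    opt.step()
```

Setting tau to 0.9 instead of 0.5 would target the 90th conditional percentile with no other changes, which is the sense in which the same architecture and loss family cover any quantile level.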
Pages: 42