CONSISTENCIES AND RATES OF CONVERGENCE OF JUMP-PENALIZED LEAST SQUARES ESTIMATORS

Cited: 106
Authors
Boysen, Leif [1 ]
Kempe, Angela [2 ]
Liebscher, Volkmar [3 ]
Munk, Axel [1 ]
Wittich, Olaf [4 ]
Affiliations
[1] Univ Gottingen, Inst Stat Math, D-37073 Gottingen, Germany
[2] GSF, Natl Res Ctr Environm, Inst Biomath & Biometry, D-85764 Neuherberg, Germany
[3] Univ Greifswald, Dept Math & Comp Sci, D-17487 Greifswald, Germany
[4] Tech Univ Eindhoven, Dept Math & Comp Sci, NL-5600 MB Eindhoven, Netherlands
Source
ANNALS OF STATISTICS | 2009, Vol. 37, No. 1
Keywords
Jump detection; adaptive estimation; penalized maximum likelihood; approximation spaces; change-point analysis; multiscale resolution analysis; Potts functional; nonparametric regression; regressogram; Skorokhod topology; variable selection; LARGE UNDERDETERMINED SYSTEMS; BAYESIAN RESTORATION; CHANGE-POINTS; REGRESSION; SHRINKAGE; SMOOTHERS; EQUATIONS; SEQUENCE
DOI
10.1214/07-AOS558
CLC Classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Subject Classification Codes
020208; 070103; 0714
Abstract
We study the asymptotics of jump-penalized least squares regression, which aims at approximating a regression function by piecewise constant functions. Besides conventional consistency and convergence rates of the estimates in L_2([0, 1)), our results cover other metrics, such as the Skorokhod metric on the space of càdlàg functions and uniform metrics on C([0, 1]). We show that these estimators are, in an adaptive sense, rate optimal over certain classes of "approximation spaces." Special cases are the class of functions of bounded variation, (piecewise) Hölder continuous functions of order 0 < α ≤ 1, and the class of step functions with a finite but arbitrary number of jumps. In the latter setting, we also deduce the rates known from change-point analysis for detecting the jumps. Finally, the issue of fully automatic selection of the smoothing parameter is addressed.
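The estimator studied in the paper minimizes a Potts-type functional: the residual sum of squares plus a penalty gamma times the number of jumps of the piecewise constant fit. A minimal sketch of how such a minimizer can be computed exactly is the standard O(n^2) dynamic program over segment boundaries; the function name and implementation details below are illustrative, not the authors' code.

```python
import numpy as np

def potts_estimate(y, gamma):
    """Minimize sum_i (y_i - f_i)^2 + gamma * (#jumps of f) over
    piecewise constant f, via a dynamic program over segment ends."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    # Prefix sums give O(1) evaluation of the cost of fitting
    # y[i:j] by its mean (sum of squared deviations).
    s = np.concatenate(([0.0], np.cumsum(y)))
    s2 = np.concatenate(([0.0], np.cumsum(y * y)))

    def seg_cost(i, j):
        return s2[j] - s2[i] - (s[j] - s[i]) ** 2 / (j - i)

    # B[j] = optimal penalized cost for y[0:j]; the first segment
    # incurs no jump penalty, hence the -gamma offset at B[0].
    B = np.empty(n + 1)
    B[0] = -gamma
    last = np.zeros(n + 1, dtype=int)  # start index of the last segment
    for j in range(1, n + 1):
        best = np.inf
        for i in range(j):
            c = B[i] + gamma + seg_cost(i, j)
            if c < best:
                best, last[j] = c, i
        B[j] = best

    # Backtrack the optimal partition and fill in segment means.
    f = np.empty(n)
    j = n
    while j > 0:
        i = last[j]
        f[i:j] = (s[j] - s[i]) / (j - i)
        j = i
    return f
```

For a small gamma the estimator reproduces a clean step signal exactly, while a very large gamma forces a single segment (the global mean), which matches the role of gamma as the smoothing parameter whose automatic selection the paper addresses.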
Pages: 157-183
Page count: 27