FINITE SAMPLE PERFORMANCE OF LINEAR LEAST SQUARES ESTIMATORS UNDER SUB-GAUSSIAN MARTINGALE DIFFERENCE NOISE

Cited by: 0
Authors
Krikheli, Michael [1 ]
Leshem, Amir [1 ]
Affiliations
[1] Bar Ilan Univ, Fac Engn, IL-52900 Ramat Gan, Israel
Keywords
Estimation; linear least squares; non-Gaussian; concentration bounds; finite sample; large deviations; confidence bounds; martingale difference sequence; INTERPOLATION; MODELS;
DOI
Not available
Chinese Library Classification
O42 [Acoustics]
Subject Classification Codes
070206; 082403
Abstract
Linear least squares is a well-known parameter estimation technique that is often used even when sub-optimal, because of its low computational requirements and because exact knowledge of the noise statistics is not needed. Surprisingly, bounding the probability of large errors with finitely many samples has remained an open problem, especially for correlated noise with unknown covariance. In this paper we analyze the finite sample performance of the linear least squares estimator under sub-Gaussian martingale difference noise. To address this question we use concentration of measure bounds, which yield tight bounds on the tail of the estimator's distribution. We show fast exponential convergence in the number of samples required to ensure a given accuracy with high probability, and we provide probability tail bounds on the norm of the estimation error. Our analysis is simple and relies on L-infinity type bounds on the estimation error. The tightness of the bounds is tested through simulation. The proposed bounds make it possible to predict the number of samples required for least squares estimation even when least squares is sub-optimal and used only for its computational simplicity. The finite sample analysis of least squares under this general noise model is novel.
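The following is a minimal simulation sketch, not taken from the paper, of the setting the abstract describes: the conditionally heteroscedastic noise process, the error threshold 0.2, and the Monte Carlo sizes are illustrative assumptions rather than the authors' construction. It fits linear least squares to data corrupted by a sub-Gaussian martingale difference noise and empirically estimates the tail probability of the L-infinity estimation error as the number of samples grows.

import numpy as np

rng = np.random.default_rng(0)

def mds_noise(n, rng):
    # Illustrative sub-Gaussian martingale difference noise:
    # zero conditional mean, bounded, with a variance that depends on the past.
    e = np.zeros(n)
    prev = 0.0
    for t in range(n):
        sigma_t = np.sqrt(0.5 + 0.5 * min(prev**2, 1.0))  # depends on past samples
        z = rng.uniform(-1.0, 1.0)                         # zero-mean, bounded innovation
        e[t] = sigma_t * z
        prev = e[t]
    return e

def ls_error_norm(n, beta, rng):
    # One realization: return ||beta_hat - beta||_inf for the least squares fit.
    k = beta.size
    X = rng.standard_normal((n, k))          # known regressors
    y = X @ beta + mds_noise(n, rng)
    beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
    return np.max(np.abs(beta_hat - beta))

beta = np.array([1.0, -2.0, 0.5])
for n in (50, 200, 800):
    errs = np.array([ls_error_norm(n, beta, rng) for _ in range(2000)])
    # Empirical tail P(||error||_inf > 0.2); it shrinks rapidly as n grows,
    # consistent with the exponential-type tail bounds discussed in the abstract.
    print(n, np.mean(errs > 0.2))

Such an empirical tail can be compared against any analytical bound of interest; the hypothetical noise recursion above is only one of many processes satisfying the sub-Gaussian martingale difference assumptions.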
Pages: 4444-4448
Number of pages: 5