The complexity of model classes, and smoothing noisy data

Cited by: 6
Authors
Bartlett, PL [1 ]
Kulkarni, SR [2 ]
Affiliations
[1] Australian Natl Univ, Dept Syst Engn, Res Sch Informat Sci & Engn, Canberra, ACT 0200, Australia
[2] Princeton Univ, Dept Elect Engn, Princeton, NJ 08544 USA
Keywords
system identification; computational learning theory; smoothing; covering numbers;
DOI
10.1016/S0167-6911(98)00008-5
CLC classification
TP [Automation technology; computer technology]
Discipline code
0812
Abstract
We consider the problem of smoothing a sequence of noisy observations using a fixed class of models. Via a deterministic analysis, we obtain necessary and sufficient conditions on the noise sequence and model class that ensure that a class of natural estimators gives near-optimal smoothing. In the case of i.i.d. random noise, we show that the accuracy of these estimators depends on a measure of complexity of the model class involving covering numbers. Our formulation and results are quite general and are related to a number of problems in learning, prediction, and estimation. As a special case, we consider an application to output smoothing for certain classes of linear and nonlinear systems. The performance of output smoothing is given in terms of natural complexity parameters of the model class, such as bounds on the order of linear systems, the ℓ1-norm of the impulse response of stable linear systems, or the memory of a Lipschitz nonlinear system satisfying a fading memory condition. (C) 1998 Elsevier Science B.V. All rights reserved.
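The "natural estimators" the abstract refers to select, from the model class, the model closest to the noisy observations. A minimal sketch of that idea for a toy finite class follows; the specific models, the noise level, and the use of squared error are illustrative assumptions, not the paper's setting (the paper treats e.g. bounded-order linear systems and covering-number bounds, which this toy does not capture).

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative finite model class: a few quadratic "signal shapes" on a grid.
# (The paper's classes are richer, e.g. stable linear systems; this stand-in
# only shows the estimator's selection rule.)
t = np.linspace(0.0, 1.0, 100)
coeffs = [(0.0, 0.0, 0.5), (0.0, 1.0, 0.0), (1.0, -1.0, 0.5), (-2.0, 2.0, 0.0)]
model_class = [np.polyval(c, t) for c in coeffs]

# True signal (here assumed to lie in the class) plus i.i.d. Gaussian noise.
true_signal = np.polyval((1.0, -1.0, 0.5), t)
y = true_signal + 0.1 * rng.standard_normal(t.size)

# Natural estimator: pick the model minimizing empirical squared distance
# to the observed sequence.
errors = [np.mean((m - y) ** 2) for m in model_class]
best = model_class[int(np.argmin(errors))]

# Smoothing error of the selected model against the clean signal.
print(np.mean((best - true_signal) ** 2))
```

With a small class and moderate noise, the empirical-error minimizer recovers a near-optimal smoother; the paper's results quantify how this degrades as the class's covering numbers grow.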
Pages: 133-140
Page count: 8