Fast bootstrap methodology for regression model selection

Cited by: 12
Authors
Lendasse, A [1]
Simon, G
Wertz, V
Verleysen, M
Affiliations
[1] Aalto Univ, CIS, FI-02015 Helsinki, Finland
[2] Catholic Univ Louvain, Machine Learning Grp, DICE, B-1348 Louvain, Belgium
[3] Catholic Univ Louvain, Machine Learning Grp, CESAME, B-1348 Louvain, Belgium
[4] Univ Paris 01, SAMOS MATISSE, F-75634 Paris, France
Funding
Academy of Finland
Keywords
model selection; nonlinear modeling; bootstrap; resampling;
DOI
10.1016/j.neucom.2004.11.017
CLC classification
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Using resampling methods such as cross-validation and the bootstrap is a necessity in neural network design for solving the problem of model structure selection. The bootstrap is a powerful method offering a low variance of the model generalization error estimate. Unfortunately, its computational load may be excessive when used to select among neural network models of different structures or complexities. This paper presents the fast bootstrap (FB) methodology to select the best model structure; the methodology is applied here to regression tasks. The fast bootstrap assumes that the computationally expensive term estimated by the bootstrap, the optimism, is usually a smooth function (low-order polynomial) of the complexity parameter. Approximating the optimism term makes it possible to considerably reduce the necessary number of simulations. The FB methodology is illustrated on multi-layer perceptrons, radial-basis function networks and least-squares support vector machines. (c) 2004 Published by Elsevier B.V.
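The idea sketched in the abstract can be illustrated with a minimal toy experiment: estimate the bootstrap optimism for models of increasing complexity, then smooth those estimates with a low-order polynomial of the complexity parameter before selecting the model minimizing apparent error plus optimism. The data, model family (polynomial regression of varying degree), replicate counts, and the quadratic smoother below are all illustrative assumptions, not the paper's actual experimental setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data (hypothetical, not from the paper): noisy sine
x = rng.uniform(-1, 1, 80)
y = np.sin(3 * x) + 0.2 * rng.standard_normal(80)

def fit_mse(xtr, ytr, xte, yte, degree):
    """Fit a polynomial of the given degree; return MSE on the test pair."""
    coeffs = np.polyfit(xtr, ytr, degree)
    pred = np.polyval(coeffs, xte)
    return np.mean((pred - yte) ** 2)

degrees = np.arange(1, 10)   # complexity parameter
n_boot = 20                  # few replicates per complexity; FB relies on smoothing

apparent, optimism = [], []
for d in degrees:
    apparent.append(fit_mse(x, y, x, y, d))   # apparent (training) error
    opts = []
    for _ in range(n_boot):
        idx = rng.integers(0, len(x), len(x))
        xb, yb = x[idx], y[idx]
        # Optimism: error of the bootstrap-trained model on the original
        # sample minus its error on the bootstrap sample itself.
        e_orig = fit_mse(xb, yb, x, y, d)
        e_boot = fit_mse(xb, yb, xb, yb, d)
        opts.append(e_orig - e_boot)
    optimism.append(np.mean(opts))

# Fast-bootstrap step: approximate the optimism as a smooth low-order
# polynomial of the complexity parameter, denoising the raw estimates.
smooth = np.polyval(np.polyfit(degrees, optimism, 2), degrees)

gen_est = np.array(apparent) + smooth   # generalization error estimate
best = degrees[np.argmin(gen_est)]
print("selected degree:", best)
```

Because the smoothed optimism curve needs far fewer bootstrap replicates per complexity level than a direct estimate would, the overall simulation count drops substantially, which is the computational saving the abstract describes.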
Pages: 161-181
Page count: 21