Neural-network construction and selection in nonlinear modeling

Cited by: 70
Authors
Rivals, I [1]
Personnaz, L [1]
Affiliations
[1] Ecole Super Phys & Chim Ind Ville Paris, Equipe Stat Appl, F-75231 Paris 05, France
Source
IEEE TRANSACTIONS ON NEURAL NETWORKS | 2003, Vol. 14, No. 4
Keywords
growing and pruning procedures; ill-conditioning detection; input selection; least squares (LS) estimation; leave-one-out cross validation; linear Taylor expansion; model selection; neural networks; nonlinear regression; statistical hypothesis tests; ALGORITHMS; VALIDATION;
DOI
10.1109/TNN.2003.811356
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
In this paper, we study how statistical tools that are commonly used independently can advantageously be exploited together to improve neural network estimation and selection in nonlinear static modeling. The tools we consider are the analysis of the numerical conditioning of the neural-network candidates, statistical hypothesis tests, and cross validation. We present and analyze each of these tools to justify at what stage of a construction and selection procedure it is most useful. On the basis of this analysis, we then propose a novel and systematic construction and selection procedure for neural modeling. We finally illustrate its efficiency through large-scale simulation experiments and real-world modeling problems.
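The keywords point to ill-conditioning detection and leave-one-out cross validation based on a linear Taylor expansion of the model. The Python sketch below illustrates these two kinds of tools in generic form; it assumes the Jacobian of a trained candidate's output with respect to its parameters is available, and the function names and use of numpy are illustrative rather than taken from the paper.

```python
import numpy as np

def condition_number(jacobian):
    """Ratio of the largest to the smallest singular value of the Jacobian.

    A very large ratio signals an ill-conditioned (e.g., over-parameterized)
    candidate network that should be discarded or pruned.
    """
    s = np.linalg.svd(jacobian, compute_uv=False)
    return s[0] / s[-1]

def approximate_loo_errors(jacobian, residuals):
    """Leave-one-out residuals approximated through the model's linear Taylor expansion.

    For the linearized model, e_i^(-i) ~= e_i / (1 - h_ii), where h_ii are the
    diagonal entries (leverages) of the hat matrix built from the Jacobian.
    """
    Z = jacobian
    H = Z @ np.linalg.pinv(Z.T @ Z) @ Z.T          # hat matrix of the linearized model
    leverages = np.clip(np.diag(H), 0.0, 1.0 - 1e-12)
    return residuals / (1.0 - leverages)

# Example use as a selection score (Z and e would come from a trained candidate):
# loo_mse = np.mean(approximate_loo_errors(Z, e) ** 2)
```

Scoring each well-conditioned candidate by such an approximate leave-one-out error, rather than retraining it N times, is what makes cross validation practical inside a growing/pruning loop; this is a sketch of the idea, not the authors' exact procedure.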
Pages: 804-819
Number of pages: 16