A simple method for selection of inputs and structure of feedforward neural networks

Cited by: 1
Authors
Saxén, H [1]
Pettersson, F [1]
Affiliation
[1] Abo Akad Univ, Heat Engn Lab, Turku, Finland
Keywords
DOI
10.1007/3-211-27389-1_3
Chinese Library Classification
TP18 [theory of artificial intelligence];
Discipline classification codes
081104; 0812; 0835; 1405;
Abstract
When feedforward neural networks of multi-layer perceptron (MLP) type are used as black-box models of complex processes, a common problem is how to select relevant inputs from a large set of potential variables that affect the outputs to be modeled. If, furthermore, the observations of the input-output tuples are scarce, the degrees of freedom may not allow for the use of a fully connected layer between the inputs and the hidden nodes. This paper presents a systematic method for selection of both input variables and a constrained connectivity of the lower-layer weights in MLPs. The method, which can also be used as a means to provide initial guesses for the weights prior to the final training phase of the MLPs, is illustrated on a class of test problems.
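The abstract only outlines the approach at a high level; the exact algorithm is given in the paper (and in its journal version, related paper [1] below). As a rough, hypothetical illustration of the kind of procedure described, i.e. growing a sparsely connected lower layer one input-to-hidden connection at a time and retaining the resulting weights as initial guesses, the Python sketch below implements a generic greedy variant. The function names, the least-squares output layer, and the random-restart scoring are assumptions made for illustration, not the authors' method.

```python
import numpy as np

# Hypothetical sketch of a greedy connection-growing scheme for a single-hidden-layer
# network; the names and details below are illustrative assumptions, not the paper's
# exact algorithm.

def fit_output_layer(H, y):
    """Least-squares fit of the linear output layer, given hidden activations H."""
    A = np.c_[H, np.ones(len(H))]                     # append bias column
    w, *_ = np.linalg.lstsq(A, y, rcond=None)
    return w, A @ w

def score_mask(X, y, mask, rng, n_restarts=5):
    """Score a lower-layer connectivity mask: draw random weights on the active
    connections, fit the output layer, and return the best SSE and weights found."""
    best = (np.inf, None, None)
    for _ in range(n_restarts):
        W_low = rng.normal(size=mask.shape) * mask    # zero out inactive connections
        H = np.tanh(X @ W_low.T)
        w_up, pred = fit_output_layer(H, y)
        sse = float(np.sum((y - pred) ** 2))
        if sse < best[0]:
            best = (sse, W_low, w_up)
    return best

def grow_connectivity(X, y, n_hidden=3, max_connections=6, seed=0):
    """Add one input-to-hidden connection at a time, keeping the candidate that
    most reduces the error; the winning weights can seed the final MLP training."""
    rng = np.random.default_rng(seed)
    mask = np.zeros((n_hidden, X.shape[1]))
    weights = None
    for _ in range(max_connections):
        candidates = [(h, i) for h in range(n_hidden)
                      for i in range(X.shape[1]) if mask[h, i] == 0]
        results = []
        for h, i in candidates:
            trial = mask.copy()
            trial[h, i] = 1.0
            results.append(score_mask(X, y, trial, rng))
        k = int(np.argmin([r[0] for r in results]))
        h, i = candidates[k]
        mask[h, i] = 1.0                              # commit the best new connection
        weights = results[k][1:]                      # (lower-layer, output-layer) weights
    return mask, weights
```

In practice a validation set, rather than the training error used here, would guide both the connection growth and the stopping point, and the retained mask and weights would then seed the final training of the full MLP.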
Pages: 9-12
Number of pages: 4
Related papers
50 records in total
  • [1] Method for the selection of inputs and structure of feedforward neural networks
    Saxen, H.
    Pettersson, F.
    [J]. COMPUTERS & CHEMICAL ENGINEERING, 2006, 30 (6-7) : 1038 - 1045
  • [2] A simple method to estimate the size of feedforward neural networks
    Li, YJ
    [J]. 2003 IEEE INTERNATIONAL CONFERENCE ON SYSTEMS, MAN AND CYBERNETICS, VOLS 1-5, CONFERENCE PROCEEDINGS, 2003, : 1322 - 1326
  • [3] Convergence of an online gradient method for feedforward neural networks with stochastic inputs
    Li, ZX
    Wu, W
    Tian, YL
    [J]. JOURNAL OF COMPUTATIONAL AND APPLIED MATHEMATICS, 2004, 163 (01) : 165 - 176
  • [4] On learning feedforward neural networks with noise injection into inputs
    Seghouane, AK
    Moudden, Y
    Fleury, G
    [J]. NEURAL NETWORKS FOR SIGNAL PROCESSING XII, PROCEEDINGS, 2002, : 149 - 158
  • [5] Convergence of online gradient method with a penalty term for feedforward neural networks with stochastic inputs
    Shao, HM
    Wu, W
    Li, F
    [J]. Numerical Mathematics (Theory, Methods and Applications), 2005, (01) : 87 - 96
  • [6] Generalization and selection of examples in feedforward neural networks
    Franco, L
    Cannas, SA
    [J]. NEURAL COMPUTATION, 2000, 12 (10) : 2405 - 2426
  • [7] Model structure selection for nonlinear system identification using feedforward neural networks
    Petrovic, I
    Baotic, M
    Peric, N
    [J]. IJCNN 2000: PROCEEDINGS OF THE IEEE-INNS-ENNS INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, VOL I, 2000, : 53 - 57
  • [8] Using mutual information index for inputs selection in feedforward neural network
    Fei, Yunjie
    Deng, Wei
    Su, Meijuan
    [J]. DYNAMICS OF CONTINUOUS DISCRETE AND IMPULSIVE SYSTEMS-SERIES B-APPLICATIONS & ALGORITHMS, 2007, 14 : 207 - 209
  • [9] A cost function for learning feedforward neural networks subject to noisy inputs
    Seghouane, AK
    Fleury, G
    [J]. ISSPA 2001: SIXTH INTERNATIONAL SYMPOSIUM ON SIGNAL PROCESSING AND ITS APPLICATIONS, VOLS 1 AND 2, PROCEEDINGS, 2001, : 386 - 389
  • [10] Bayesian selection of important features for feedforward neural networks
    Priddy, KL
    Rogers, SK
    Ruck, DW
    Tarr, GL
    Kabrisky, M
    [J]. NEUROCOMPUTING, 1993, 5 (2-3) : 91 - 103