Minimization of error functionals over perceptron networks

Cited by: 12
Authors
Kurkova, Vera [1 ]
Affiliations
[1] Acad Sci Czech Republic, Inst Comp Sci, Prague 18207, Czech Republic
Keywords
DOI
10.1162/neco.2008.20.1.252
CLC Number (Chinese Library Classification)
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Supervised learning of perceptron networks is investigated as an optimization problem. It is shown that both the theoretical and the empirical error functionals achieve minima over sets of functions computable by networks with a given number n of perceptrons. Upper bounds on the rates of convergence of these minima with increasing n are derived. The bounds depend on a certain regularity of the training data, expressed in terms of variational norms of functions interpolating the data (in the case of the empirical error) and of the regression function (in the case of the expected error). The dependence of this type of regularity on dimensionality and on the magnitudes of partial derivatives is investigated. Conditions on the data are derived which guarantee that a good approximation of the global minima of the error functionals can be achieved using networks of limited complexity. The conditions are stated in terms of the oscillatory behavior of the data, measured by the product of a function of the number of variables d that decreases exponentially fast and the maximum of the squared L1-norms of the iterated partial derivatives of order d of the regression function, or of some function interpolating the data sample. The results are illustrated by examples of data with low and high regularity constructed using Boolean functions and the Gaussian function.
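The abstract's central object, the minimum of the empirical error over networks with n perceptrons, can be illustrated numerically. The sketch below is an illustration, not the paper's construction: it fixes one pool of random logistic perceptrons (random hidden weights, a 1-D tanh regression target, and the pool size are all hypothetical choices), lets the n-unit network use the first n units of the pool, and minimizes the mean-square empirical error over the output weights alone by least squares. Because the function classes are nested in n, the computed minima are nonincreasing in n, mirroring the convergence behavior the paper bounds.

```python
import numpy as np

rng = np.random.default_rng(0)

# Training data: samples of a smooth 1-D regression function (illustrative).
x = np.linspace(-1.0, 1.0, 200).reshape(-1, 1)
y = np.tanh(3.0 * x).ravel()

# One fixed pool of random logistic perceptrons; the n-unit network uses the
# first n of them, so the function classes are nested as n grows.
max_units = 16
w = rng.normal(scale=3.0, size=(1, max_units))   # hidden weights (random, fixed)
b = rng.normal(scale=1.0, size=max_units)        # hidden biases (random, fixed)
hidden = 1.0 / (1.0 + np.exp(-(x @ w + b)))      # hidden-unit outputs, shape (200, 16)

def empirical_error(n):
    """Minimal mean-square empirical error over the output weights of n units.

    Only the outer linear layer is optimized (the paper minimizes over all
    network parameters), so this upper-bounds the true minimum over n-unit
    perceptron networks.
    """
    h = hidden[:, :n]
    c, *_ = np.linalg.lstsq(h, y, rcond=None)    # optimal output weights
    return float(np.mean((h @ c - y) ** 2))

errors = [empirical_error(n) for n in (1, 2, 4, 8, 16)]
for n, e in zip((1, 2, 4, 8, 16), errors):
    print(f"n = {n:2d}  min empirical error = {e:.3e}")
```

Nesting the classes makes the monotonicity exact by construction: adding a column to a least-squares design can never increase the residual, so the sequence of minima is nonincreasing regardless of the random draw.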
Pages: 252 - 270
Number of pages: 19
Related Papers
50 records in total
  • [1] Minimization of empirical error over perceptron networks
    Kurková, V
    ADAPTIVE AND NATURAL COMPUTING ALGORITHMS, 2005: 46 - 49
  • [2] Minimization of error functionals over variable-basis functions
    Kainen, PC
    Kurková, V
    Sanguineti, M
    SIAM JOURNAL ON OPTIMIZATION, 2004, 14 (03): 732 - 742
  • [3] Minimization of convex functionals over frame operators
    Massey, Pedro
    Ruiz, Mariano
    ADVANCES IN COMPUTATIONAL MATHEMATICS, 2010, 32 (02): 131 - 153
  • [4] Classification error of multilayer perceptron neural networks
    Feng, Lihua
    Hong, Weihu
    NEURAL COMPUTING & APPLICATIONS, 2009, 18 (04): 377 - 380
  • [5] Designs in nonlinear regression by stochastic minimization of functionals of the mean square error matrix
    Gauchi, JP
    Pázman, A
    JOURNAL OF STATISTICAL PLANNING AND INFERENCE, 2006, 136 (03): 1135 - 1152
  • [6] Localized generalization error model for Multilayer Perceptron Neural Networks
    Yang, Fei
    Ng, Wing W. Y.
    Tsang, Eric C. C.
    Zeng, Xiao-Qin
    Yeung, Daniel S.
    PROCEEDINGS OF 2008 INTERNATIONAL CONFERENCE ON MACHINE LEARNING AND CYBERNETICS, VOLS 1-7, 2008: 794 - +
  • [7] On the minimization of symmetrical functionals
    Carlen, EA
    Loss, M
    REVIEWS IN MATHEMATICAL PHYSICS, 1994, 6 (5A): 1011 - 1032
  • [8] On the minimization of convex functionals
    Reich, S
    Zaslavski, AJ
    CALCULUS OF VARIATIONS AND DIFFERENTIAL EQUATIONS, 2000, 410: 200 - 209