Bounds for the computational power and learning complexity of analog neural nets

Cited by: 16
Author
Maass, W
Affiliation
[1] Institute for Theoretical Computer Science, Technische Universität Graz, A-8010 Graz, Austria
关键词
neural networks; analog computing; threshold circuits; circuit complexity; learning complexity
DOI
10.1137/S0097539793256041
CLC number
TP301 [Theory and Methods]
Subject classification code
081202
Abstract
It is shown that high-order feedforward neural nets of constant depth with piecewise-polynomial activation functions and arbitrary real weights can be simulated for Boolean inputs and outputs by neural nets of a somewhat larger size and depth with Heaviside gates and weights from {-1, 0, 1}. This provides the first known upper bound for the computational power of the former type of neural nets. It is also shown that in the case of first-order nets with piecewise-linear activation functions one can replace arbitrary real weights by rational numbers with polynomially many bits without changing the Boolean function that is computed by the neural net. In order to prove these results, we introduce two new methods for reducing nonlinear problems about weights in multilayer neural nets to linear problems for a transformed set of parameters. These transformed parameters can be interpreted as weights in a somewhat larger neural net. As another application of our new proof technique we show that neural nets with piecewise-polynomial activation functions and a constant number of analog inputs are probably approximately correct (PAC) learnable (in Valiant's model for PAC learning [Comm. Assoc. Comput. Mach., 27 (1984), pp. 1134-1142]).
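The first result can be made concrete at toy scale. The Python sketch below is illustrative only: the XOR example and the weights are chosen here for exposition and are not the paper's construction. It shows a first-order net with a piecewise-linear (saturated) activation and real weights computing XOR on Boolean inputs, alongside a net of Heaviside gates using only weights from {-1, 0, 1} that computes the same Boolean function, as the simulation result guarantees is always possible at the cost of somewhat larger size and depth.

```python
# Illustrative sketch only: assumed weights and example, not the paper's
# general construction.
from itertools import product

def sigma(z):
    """Saturated-linear (piecewise-linear) activation: clamps z to [0, 1]."""
    return min(1.0, max(0.0, z))

def analog_xor(x1, x2):
    """First-order net with arbitrarily chosen real weights; the Boolean
    output is read off by thresholding the analog output at 1/2."""
    h1 = sigma(0.8 * x1 + 0.8 * x2)          # roughly an OR detector
    h2 = sigma(0.8 * x1 + 0.8 * x2 - 0.9)    # fires only when both inputs are 1
    return 1 if sigma(1.25 * h1 - 1.5 * h2) >= 0.5 else 0

def H(z):
    """Heaviside gate; the convention H(z) = 1 iff z > 0 lets every bias be 0."""
    return 1 if z > 0 else 0

def threshold_xor(x1, x2):
    """Depth-2 Heaviside net with all weights in {-1, 0, 1}."""
    g1 = H(x1 - x2)        # 1 exactly when x1 = 1, x2 = 0
    g2 = H(x2 - x1)        # 1 exactly when x2 = 1, x1 = 0
    return H(g1 + g2)      # OR of the two hidden gates

# The two nets agree with XOR on all four Boolean inputs.
for x1, x2 in product([0, 1], repeat=2):
    assert analog_xor(x1, x2) == threshold_xor(x1, x2) == (x1 ^ x2)
```

Choosing the strict convention H(z) = 1 iff z > 0 lets every gate in this sketch get by with a zero bias; more generally, a small integer bias can be realized by weights from {-1, 0, 1} on duplicated constant inputs.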
Pages: 708-732
Page count: 25
Related papers
50 records in total (first 10 shown)
  • [1] On the computational power of neural nets
    Siegelmann, HT
    Sontag, ED
    JOURNAL OF COMPUTER AND SYSTEM SCIENCES, 1995, 50 (01) : 132 - 150
  • [2] On the computational complexity of binary and analog symmetric Hopfield nets
    Síma, J
    Orponen, P
    Antti-Poika, T
    NEURAL COMPUTATION, 2000, 12 (12) : 2965 - 2989
  • [3] Complexity results on learning by neural nets
    Lin, JH
    Vitter, JS
    MACHINE LEARNING, 1991, 6 (03) : 211 - 230
  • [4] Agnostic PAC learning of functions on analog neural nets
    Maass, W
    NEURAL COMPUTATION, 1995, 7 (05) : 1054 - 1078
  • [5] On the computational complexity of learning bithreshold neural units and networks
    Kotsovsky, Vladyslav
    Geche, Fedir
    Batyuk, Anatoliy
    LECTURE NOTES IN COMPUTATIONAL INTELLIGENCE AND DECISION MAKING, 2020, 1020 : 189 - 202
  • [6] Computational complexity of learning neural networks: Smoothness and degeneracy
    Daniely, Amit
    Srebro, Nathan
    Vardi, Gal
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023
  • [7] Computational power of neural networks: A characterization in terms of Kolmogorov complexity
    Balcázar, JL
    Gavaldà, R
    Siegelmann, HT
    IEEE TRANSACTIONS ON INFORMATION THEORY, 1997, 43 (04) : 1175 - 1183
  • [8] Computational power of neuroidal nets
    Wiedermann, J
    SOFSEM'99: THEORY AND PRACTICE OF INFORMATICS, 1999, 1725 : 479 - 487
  • [9] Analog computational power
    Kain, RY
    SCIENCE, 1996, 271 (5245) : 92 - 92
  • [10] Stochastic analog networks and computational complexity
    Siegelmann, HT
    JOURNAL OF COMPLEXITY, 1999, 15 (04) : 451 - 475