Bounds for the computational power and learning complexity of analog neural nets

Cited by: 16
Authors
Maass, W
Affiliation
[1] Inst. for Theor. Computer Science, Technische Universität Graz, A-8010 Graz
Keywords
neural networks; analog computing; threshold circuits; circuit complexity; learning complexity
DOI
10.1137/S0097539793256041
Chinese Library Classification
TP301 [Theory and Methods]
Subject Classification Code
081202
Abstract
It is shown that high-order feedforward neural nets of constant depth with piecewise-polynomial activation functions and arbitrary real weights can be simulated for Boolean inputs and outputs by neural nets of a somewhat larger size and depth with Heaviside gates and weights from {-1, 0, 1}. This provides the first known upper bound for the computational power of the former type of neural nets. It is also shown that in the case of first-order nets with piecewise-linear activation functions one can replace arbitrary real weights by rational numbers with polynomially many bits without changing the Boolean function that is computed by the neural net. In order to prove these results, we introduce two new methods for reducing nonlinear problems about weights in multilayer neural nets to linear problems for a transformed set of parameters. These transformed parameters can be interpreted as weights in a somewhat larger neural net. As another application of our new proof technique we show that neural nets with piecewise-polynomial activation functions and a constant number of analog inputs are probably approximately correct (PAC) learnable (in Valiant's model for PAC learning [Comm. Assoc. Comput. Mach., 27 (1984), pp. 1134-1142]).
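As a minimal illustration of the flavor of the simulation result (my own sketch, not the paper's construction): for Boolean inputs, a single gate with the saturated-linear activation sigma(t) = min(1, max(0, t)), whose output is read as a Boolean value by thresholding at 1/2, computes the same Boolean function as a single Heaviside gate with the bias shifted by 1/2. The weights, bias, and helper names below are assumed purely for illustration.

```python
from itertools import product

def sat_linear(t):
    # Saturated-linear (piecewise-linear) activation: clamps t to [0, 1].
    return min(1.0, max(0.0, t))

def heaviside(t):
    # Heaviside gate: outputs 1 iff its argument is >= 0.
    return 1 if t >= 0 else 0

# Illustrative gate with arbitrary real weights and bias (assumed values).
w, b = [0.7, -1.3, 0.4], 0.25

for x in product([0, 1], repeat=3):
    s = sum(wi * xi for wi, xi in zip(w, x)) + b
    # Boolean output of the analog gate: threshold its activation at 1/2.
    analog_bit = 1 if sat_linear(s) >= 0.5 else 0
    # Since sat_linear is monotone with sat_linear(1/2) = 1/2, this equals
    # a single Heaviside gate whose bias is shifted by 1/2.
    threshold_bit = heaviside(s - 0.5)
    assert analog_bit == threshold_bit
print("analog gate and Heaviside gate agree on all Boolean inputs")
```

The theorem itself is far stronger than this one-gate check: it simulates constant-depth, high-order nets with arbitrary piecewise-polynomial activations and arbitrary real weights, at the cost of a somewhat larger size and depth and with weights restricted to {-1, 0, 1}.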
Pages: 708-732 (25 pages)