Bounds for the computational power and learning complexity of analog neural nets

Cited by: 16
Authors
Maass, W
Institution
[1] Inst. for Theor. Computer Science, Technische Universität Graz, A-8010 Graz
Keywords
neural networks; analog computing; threshold circuits; circuit complexity; learning complexity
DOI
10.1137/S0097539793256041
Chinese Library Classification
TP301 [Theory, Methods]
Discipline Code
081202
Abstract
It is shown that high-order feedforward neural nets of constant depth with piecewise-polynomial activation functions and arbitrary real weights can be simulated for Boolean inputs and outputs by neural nets of a somewhat larger size and depth with Heaviside gates and weights from {-1, 0, 1}. This provides the first known upper bound for the computational power of the former type of neural nets. It is also shown that in the case of first-order nets with piecewise-linear activation functions one can replace arbitrary real weights by rational numbers with polynomially many bits without changing the Boolean function that is computed by the neural net. In order to prove these results, we introduce two new methods for reducing nonlinear problems about weights in multilayer neural nets to linear problems for a transformed set of parameters. These transformed parameters can be interpreted as weights in a somewhat larger neural net. As another application of our new proof technique we show that neural nets with piecewise-polynomial activation functions and a constant number of analog inputs are probably approximately correct (PAC) learnable (in Valiant's model for PAC learning [Comm. Assoc. Comput. Mach., 27 (1984), pp. 1134-1142]).
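The two gate types the abstract contrasts can be made concrete with a small sketch (illustrative only, not a construction from the paper; all function names are ours): a depth-2 net of Heaviside gates with weights from {-1, 0, 1} and small integer thresholds (which can themselves be folded into {-1, 0, 1} weights by duplicating constant inputs) computes XOR, and a first-order piecewise-linear (ReLU) net with rational weights computes the same Boolean function.

```python
def heaviside(x):
    """Heaviside gate: outputs 1 if the weighted sum is >= 0, else 0."""
    return 1 if x >= 0 else 0

def xor_threshold(a, b):
    """Depth-2 threshold circuit for XOR; all input weights are in {-1, 0, 1}."""
    g_or = heaviside(a + b - 1)          # OR(a, b)
    g_nand = heaviside(-a - b + 1)       # NAND(a, b)
    return heaviside(g_or + g_nand - 2)  # AND of the two gates

def relu(x):
    """Piecewise-linear activation, as in first-order nets with rational weights."""
    return max(0.0, x)

def xor_relu(a, b):
    """XOR as a small piecewise-linear net: ReLU(a+b) - 2*ReLU(a+b-1)."""
    return round(relu(a + b) - 2 * relu(a + b - 1))

# Both nets compute the same Boolean function on {0, 1}^2.
for a in (0, 1):
    for b in (0, 1):
        assert xor_threshold(a, b) == xor_relu(a, b) == (a ^ b)
```

The simulation results in the paper go in the harder direction: from piecewise-polynomial nets with arbitrary real weights down to Heaviside nets with weights from {-1, 0, 1}, at the cost of a somewhat larger size and depth.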
Pages: 708 - 732
Page count: 25
Related Papers (50 total)
  • [21] THE COMPUTATIONAL-COMPLEXITY OF QUERYING BOUNDS ON DIFFERENCES CONSTRAINTS
    BRUSONI, V
    CONSOLE, L
    TERENZIANI, P
    ARTIFICIAL INTELLIGENCE, 1995, 74 (02) : 367 - 379
  • [22] Strong computational lower bounds via parameterized complexity
    Chen, Jianer
    Huang, Xiuzhen
    Kanj, Iyad A.
    Xia, Ge
    JOURNAL OF COMPUTER AND SYSTEM SCIENCES, 2006, 72 (08) : 1346 - 1367
  • [23] Two lower bounds on computational complexity of infinite words
    Hromkovic, J
    Karhumaki, J
    NEW TRENDS IN FORMAL LANGUAGES: CONTROL, COOPERATION, AND COMBINATORICS, 1997, 1218 : 366 - 376
  • [24] REMARKS ON THE FREQUENCY-CODED NEURAL NETS COMPLEXITY
    SKODNY, P
    LECTURE NOTES IN COMPUTER SCIENCE, 1990, 464 : 244 - 250
  • [25] The neural dynamics associated with computational complexity
    Franco, Juan Pablo
    Bossaerts, Peter
    Murawski, Carsten
    PLOS COMPUTATIONAL BIOLOGY, 2024, 20 (09)
  • [26] Integrability, neural and quantum computational complexity
    Krishnamurthy, EV
    Krishnamurthy, V
    WORLD MULTICONFERENCE ON SYSTEMICS, CYBERNETICS AND INFORMATICS, VOL 1, PROCEEDINGS: ISAS '98, 1998, : 547 - 554
  • [27] On the computational power of Timed Differentiable Petri Nets
    Haddad, Serge
    Recalde, Laura
    Silva, Manuel
    FORMAL MODELING AND ANALYSIS OF TIMED SYSTEMS, 2006, 4202 : 230 - 244
  • [28] Neural nets are now for power generation
[Anonymous]
    CHEMICAL ENGINEERING, 1996, 103 (12) : 127 - 127
  • [29] Minimax Lower Bounds for Ridge Combinations Including Neural Nets
    Klusowski, Jason M.
    Barron, Andrew R.
    2017 IEEE INTERNATIONAL SYMPOSIUM ON INFORMATION THEORY (ISIT), 2017, : 1376 - 1380
  • [30] Latent learning in Deep Neural Nets
    Gutstein, Steven
Fuentes, Olac
    Freudenthal, Eric
    2010 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS IJCNN 2010, 2010,