Structural synthesis of fast two-layer neural networks

Cited: 1
Authors
Dorogov, AY [1]
Affiliation
[1] St Petersburg State Electrotech Univ, St Petersburg, Russia
Keywords
neural networks; two-layer neural networks; fast neural networks (FNNs); dense neural networks; one-rank networks; fast two-layer neural networks; number of degrees of freedom of neural networks; plasticity (trainability) of neural networks; structural synthesis of fast two-layer neural networks;
DOI
10.1007/BF02667059
Chinese Library Classification (CLC)
TP3 [computing technology; computer technology]
Subject Classification Code
0812
Abstract
Methods for constructing structural models of fast two-layer neural networks are considered. The methods are based on the criteria of a minimum number of computing operations and a maximum number of degrees of freedom. Optimal structural models of two-layer neural networks are constructed, and illustrative examples are given.
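For orientation only, the sketch below contrasts the two criteria named in the abstract: operation count versus degrees of freedom (trainable weights). The block-structured layer layout, the function names, and the example sizes are illustrative assumptions, not the paper's synthesis procedure; they merely show how replacing a dense two-layer mapping with sparse m-by-m weight blocks reduces multiply-accumulate operations while also reducing the number of free parameters, which is the trade-off the optimal structures balance.

```python
def dense_two_layer_cost(n_in, n_hidden, n_out):
    """Multiply-accumulate count and trainable weights for a fully
    connected (dense) two-layer network; biases are omitted."""
    macs = n_in * n_hidden + n_hidden * n_out
    params = macs          # one trainable weight per connection
    return macs, params


def fast_two_layer_cost(n, m):
    """Cost of a block-structured ('fast') two-layer network on n inputs.

    Illustrative assumption: each layer consists of n // m independent
    m-by-m weight blocks, an FFT-like sparse factorisation; this is a
    sketch of the idea only, not the synthesis procedure of the paper."""
    assert n % m == 0, "block size must divide the number of inputs"
    macs_per_layer = (n // m) * m * m      # = n * m operations per layer
    macs = 2 * macs_per_layer              # two layers
    params = macs                          # every block entry is trainable
    return macs, params


if __name__ == "__main__":
    n, m = 64, 8                           # n = m**2 inputs, block size m
    print("dense two-layer:", dense_two_layer_cost(n, n, n))   # (8192, 8192)
    print("fast two-layer :", fast_two_layer_cost(n, m))       # (1024, 1024)
```

With n = 64 and m = 8, the dense network requires 8192 multiply-accumulates and the same number of weights, while the block-structured network requires 1024 of each, which is why minimizing operations and maximizing trainable degrees of freedom pull in opposite directions.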
Pages: 512-519
Page count: 8