Neural Hilbert Ladders: Multi-Layer Neural Networks in Function Space

Cited by: 0
Authors
Chen, Zhengdao [1 ]
Affiliations
[1] Google Research, Mountain View, CA 94043, USA
Keywords
function space of neural networks; feature learning; approximation bound; Rademacher complexity; depth separation; mean-field limit
DOI
Not available
CLC classification
TP [Automation technology, computer technology]
Discipline classification code
0812
Abstract
Characterizing the function space explored by neural networks (NNs) is an important aspect of learning theory. In this work, noting that a multi-layer NN implicitly generates a hierarchy of reproducing kernel Hilbert spaces (RKHSs), termed a neural Hilbert ladder (NHL), we define the function space as an infinite union of RKHSs, which generalizes the existing Barron space theory of two-layer NNs. We then establish several theoretical properties of the new space. First, we prove a correspondence between functions expressed by L-layer NNs and those belonging to L-level NHLs. Second, we prove generalization guarantees for learning an NHL with a controlled complexity measure. Third, we derive a non-Markovian dynamics of random fields that governs the evolution of the NHL induced by the training of multi-layer NNs in an infinite-width mean-field limit. Fourth, we show examples of depth separation in NHLs under the ReLU activation function. Finally, we perform numerical experiments to illustrate the feature-learning aspect of NN training through the lens of NHLs.
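The ladder idea has a concrete finite-width analogue: each hidden layer of an NN defines an empirical kernel through its features, and stacking layers yields a hierarchy of such kernels. The NumPy sketch below is a minimal illustration under assumed choices (a plain ReLU MLP, Gaussian weights, and a 1/width kernel normalization; these are illustrative assumptions, not the paper's exact construction): it computes the per-layer feature Gram matrices that play the role of the successive RKHS levels.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(z):
    return np.maximum(z, 0.0)

def hidden_features(X, weights):
    # Forward pass; collect the post-activation features of every hidden layer.
    feats, h = [], X
    for W in weights:
        h = relu(h @ W)
        feats.append(h)
    return feats

def layer_kernels(feats):
    # Empirical level-l kernel on the batch: K_l(x, x') = (1/m_l) * sum_i h_i(x) h_i(x'),
    # i.e. the width-normalized Gram matrix of the layer-l features.
    return [H @ H.T / H.shape[1] for H in feats]

# Toy batch of n points in d dimensions; three hidden layers of width m.
n, d, m = 8, 5, 512
X = rng.normal(size=(n, d))
weights = [rng.normal(size=(d, m)) / np.sqrt(d)] + \
          [rng.normal(size=(m, m)) / np.sqrt(m) for _ in range(2)]

for l, K in enumerate(layer_kernels(hidden_features(X, weights)), start=1):
    print(f"level {l}: K shape {K.shape}, trace {np.trace(K):.3f}")
```

In the infinite-width mean-field regime described in the abstract, such finite sums over hidden units become expectations over random features, and the RKHSs of the resulting kernels form the successive levels of the ladder.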
Pages: 65
Related papers (10 of 50 shown)
  • [1] Chen, Zhengdao. Multi-Layer Neural Networks as Trainable Ladders of Hilbert Spaces. International Conference on Machine Learning (ICML), Vol. 202, 2023.
  • [2] Takase, H.; Shinogi, T.; Hayashi, T.; Kita, H. Evaluation function for fault tolerant multi-layer neural networks. IJCNN 2000: Proceedings of the IEEE-INNS-ENNS International Joint Conference on Neural Networks, Vol. III, 2000: 521-526.
  • [3] Ban, Jung-Chao; Chang, Chih-Hung. The layer effect on multi-layer cellular neural networks. Applied Mathematics Letters, 2013, 26(7): 706-709.
  • [4] Ban, Jung-Chao; Chang, Chih-Hung; Lin, Song-Sun. On the structure of multi-layer cellular neural networks. Journal of Differential Equations, 2012, 252(8): 4563-4597.
  • [5] Ban, Jung-Chao; Chang, Chih-Hung. The learning problem of multi-layer neural networks. Neural Networks, 2013, 46: 116-123.
  • [6] Ban, Jung-Chao; Chang, Chih-Hung. Diamond in multi-layer cellular neural networks. Applied Mathematics and Computation, 2013, 222: 1-12.
  • [7] Makris, Vassilis; Kalles, Dimitris. Evolving Multi-Layer Neural Networks for Othello. 9th Hellenic Conference on Artificial Intelligence (SETN 2016), 2016.
  • [8] Scherer, Magdalena. Multi-Layer Neural Networks for Sales Forecasting. Journal of Applied Mathematics and Computational Mechanics, 2018, 17(1): 61-68.
  • [9] Aoyama, T.; Wang, Q. Y.; Nagashima, U.; Yoshihara, I. Reinforcement of extrapolation of multi-layer neural networks. IJCNN'01: International Joint Conference on Neural Networks, Vols. 1-4, 2001: 2495-2500.
  • [10] Grailoo, Mahdieh; Nikoubin, Tooraj; Gustafsson, Oscar; Nunez-Yanez, Jose. Activation Function Integration for Accelerating Multi-Layer Graph Convolutional Neural Networks. 17th IEEE Dallas Circuits and Systems Conference (DCAS 2024), 2024.