Approximation of Lipschitz Functions Using Deep Spline Neural Networks

Cited by: 7
Authors
Neumayer, Sebastian [1 ]
Goujon, Alexis [1 ]
Bohra, Pakshal [1 ]
Unser, Michael [1 ]
Affiliations
[1] École Polytechnique Fédérale de Lausanne (EPFL), Biomedical Imaging Group, CH-1015 Lausanne, Switzerland
Funding
Swiss National Science Foundation; European Research Council
Keywords
deep learning; learnable activations; universality; robustness; Lipschitz continuity; linear splines
DOI
10.1137/22M1504573
CLC Classification
O29 [Applied Mathematics]
Subject Classification Code
070104
Abstract
Although Lipschitz-constrained neural networks have many applications in machine learning, the design and training of expressive Lipschitz-constrained networks is very challenging. Since the popular rectified linear unit (ReLU) networks have provable disadvantages in this setting, we propose using learnable spline activation functions with at least three linear regions instead. We prove that our choice is universal among all componentwise 1-Lipschitz activation functions, in the sense that no other weight-constrained architecture can approximate a larger class of functions. Additionally, our choice is at least as expressive as the recently introduced non-componentwise GroupSort activation function for spectral-norm-constrained weights. The theoretical findings of this paper are consistent with previously published numerical results.
Pages: 306 - 322 (17 pages)
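
To make the construction described in the abstract concrete, the sketch below shows one possible way to build a 1-Lipschitz block from a spectral-norm-constrained linear layer followed by a learnable, componentwise linear-spline activation with three linear regions whose slopes are clamped to [-1, 1]. This is an illustrative example under stated assumptions, not the authors' implementation; the class and parameter names (LinearSpline3, knot1, knot2, slopes, offset) are invented for this sketch.

```python
# Minimal, illustrative PyTorch sketch (NOT the paper's code): a 1-Lipschitz block
# combining a spectral-norm-constrained weight with a learnable three-region
# linear-spline activation. All names here are assumptions for this example.
import torch
import torch.nn as nn
from torch.nn.utils.parametrizations import spectral_norm


class LinearSpline3(nn.Module):
    """Learnable, continuous piecewise-linear activation with three regions.

    Clamping all three slopes to [-1, 1] makes the activation 1-Lipschitz.
    """

    def __init__(self, num_channels: int):
        super().__init__()
        # Two knots per channel delimit the three linear regions.
        self.knot1 = nn.Parameter(torch.full((num_channels,), -1.0))
        self.knot2 = nn.Parameter(torch.full((num_channels,), 1.0))
        # One slope per region and channel, plus a channelwise offset.
        self.slopes = nn.Parameter(torch.ones(num_channels, 3))
        self.offset = nn.Parameter(torch.zeros(num_channels))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        t1 = torch.minimum(self.knot1, self.knot2)  # keep the knots ordered
        t2 = torch.maximum(self.knot1, self.knot2)
        s = torch.clamp(self.slopes, -1.0, 1.0)     # enforce |slope| <= 1
        # Continuous piecewise-linear map with slopes s0, s1, s2 on
        # (-inf, t1], [t1, t2], and [t2, inf), respectively.
        return (self.offset
                + s[:, 0] * torch.clamp(x - t1, max=0.0)
                + s[:, 1] * (torch.clamp(x, min=t1, max=t2) - t1)
                + s[:, 2] * torch.clamp(x - t2, min=0.0))


# Composing a weight of spectral norm (approximately) 1, enforced by power
# iteration, with a 1-Lipschitz activation yields a 1-Lipschitz layer.
layer = spectral_norm(nn.Linear(16, 16))
activation = LinearSpline3(num_channels=16)
y = activation(layer(torch.randn(8, 16)))
```

The third linear region is what distinguishes such an activation from ReLU-like two-region splines, which, according to the abstract, have provable disadvantages under 1-Lipschitz weight constraints.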