On the approximation of rough functions with deep neural networks

Cited by: 0
Authors
De Ryck T. [1 ]
Mishra S. [1 ]
Ray D. [2 ]
Affiliations
[1] Seminar for Applied Mathematics, ETH Zürich, Rämistrasse 101, Zurich
[2] University of Southern California, Los Angeles
Funding
European Union Horizon 2020;
Keywords
Data compression; Deep ReLU networks; ENO; Interpolation; Rough functions;
DOI
10.1007/s40324-022-00299-w
Abstract
The essentially non-oscillatory (ENO) procedure and its variant, the ENO-SR procedure, are very efficient algorithms for interpolating (reconstructing) rough functions. We prove that the ENO (and ENO-SR) procedures are equivalent to deep ReLU neural networks. This demonstrates the ability of deep ReLU neural networks to approximate rough functions to high order of accuracy. Numerical tests of the resulting trained neural networks show excellent performance at interpolating functions, approximating solutions of nonlinear conservation laws, and compressing data. © 2022, The Author(s).
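The equivalence claimed in the abstract rests on the fact that ENO's data-dependent stencil selection, a comparison of (absolute values of) divided differences, is exactly expressible with ReLU units. A minimal illustrative sketch of that building block is below; the helper names (`relu_min`, `relu_abs`) are ours, not the paper's construction, which builds the full interpolation procedure:

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def relu_abs(x):
    # |x| = ReLU(x) + ReLU(-x): absolute value as one ReLU layer.
    return relu(x) + relu(-x)

def relu_min(a, b):
    # min(a, b) = a - ReLU(a - b): the comparison underlying
    # ENO stencil selection, written exactly with a ReLU unit.
    return a - relu(a - b)

# ENO-2-style choice: keep the divided difference that is smaller
# in absolute value, computed purely from ReLU expressions.
d_left, d_right = 0.7, -0.2
chosen = relu_min(relu_abs(d_left), relu_abs(d_right))
print(chosen)  # equals min(|d_left|, |d_right|) = 0.2
```

Because both `relu_abs` and `relu_min` are exact (not approximate) ReLU expressions, compositions of such units can reproduce ENO's selection logic without error, which is the high-level intuition behind the equivalence result.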
Pages: 399-440
Page count: 41