High-dimensional distribution generation through deep neural networks

Cited: 3
Authors
Perekrestenko, Dmytro [1 ]
Eberhard, Leandre [2 ]
Bolcskei, Helmut [3 ]
Affiliations
[1] Ablacon Inc, Zurich, Switzerland
[2] Upstart Network Inc, Columbus, OH USA
[3] Swiss Fed Inst Technol, Zurich, Switzerland
Source
Keywords
Deep learning; Neural networks; Generative networks; Space-filling curves; Quantization; Approximation theory; APPROXIMATION;
DOI
10.1007/s42985-021-00115-6
Chinese Library Classification (CLC)
O29 [Applied Mathematics];
Subject Classification Code
070104;
Abstract
We show that every d-dimensional probability distribution of bounded support can be generated through deep ReLU networks out of a 1-dimensional uniform input distribution. What is more, this is possible without incurring a cost, in terms of approximation error measured in Wasserstein distance, relative to generating the d-dimensional target distribution from d independent random variables. This is enabled by a vast generalization of the space-filling approach discovered in Bailey and Telgarsky (in: Bengio et al. (eds) Advances in Neural Information Processing Systems, vol 31, pp 6489-6499, Curran Associates, Inc., Red Hook, 2018). The construction we propose elicits the importance of network depth in driving the Wasserstein distance between the target distribution and its neural network approximation to zero. Finally, we find that, for histogram target distributions, the number of bits needed to encode the corresponding generative network equals the fundamental limit for encoding probability distributions, as dictated by quantization theory.
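To make the transport idea in the abstract concrete, the following is a minimal numerical sketch, not the paper's actual construction: it uses a simple sawtooth (piecewise-linear, hence ReLU-expressible) map to split a single uniform input into two nearly independent, nearly uniform coordinates. The tooth count K, the variable names, and the use of scipy.stats.wasserstein_distance to check the marginals are illustrative choices made here, not taken from the paper.

    # Sketch: map a 1-D uniform input to an approximate 2-D uniform distribution
    # via a piecewise-linear (ReLU-expressible) sawtooth transform.
    import numpy as np
    from scipy.stats import wasserstein_distance

    rng = np.random.default_rng(0)
    K = 64                              # number of sawtooth periods (assumed parameter)
    z = rng.uniform(0.0, 1.0, 100_000)  # samples from the 1-D uniform input distribution

    y = K * z - np.floor(K * z)         # sawtooth output: exactly U[0,1], piecewise linear in z
    x = z - y / K                       # coarse coordinate: uniform on the grid {0, 1/K, ..., (K-1)/K}

    # Compare each generated marginal against a fresh uniform sample (1-D Wasserstein-1 distance).
    u = rng.uniform(0.0, 1.0, 100_000)
    print("W1(x, U[0,1]) ~", wasserstein_distance(x, u))   # on the order of 1/(2K)
    print("W1(y, U[0,1]) ~", wasserstein_distance(y, u))   # sampling noise only

Since x lives on a 1/K grid, its Wasserstein-1 distance to U[0,1] is roughly 1/(2K); increasing K, which in a ReLU realization roughly corresponds to composing more sawtooth layers and hence greater depth, drives this gap to zero, mirroring the role of depth highlighted in the abstract.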
Pages: 44
Related Papers
50 records in total
  • [41] Unsupervised Artificial Neural Networks for Outlier Detection in High-Dimensional Data
    Popovic, Daniel
    Fouche, Edouard
    Boehm, Klemens
    ADVANCES IN DATABASES AND INFORMATION SYSTEMS, ADBIS 2019, 2019, 11695 : 3 - 19
  • [42] Training neural networks on high-dimensional data using random projection
    Wójcik, Piotr Iwo
    Kurdziel, Marcin
    PATTERN ANALYSIS AND APPLICATIONS, 2019, 22 : 1221 - 1231
  • [43] Deep neural networks based temporal-difference methods for high-dimensional parabolic partial differential equations
    Zeng, Shaojie
    Cai, Yihua
    Zou, Qingsong
    JOURNAL OF COMPUTATIONAL PHYSICS, 2022, 468
  • [44] Ultrahigh-fidelity spatial mode quantum gates in high-dimensional space by diffractive deep neural networks
    Wang, Qianke
    Liu, Jun
    Lyu, Dawei
    Wang, Jian
    LIGHT: SCIENCE & APPLICATIONS, 13
  • [45] ROBUST CLASSIFICATION OF HIGH-DIMENSIONAL DATA USING ARTIFICIAL NEURAL NETWORKS
    Smith, D. J.
    Bailey, T. C.
    Munford, A. G.
    STATISTICS AND COMPUTING, 1993, 3 (02) : 71 - 81
  • [46] Training High-Dimensional Neural Networks with Cooperative Particle Swarm Optimiser
    Rakitianskaia, Anna
    Engelbrecht, Andries
    PROCEEDINGS OF THE 2014 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2014, : 4011 - 4018
  • [47] Separable Gaussian neural networks for high-dimensional nonlinear stochastic systems
    Wang, Xi
    Xing, Siyuan
    Jiang, Jun
    Hong, Ling
    Sun, Jian-Qiao
    PROBABILISTIC ENGINEERING MECHANICS, 2024, 76
  • [48] Deep Neural Networks Are Effective At Learning High-Dimensional Hilbert-Valued Functions From Limited Data
    Adcock, Ben
    Brugiapaglia, Simone
    Dexter, Nick
    Moraga, Sebastian
    MATHEMATICAL AND SCIENTIFIC MACHINE LEARNING, VOL 145, 2021, 145 : 1 - 36
  • [50] Deep neural networks can stably solve high-dimensional, noisy, non-linear inverse problems
    Pineda, Andres Felipe Lerma
    Petersen, Philipp Christian
    ANALYSIS AND APPLICATIONS, 2023, 21 (01) : 49 - 91