High-dimensional distribution generation through deep neural networks

Cited by: 3
Authors
Perekrestenko, Dmytro [1 ]
Eberhard, Leandre [2 ]
Bolcskei, Helmut [3 ]
Affiliations
[1] Ablacon Inc, Zurich, Switzerland
[2] Upstart Network Inc, Columbus, OH USA
[3] Swiss Fed Inst Technol, Zurich, Switzerland
Keywords
Deep learning; Neural networks; Generative networks; Space-filling curves; Quantization; Approximation theory; APPROXIMATION;
DOI
10.1007/s42985-021-00115-6
CLC classification
O29 [Applied Mathematics];
Subject classification code
070104;
Abstract
We show that every d-dimensional probability distribution of bounded support can be generated through deep ReLU networks out of a 1-dimensional uniform input distribution. Moreover, this is possible without incurring a cost, in terms of approximation error measured in Wasserstein distance, relative to generating the d-dimensional target distribution from d independent random variables. This is enabled by a vast generalization of the space-filling approach discovered in Bailey and Telgarsky (in: Bengio (eds) Advances in neural information processing systems, vol 31, pp 6489-6499. Curran Associates, Inc., Red Hook, 2018). The construction we propose highlights the importance of network depth in driving the Wasserstein distance between the target distribution and its neural network approximation to zero. Finally, we find that, for histogram target distributions, the number of bits needed to encode the corresponding generative network equals the fundamental limit for encoding probability distributions as dictated by quantization theory.
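The space-filling mechanism referenced in the abstract can be illustrated with a toy sketch. This is the sawtooth idea in the Bailey-Telgarsky style that the paper generalizes, not the paper's own construction: the tent map is expressible with two ReLU units, and iterating it k times yields a depth-k network whose graph, fed a 1-dimensional uniform input, approaches the uniform distribution on the unit square in Wasserstein distance.

```python
# Sketch (assumption: not the paper's construction): the ReLU-expressible
# tent map g(x) = 2*relu(x) - 4*relu(x - 0.5) iterated k times gives a
# sawtooth with 2^k teeth; the pair (X, g^k(X)) with X ~ Unif[0,1] fills
# the unit square increasingly densely as the depth k grows.
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def tent(x):
    # Tent map on [0, 1], written as one hidden layer with two ReLU units.
    return 2.0 * relu(x) - 4.0 * relu(x - 0.5)

def sawtooth(x, k):
    # k-fold composition g^k, i.e. a depth-k ReLU network.
    for _ in range(k):
        x = tent(x)
    return x

rng = np.random.default_rng(0)
u = rng.uniform(0.0, 1.0, size=100_000)      # 1-D uniform input
pts = np.stack([u, sawtooth(u, 8)], axis=1)  # points on the graph of g^8

# Empirically, each quadrant of [0,1]^2 should carry mass close to 1/4.
for qx in (0, 1):
    for qy in (0, 1):
        in_quadrant = ((pts[:, 0] >= 0.5 * qx) & (pts[:, 0] < 0.5 * (qx + 1))
                       & (pts[:, 1] >= 0.5 * qy) & (pts[:, 1] < 0.5 * (qy + 1)))
        print(f"quadrant ({qx},{qy}): {np.mean(in_quadrant):.3f}")
```

The second coordinate's marginal is exactly uniform (a sawtooth of a uniform variable is uniform), so the deviation from uniformity on the square lives only in the joint dependence, which shrinks as k grows.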
Pages: 44
Related papers
50 records
  • [1] Deep ReLU neural networks in high-dimensional approximation
    Dung, Dinh
    Nguyen, Van Kien
    NEURAL NETWORKS, 2021, 142 : 619 - 635
  • [2] Distributed Learning of Deep Sparse Neural Networks for High-dimensional Classification
    Garg, Shweta
    Krishnan, R.
    Jagannathan, S.
    Samaranayake, V. A.
    2018 IEEE INTERNATIONAL CONFERENCE ON BIG DATA (BIG DATA), 2018, : 1587 - 1592
  • [3] Minimax optimal high-dimensional classification using deep neural networks
    Wang, Shuoyang
    Shang, Zuofeng
    STAT, 2022, 11 (01)
  • [4] "Lossless" Compression of Deep Neural Networks: A High-dimensional Neural Tangent Kernel Approach
    Gu, Lingyu
    Du, Yongqi
    Zhang, Yuan
    Xie, Di
    Pu, Shiliang
    Qiu, Robert C.
    Liao, Zhenyu
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35, NEURIPS 2022, 2022,
  • [5] Application of deep neural networks for high-dimensional large BWR core neutronics
    Abu Saleem, Rabie
    Radaideh, Majdi I.
    Kozlowski, Tomasz
    NUCLEAR ENGINEERING AND TECHNOLOGY, 2020, 52 (12) : 2709 - 2716
  • [6] Adaptive deep neural networks methods for high-dimensional partial differential equations
    Zeng, Shaojie
    Zhang, Zong
    Zou, Qingsong
    JOURNAL OF COMPUTATIONAL PHYSICS, 2022, 463
  • [7] Chaos in high-dimensional neural and gene networks
    Mestl, T
    Lemay, C
    Glass, L
    PHYSICA D-NONLINEAR PHENOMENA, 1996, 98 (01) : 33 - 52
  • [8] Neural networks trained with high-dimensional functions approximation data in high-dimensional space
    Zheng, Jian
    Wang, Jianfeng
    Chen, Yanping
    Chen, Shuping
    Chen, Jingjin
    Zhong, Wenlong
    Wu, Wenling
    JOURNAL OF INTELLIGENT & FUZZY SYSTEMS, 2021, 41 (02) : 3739 - 3750
  • [10] Indoor human activity recognition using high-dimensional sensors and deep neural networks
    Vandersmissen, Baptist
    Knudde, Nicolas
    Jalalvand, Azarakhsh
    Couckuyt, Ivo
    Dhaene, Tom
    De Neve, Wesley
    NEURAL COMPUTING & APPLICATIONS, 2020, 32 (16): : 12295 - 12309