Effective approximation of high-dimensional space using neural networks

Cited by: 7
Authors
Zheng, Jian [1 ]
Wang, Jianfeng [1 ]
Chen, Yanping [1 ]
Chen, Shuping [1 ]
Chen, Jingjin [1 ]
Zhong, Wenlong [1 ]
Wu, Wenling [1 ]
Affiliation
[1] Chongqing Aerosp Polytech, Chongqing 40021, Peoples R China
Source
JOURNAL OF SUPERCOMPUTING, 2022, Vol. 78, No. 3
Keywords
High-dimensional function; High-dimensional space; Neural networks;
DOI
10.1007/s11227-021-04038-2
Chinese Library Classification
TP3 [Computing technology; computer technology]
Discipline Code
0812
Abstract
Because of the curse of dimensionality, data in high-dimensional space hardly provide sufficient information for training neural networks, which makes approximating a high-dimensional space with neural networks a difficult task. To address this, we propose having neural networks approximate a high-dimensional function that can, in turn, effectively approach the high-dimensional space, rather than using neural networks to approximate the space directly. Accordingly, two bounds are derived from the Lipschitz condition: one for the neural network approximating the high-dimensional function, and the other for the high-dimensional function approaching the high-dimensional space. Experimental results on synthetic and real-world datasets show that our method is effective and outperforms competing methods at approximating the high-dimensional space. We find that using neural networks to approximate a high-dimensional function that effectively approaches the high-dimensional space is more resistant to the curse of dimensionality. In addition, the ability of the proposed method to approximate the high-dimensional space depends on both the number of hidden layers and the choice of high-dimensional function, but more strongly on the latter. Our findings also show no obvious dependency between the number of hidden layers in the proposed method and the choice of high-dimensional function.
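The abstract's core idea, a network fitted to a chosen smooth high-dimensional function rather than directly to raw high-dimensional data, can be sketched as follows. This is a minimal illustration, not the paper's actual construction: the target function (a Gaussian bump), the network size, and the plain gradient-descent training loop are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n, h = 32, 512, 64   # input dimension, samples, hidden units (illustrative)

# Hypothetical choice of high-dimensional function: a Gaussian bump on R^d.
# The network learns this smooth surrogate instead of raw high-dimensional data.
X = rng.normal(size=(n, d))
y = np.exp(-np.sum(X**2, axis=1) / d)[:, None]

# One-hidden-layer network with tanh activation, trained by gradient descent.
W1 = rng.normal(scale=1 / np.sqrt(d), size=(d, h)); b1 = np.zeros(h)
W2 = rng.normal(scale=1 / np.sqrt(h), size=(h, 1)); b2 = np.zeros(1)

def forward(X):
    A = np.tanh(X @ W1 + b1)     # hidden activations
    return A, A @ W2 + b2        # activations and network output

def mse(pred, y):
    return float(np.mean((pred - y) ** 2))

_, pred0 = forward(X)
loss0 = mse(pred0, y)            # loss before training

lr = 0.1
for _ in range(500):
    A, pred = forward(X)
    g = 2 * (pred - y) / n       # d(mse)/d(pred)
    gW2 = A.T @ g                # backprop through the output layer
    gb2 = g.sum(axis=0)
    gZ = (g @ W2.T) * (1 - A**2) # backprop through tanh
    gW1 = X.T @ gZ
    gb1 = gZ.sum(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

_, pred1 = forward(X)
loss1 = mse(pred1, y)            # loss after training
```

Because the surrogate function is smooth (Lipschitz), the network needs far fewer samples to fit it than to fit arbitrary points scattered in R^32, which is the intuition behind the two Lipschitz bounds the abstract mentions.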
Pages: 4377-4397 (21 pages)
Related Papers (50 in total)
  • [31] Siamese neural networks for the classification of high-dimensional radiomic features
    Mahajan, Abhishaike
    Dormer, James
    Li, Qinmei
    Chen, Deji
    Zhang, Zhenfeng
    Fei, Baowei
    MEDICAL IMAGING 2020: COMPUTER-AIDED DIAGNOSIS, 2020, 11314
  • [32] Functional Neural Networks for High-Dimensional Genetic Data Analysis
    Zhang, Shan
    Zhou, Yuan
    Geng, Pei
    Lu, Qing
    IEEE-ACM TRANSACTIONS ON COMPUTATIONAL BIOLOGY AND BIOINFORMATICS, 2024, 21 (03) : 383 - 393
  • [34] High-dimensional distribution generation through deep neural networks
    Perekrestenko, Dmytro
    Eberhard, Leandre
    Bolcskei, Helmut
    PARTIAL DIFFERENTIAL EQUATIONS AND APPLICATIONS, 2021, 2 (05):
  • [35] Robust high-dimensional memory-augmented neural networks
    Karunaratne, Geethan
    Schmuck, Manuel
    Le Gallo, Manuel
    Cherubini, Giovanni
    Benini, Luca
    Sebastian, Abu
    Rahimi, Abbas
    NATURE COMMUNICATIONS, 2021, 12 (01)
  • [36] Construction of high-dimensional neural networks by linear connections of matrices
    Kobayashi, M
    Muramatsu, J
    Yamazaki, H
    ELECTRONICS AND COMMUNICATIONS IN JAPAN PART III-FUNDAMENTAL ELECTRONIC SCIENCE, 2003, 86 (11): : 38 - 45
  • [37] An indexing technique using relative approximation for high-dimensional data
    Sakurai, Y., 1600, John Wiley and Sons Inc. (34):
  • [38] High-Dimensional Function Approximation Using Local Linear Embedding
    Andras, Peter
    2015 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2015,
  • [39] Interpretable Approximation of High-Dimensional Data
    Potts, Daniel
    Schmischke, Michael
    SIAM JOURNAL ON MATHEMATICS OF DATA SCIENCE, 2021, 3 (04): : 1301 - 1323
  • [40] Approximation of high-dimensional parametric PDEs
    Cohen, Albert
    DeVore, Ronald
    ACTA NUMERICA, 2015, 24 : 1 - 159