Effective approximation of high-dimensional space using neural networks

Cited by: 7
Authors
Zheng, Jian [1 ]
Wang, Jianfeng [1 ]
Chen, Yanping [1 ]
Chen, Shuping [1 ]
Chen, Jingjin [1 ]
Zhong, Wenlong [1 ]
Wu, Wenling [1 ]
Affiliations
[1] Chongqing Aerosp Polytech, Chongqing 40021, Peoples R China
Source
JOURNAL OF SUPERCOMPUTING | 2022, Vol. 78, Issue 03
Keywords
High-dimensional function; High-dimensional space; Neural networks;
DOI
10.1007/s11227-021-04038-2
CLC Number
TP3 [Computing Technology, Computer Technology];
Discipline Code
0812;
Abstract
Because of the curse of dimensionality, data in high-dimensional space rarely provide sufficient information for training neural networks, which makes approximating a high-dimensional space with neural networks a difficult task. To address this, we propose using neural networks to approximate a high-dimensional function that can effectively approach the high-dimensional space, rather than using neural networks to approximate the high-dimensional space directly. Accordingly, two bounds are derived from the Lipschitz condition: one for neural networks approximating a high-dimensional function, and the other for a high-dimensional function approaching the high-dimensional space. Experimental results on synthetic and real-world datasets show that the proposed method is effective and outperforms competing methods in approximating the high-dimensional space. We find that this manner of using neural networks to approximate a high-dimensional function that in turn approaches the high-dimensional space is more resistant to the curse of dimensionality. In addition, the ability of the proposed method to approximate the high-dimensional space is related both to the number of hidden layers and to the choice of high-dimensional function, but relies more on the latter. Our findings also indicate that there is no obvious dependency between the number of hidden layers in the proposed method and the choice of high-dimensional function.
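The core idea of the abstract, approximating a single high-dimensional function with a neural network rather than fitting the high-dimensional space point by point, can be sketched in a few lines. The sketch below is an assumption-laden illustration, not the authors' algorithm: the target function f(x) = sin(w·x) and the one-hidden-layer random-feature network (tanh features with a least-squares output layer) are both hypothetical stand-ins.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n, h = 20, 2000, 256  # input dimension, samples, hidden width

# Hypothetical smooth high-dimensional target function (an assumption,
# not the paper's benchmark): f(x) = sin(<w_true, x>).
w_true = rng.standard_normal(d) / np.sqrt(d)
X = rng.standard_normal((n, d))   # points in the high-dimensional space
y = np.sin(X @ w_true)            # values of the high-dimensional function

# One-hidden-layer network with random tanh features; the output layer is
# fit in closed form by least squares. This is a standard sketch of neural
# function approximation, not the authors' exact training procedure.
W = rng.standard_normal((d, h)) / np.sqrt(d)
b = rng.standard_normal(h)
H = np.tanh(X @ W + b)            # hidden-layer activations
coef, *_ = np.linalg.lstsq(H, y, rcond=None)

mse = np.mean((H @ coef - y) ** 2)
print(f"train MSE: {mse:.2e}")
```

Even with only 2000 samples in 20 dimensions, fitting the function rather than the space keeps the problem tractable: the network only has to track one smooth map, so its training error falls well below the variance of the target.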
Pages: 4377-4397
Page count: 21
Related Papers
50 records in total
  • [21] Multi-classification for high-dimensional data using probabilistic neural networks
    Li, Jingyi
    Chao, Xiaojie
    Xu, Qin
    JOURNAL OF RADIATION RESEARCH AND APPLIED SCIENCES, 2022, 15 (02) : 111 - 118
  • [22] High-dimensional dynamics of generalization error in neural networks
    Advani, Madhu S.
    Saxe, Andrew M.
    Sompolinsky, Haim
    NEURAL NETWORKS, 2020, 132 : 428 - 446
  • [23] Granger causality detection in high-dimensional systems using feedforward neural networks
    Calvo-Pardo, Hector
    Mancini, Tullio
    Olmo, Jose
    INTERNATIONAL JOURNAL OF FORECASTING, 2021, 37 (02) : 920 - 940
  • [24] Multifidelity Prediction Framework with Convolutional Neural Networks Using High-Dimensional Data
    Emre Tekaslan, Huseyin
    Nikbay, Melike
    JOURNAL OF AEROSPACE INFORMATION SYSTEMS, 2023, 20 (05): 264 - 275
  • [25] Effective indexing and searching with dimensionality reduction in high-dimensional space
    Jeong, Seungdo
    Kim, Sang-Wook
    Choi, Byung-Uk
    COMPUTER SYSTEMS SCIENCE AND ENGINEERING, 2016, 31 (04): 291 - 302
  • [26] An effective method for approximating the Euclidean distance in high-dimensional space
    Jeong, Seungdo
    Kim, Sang-Wook
    Kim, Kidong
    Choi, Byung-Uk
    DATABASE AND EXPERT SYSTEMS APPLICATIONS, PROCEEDINGS, 2006, 4080 : 863 - 872
  • [27] Indoor human activity recognition using high-dimensional sensors and deep neural networks
    Vandersmissen, Baptist
    Knudde, Nicolas
    Jalalvand, Azarakhsh
    Couckuyt, Ivo
    Dhaene, Tom
    De Neve, Wesley
    NEURAL COMPUTING & APPLICATIONS, 2020, 32 (16): 12295 - 12309
  • [28] Comparison of high-dimensional neural networks using hypercomplex numbers in a robot manipulator control
    Takahashi, Kazuhiko
    ARTIFICIAL LIFE AND ROBOTICS, 2021, 26 (03) : 367 - 377