Optimal approximation using complex-valued neural networks

Cited: 0
Authors
Geuchen, Paul [1 ]
Voigtlaender, Felix [1 ]
Affiliation
[1] KU Eichstätt-Ingolstadt, MIDS, Schanz 49, D-85049 Ingolstadt, Germany
Keywords
BOUNDS;
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial intelligence theory];
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Complex-valued neural networks (CVNNs) have recently shown promising empirical success, for instance for increasing the stability of recurrent neural networks and for improving the performance in tasks with complex-valued inputs, such as in MRI fingerprinting. While the overwhelming success of deep learning in the real-valued case is supported by a growing mathematical foundation, such a foundation is still largely lacking in the complex-valued case. We thus analyze the expressivity of CVNNs by studying their approximation properties. Our results yield the first quantitative approximation bounds for CVNNs that apply to a wide class of activation functions, including the popular modReLU and complex cardioid activation functions. Precisely, our results apply to any activation function that is smooth but not polyharmonic on some non-empty open set; this is the natural generalization of the class of smooth and non-polynomial activation functions to the complex setting. Our main result shows that the error for the approximation of C^k functions scales as m^(-k/(2n)) as m → ∞, where m is the number of neurons, k the smoothness of the target function, and n the (complex) input dimension. Under a natural continuity assumption, we show that this rate is optimal; we further discuss the optimality when dropping this assumption. Moreover, we prove that the problem of approximating C^k functions using continuous approximation methods unavoidably suffers from the curse of dimensionality.
Pages: 57
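
The abstract above considers shallow CVNNs with activations such as modReLU. As a purely illustrative sketch (not the construction or notation used in the paper), the following Python/NumPy snippet evaluates a one-hidden-layer complex-valued network with m neurons and the modReLU activation σ_b(z) = ReLU(|z| + b)·z/|z|; the function names modrelu and shallow_cvnn and the random parameters are hypothetical, chosen only to make the example self-contained.

import numpy as np

def modrelu(z, b):
    """modReLU activation: shift the modulus of z by a real bias b,
    apply ReLU to it, and keep the phase of z unchanged."""
    mag = np.abs(z)
    # avoid division by zero at z = 0; the output there is 0 anyway
    return np.maximum(mag + b, 0.0) * z / np.maximum(mag, 1e-12)

def shallow_cvnn(x, W, b, beta, c):
    """One-hidden-layer CVNN with m neurons:
    f(x) = sum_j c_j * modReLU(<w_j, x> + b_j; beta_j), x in C^n."""
    pre = W @ x + b               # complex pre-activations, shape (m,)
    return c @ modrelu(pre, beta)  # complex-valued scalar output

# Tiny usage example with random complex parameters (n = 2 inputs, m = 8 neurons).
rng = np.random.default_rng(0)
n, m = 2, 8
x = rng.standard_normal(n) + 1j * rng.standard_normal(n)
W = rng.standard_normal((m, n)) + 1j * rng.standard_normal((m, n))
b = rng.standard_normal(m) + 1j * rng.standard_normal(m)
beta = rng.standard_normal(m)     # the modReLU bias is real-valued
c = rng.standard_normal(m) + 1j * rng.standard_normal(m)
print(shallow_cvnn(x, W, b, beta, c))

In the paper's approximation rate m^(-k/(2n)), m corresponds to the number of hidden neurons of such a network, so increasing m (here set to 8) is what drives the error bound down.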
Related papers
50 items in total
  • [1] Quantitative Approximation Results for Complex-Valued Neural Networks
    Caragea, Andrei
    Lee, Dae Gwan
    Maly, Johannes
    Pfander, Goetz
    Voigtlaender, Felix
    [J]. SIAM JOURNAL ON MATHEMATICS OF DATA SCIENCE, 2022, 4 (02): 553-580
  • [2] The universal approximation theorem for complex-valued neural networks
    Voigtlaender, Felix
    [J]. APPLIED AND COMPUTATIONAL HARMONIC ANALYSIS, 2023, 64: 33-61
  • [3] Complex-Valued Logic for Neural Networks
    Kagan, Evgeny
    Rybalov, Alexander
    Yager, Ronald
    [J]. 2018 IEEE INTERNATIONAL CONFERENCE ON THE SCIENCE OF ELECTRICAL ENGINEERING IN ISRAEL (ICSEE), 2018
  • [4] Improving Gradient Regularization using Complex-Valued Neural Networks
    Yeats, Eric
    Chen, Yiran
    Li, Hai
    [J]. INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 139, 2021, 139
  • [5] Adaptive complex-valued stepsize based fast learning of complex-valued neural networks
    Zhang, Yongliang
    Huang, He
    [J]. NEURAL NETWORKS, 2020, 124: 233-242
  • [6] Complex-Valued Recurrent Correlation Neural Networks
    Valle, Marcos Eduardo
    [J]. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2014, 25 (09): 1600-1612
  • [7] Entanglement Detection with Complex-Valued Neural Networks
    Qu, Yue-Di
    Zhang, Rui-Qi
    Shen, Shu-Qian
    Yu, Juan
    Li, Ming
    [J]. International Journal of Theoretical Physics, 62
  • [8] Complex-Valued Neural Networks: A Comprehensive Survey
    Lee, ChiYan
    Hasegawa, Hideyuki
    Gao, Shangce
    [J]. IEEE/CAA Journal of Automatica Sinica, 2022, 9 (08): 1406-1426
  • [9] Complex-valued neural networks: The merits and their origins
    Hirose, Akira
    [J]. IJCNN: 2009 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, VOLS 1-6, 2009: 1209-1216
  • [10] Network inversion for complex-valued neural networks
    Ogawa, T.
    Kanada, H.
    [J]. 2005 IEEE International Symposium on Signal Processing and Information Technology (ISSPIT), Vols 1 and 2, 2005: 850-855