The universal approximation theorem for complex-valued neural networks

Cited by: 17
Authors
Voigtlaender, Felix [1 ,2 ,3 ]
Affiliations
[1] Tech Univ Munich, Dept Math, D-85748 Garching, Germany
[2] Univ Vienna, Fac Math, Oskar Morgenstern Pl 1, A-1090 Vienna, Austria
[3] Catholic Univ Eichstatt Ingolstadt KU, Math Inst Machine Learning & Data Sci MIDS, Schanz 49, D-85049 Ingolstadt, Germany
Keywords
Complex-valued neural networks; Universal approximation theorem; Deep neural networks; Polyharmonic functions; Holomorphic functions; MULTILAYER FEEDFORWARD NETWORKS; SMOOTH;
DOI
10.1016/j.acha.2022.12.002
Chinese Library Classification
O29 [Applied Mathematics]
Discipline Code
070104
Abstract
We generalize the classical universal approximation theorem for neural networks to the case of complex-valued neural networks. Precisely, we consider feedforward networks with a complex activation function σ: ℂ → ℂ in which each neuron performs the operation ℂ^N → ℂ, z ↦ σ(b + wᵀz), with weights w ∈ ℂ^N and a bias b ∈ ℂ. We completely characterize those activation functions σ for which the associated complex networks have the universal approximation property, meaning that they can uniformly approximate any continuous function on any compact subset of ℂ^d arbitrarily well. Unlike the classical case of real networks, the set of "good" activation functions (those which give rise to networks with the universal approximation property) differs significantly depending on whether one considers deep networks or shallow networks: For deep networks with at least two hidden layers, the universal approximation property holds as long as σ is neither a polynomial, a holomorphic function, nor an antiholomorphic function. Shallow networks, on the other hand, are universal if and only if the real part or the imaginary part of σ is not a polyharmonic function. © 2022 Elsevier Inc. All rights reserved.
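The neuron operation described in the abstract can be sketched in a few lines of NumPy, which handles complex arithmetic natively. This is an illustrative sketch, not code from the paper; the separable activation below (tanh applied to real and imaginary parts) is one common choice whose real part, tanh(Re z), is not polyharmonic (its iterated Laplacians reduce to even derivatives of tanh, which never vanish identically), so the paper's characterization implies shallow networks built from it are universal.

```python
import numpy as np

def sigma(z: np.ndarray) -> np.ndarray:
    """Separable complex activation: tanh(Re z) + i*tanh(Im z).

    Neither holomorphic nor antiholomorphic, not a polynomial, and its
    real part is not polyharmonic -- so it qualifies as a "good"
    activation for both the shallow and the deep characterization.
    """
    return np.tanh(z.real) + 1j * np.tanh(z.imag)

def neuron(z: np.ndarray, w: np.ndarray, b: complex) -> complex:
    """Single complex neuron: z -> sigma(b + w^T z), w in C^N, b in C."""
    return sigma(b + w @ z)

def shallow_network(z: np.ndarray, W: np.ndarray, b: np.ndarray,
                    c: np.ndarray) -> complex:
    """Shallow network: complex linear combination of m neurons.

    W: (m, N) complex weights, b: (m,) complex biases,
    c: (m,) complex output weights.
    """
    return c @ sigma(b + W @ z)

# Random instance with N = 3 inputs and m = 8 hidden neurons.
rng = np.random.default_rng(0)
N, m = 3, 8
z = rng.standard_normal(N) + 1j * rng.standard_normal(N)
W = rng.standard_normal((m, N)) + 1j * rng.standard_normal((m, N))
b = rng.standard_normal(m) + 1j * rng.standard_normal(m)
c = rng.standard_normal(m) + 1j * rng.standard_normal(m)
out = shallow_network(z, W, b, c)  # a single complex output
```

The universality result says nothing about how to find good weights; it only guarantees that, with enough neurons, some choice of (W, b, c) approximates any continuous target on a compact subset of ℂ^N uniformly well.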
Pages: 33 - 61
Page count: 29
Related papers
50 records total
  • [1] Optimal approximation using complex-valued neural networks
    Geuchen, Paul
    Voigtlaender, Felix
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023,
  • [2] Quantitative Approximation Results for Complex-Valued Neural Networks
    Caragea, Andrei
    Lee, Dae Gwan
    Maly, Johannes
    Pfander, Goetz
    Voigtlaender, Felix
    [J]. SIAM JOURNAL ON MATHEMATICS OF DATA SCIENCE, 2022, 4 (02): : 553 - 580
  • [3] Complex-Valued Logic for Neural Networks
    Kagan, Evgeny
    Rybalov, Alexander
    Yager, Ronald
    [J]. 2018 IEEE INTERNATIONAL CONFERENCE ON THE SCIENCE OF ELECTRICAL ENGINEERING IN ISRAEL (ICSEE), 2018,
  • [4] Universal approximation theorem for vector- and hypercomplex-valued neural networks
    Valle, Marcos Eduardo
    Vital, Wington L.
    Vieira, Guilherme
    [J]. NEURAL NETWORKS, 2024, 180
  • [5] The uniqueness theorem for complex-valued neural networks with threshold parameters and the redundancy of the parameters
    Nitta, Tohru
    [J]. INTERNATIONAL JOURNAL OF NEURAL SYSTEMS, 2008, 18 (02) : 123 - 134
  • [6] Adaptive complex-valued stepsize based fast learning of complex-valued neural networks
    Zhang, Yongliang
    Huang, He
    [J]. NEURAL NETWORKS, 2020, 124 : 233 - 242
  • [7] Complex-Valued Recurrent Correlation Neural Networks
    Valle, Marcos Eduardo
    [J]. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2014, 25 (09) : 1600 - 1612
  • [8] Entanglement Detection with Complex-Valued Neural Networks
    Yue-Di Qu
    Rui-Qi Zhang
    Shu-Qian Shen
    Juan Yu
    Ming Li
    [J]. International Journal of Theoretical Physics, 62
  • [9] Complex-Valued Neural Networks: A Comprehensive Survey
    ChiYan Lee
    Hideyuki Hasegawa
    Shangce Gao
    [J]. IEEE/CAA Journal of Automatica Sinica, 2022, 9 (08) : 1406 - 1426
  • [10] Complex-Valued Neural Networks for Noncoherent Demodulation
    Gorday, Paul E.
    Erdol, Nurgun
    Zhuang, Hanqi
    [J]. IEEE OPEN JOURNAL OF THE COMMUNICATIONS SOCIETY, 2020, 1 : 217 - 225