Extending the Universal Approximation Theorem for a Broad Class of Hypercomplex-Valued Neural Networks

Cited: 3
Authors
Vital, Wington L. [1 ]
Vieira, Guilherme [1 ]
Valle, Marcos Eduardo [1 ]
Affiliations
[1] Univ Estadual Campinas, Campinas, Brazil
Source
INTELLIGENT SYSTEMS, PT II | 2022 / Volume 13654
Funding
São Paulo Research Foundation (FAPESP), Brazil
Keywords
Hypercomplex algebras; Neural networks; Universal approximation theorem;
DOI
10.1007/978-3-031-21689-3_45
CLC classification
TP18 [Artificial Intelligence Theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
The universal approximation theorem asserts that a single-hidden-layer neural network can approximate continuous functions to any desired precision on compact sets. As an existential result, the universal approximation theorem supports the use of neural networks for various applications, including regression and classification tasks. The universal approximation theorem is not limited to real-valued neural networks but also holds for complex-, quaternion-, tessarine-, and Clifford-valued neural networks. This paper extends the universal approximation theorem to a broad class of hypercomplex-valued neural networks. Precisely, we first introduce the concept of a non-degenerate hypercomplex algebra. Complex numbers, quaternions, and tessarines are examples of non-degenerate hypercomplex algebras. Then, we state the universal approximation theorem for hypercomplex-valued neural networks defined on a non-degenerate algebra.
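For illustration, the sketch below (not taken from the paper) shows the kind of single-hidden-layer hypercomplex-valued network covered by the theorem, instantiated for quaternions, one of the non-degenerate algebras named in the abstract. The Hamilton product stands in for the algebra multiplication, the split (component-wise) tanh is one common choice of activation, and all function names and parameter values are illustrative assumptions rather than the authors' construction.

# Minimal illustrative sketch: a single-hidden-layer quaternion-valued network
#   N(x) = sum_j w_j * phi(a_j * x + b_j),
# where * is the Hamilton product and phi is a split (component-wise) activation.
# Quaternions are one example of a non-degenerate hypercomplex algebra; the
# parameter values below are arbitrary placeholders, not a trained model.
import numpy as np

def hamilton_product(p, q):
    """Hamilton product of quaternions p = (p0, p1, p2, p3) and q = (q0, q1, q2, q3)."""
    p0, p1, p2, p3 = p
    q0, q1, q2, q3 = q
    return np.array([
        p0*q0 - p1*q1 - p2*q2 - p3*q3,
        p0*q1 + p1*q0 + p2*q3 - p3*q2,
        p0*q2 - p1*q3 + p2*q0 + p3*q1,
        p0*q3 + p1*q2 - p2*q1 + p3*q0,
    ])

def split_activation(q):
    """Split activation: a real activation function applied to each component."""
    return np.tanh(q)

def qnn_forward(x, weights_in, biases, weights_out):
    """Single hidden layer: y = sum_j w_j * phi(a_j * x + b_j)."""
    y = np.zeros(4)
    for a_j, b_j, w_j in zip(weights_in, biases, weights_out):
        h_j = split_activation(hamilton_product(a_j, x) + b_j)
        y = y + hamilton_product(w_j, h_j)
    return y

# Usage with arbitrary (untrained) parameters and 3 hidden neurons.
rng = np.random.default_rng(0)
a = rng.normal(size=(3, 4))
b = rng.normal(size=(3, 4))
w = rng.normal(size=(3, 4))
x = np.array([1.0, 0.5, -0.2, 0.3])   # an input quaternion
print(qnn_forward(x, a, b, w))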
Pages: 646-660
Page count: 15
Related Papers
50 records in total
  • [1] Universal approximation theorem for vector- and hypercomplex-valued neural networks
    Valle, Marcos Eduardo
    Vital, Wington L.
    Vieira, Guilherme
    NEURAL NETWORKS, 2024, 180
  • [2] A broad class of discrete-time hypercomplex-valued Hopfield neural networks
    de Castro, Fidelis Zanetti
    Valle, Marcos Eduardo
    NEURAL NETWORKS, 2020, 122: 54-67
  • [3] Hypercomplex-valued recurrent correlation neural networks
    Valle, Marcos Eduardo
    Lobo, Rodolfo Anibal
    NEUROCOMPUTING, 2021, 432: 111-123
  • [4] The universal approximation theorem for complex-valued neural networks
    Voigtlaender, Felix
    APPLIED AND COMPUTATIONAL HARMONIC ANALYSIS, 2023, 64: 33-61
  • [5] Understanding Vector-Valued Neural Networks and Their Relationship With Real and Hypercomplex-Valued Neural Networks: Incorporating intercorrelation between features into neural networks
    Valle, Marcos Eduardo
    IEEE SIGNAL PROCESSING MAGAZINE, 2024, 41 (03): 49-58
  • [6] Acute Lymphoblastic Leukemia Detection Using Hypercomplex-Valued Convolutional Neural Networks
    Vieira, Guilherme
    Valle, Marcos Eduardo
    2022 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2022
  • [7] Guest Editorial Special Issue on Complex- and Hypercomplex-Valued Neural Networks
    Hirose, Akira
    Aizenberg, Igor
    Mandic, Danilo P.
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2014, 25 (09): 1597-1599
  • [8] Remarks on Adaptive-type Hypercomplex-valued Neural Network-based Feedforward Feedback Controller
    Takahashi, Kazuhiko
    2017 IEEE INTERNATIONAL CONFERENCE ON COMPUTER AND INFORMATION TECHNOLOGY (CIT), 2017: 151-156
  • [9] On the universal approximation theorem of fuzzy neural networks with random membership function parameters
    Wang, LP
    Liu, B
    Wan, CR
    ADVANCES IN NEURAL NETWORKS - ISNN 2005, PT 1, PROCEEDINGS, 2005, 3496: 45-50
  • [10] Interval Universal Approximation for Neural Networks
    Wang, Zi
    Albarghouthi, Aws
    Prakriya, Gautam
    Jha, Somesh
    PROCEEDINGS OF THE ACM ON PROGRAMMING LANGUAGES-PACMPL, 2022, 6