Understanding Vector-Valued Neural Networks and Their Relationship With Real and Hypercomplex-Valued Neural Networks: Incorporating intercorrelation between features into neural networks

Cited by: 0
Authors: Valle, Marcos Eduardo [1]
Affiliations: [1] Univ Estadual Campinas, Dept Appl Math, BR-13083859 Campinas, Brazil
Funding: São Paulo Research Foundation (FAPESP), Brazil
Keywords: Training data; Deep learning; Image processing; Neural networks; Parallel processing; Vectors; Hypercomplex; Multidimensional signal processing
DOI: 10.1109/MSP.2024.3401621
CLC classification: TM [Electrical Engineering]; TN [Electronics and Communication Technology]
Discipline codes: 0808; 0809
Abstract
Despite the many successful applications of deep learning models for multidimensional signal and image processing, most traditional neural networks process data represented by (multidimensional) arrays of real numbers. The intercorrelation between feature channels is usually expected to be learned from the training data, requiring numerous parameters and careful training. In contrast, vector-valued neural networks (referred to as V-nets) are conceived to process arrays of vectors and naturally consider the intercorrelation between feature channels. Consequently, they usually have fewer parameters and often undergo more robust training than traditional neural networks. This article aims to present a broad framework for V-nets. In this context, hypercomplex-valued neural networks are regarded as vector-valued models with additional algebraic properties. Furthermore, this article explains the relationship between vector-valued and traditional neural networks. To be precise, a V-net can be obtained by placing restrictions on a real-valued model to consider the intercorrelation between feature channels. Finally, I show how V-nets, including hypercomplex-valued neural networks, can be implemented in current deep learning libraries as real-valued networks.
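The abstract's central claim — that a vector- or hypercomplex-valued layer is a real-valued layer with a structured (restricted) weight matrix — can be illustrated with a minimal NumPy sketch. This is an illustrative example, not code from the article: for a complex-valued dense layer with weights W = A + iB, the equivalent real-valued map on stacked real/imaginary parts is the block matrix [[A, -B], [B, A]]. The function name `complex_as_real` is hypothetical.

```python
import numpy as np

def complex_as_real(W):
    """Map a complex weight matrix W = A + iB to the real block matrix
    [[A, -B], [B, A]], which performs the same linear map on the
    stacked vector [Re(x); Im(x)]."""
    A, B = W.real, W.imag
    return np.block([[A, -B], [B, A]])

rng = np.random.default_rng(0)
W = rng.standard_normal((3, 2)) + 1j * rng.standard_normal((3, 2))
x = rng.standard_normal(2) + 1j * rng.standard_normal(2)

# Direct complex-valued layer output
y = W @ x

# Equivalent real-valued computation in an ordinary real network
x_stacked = np.concatenate([x.real, x.imag])
y_stacked = complex_as_real(W) @ x_stacked

assert np.allclose(y_stacked, np.concatenate([y.real, y.imag]))
```

Note the parameter saving the abstract alludes to: the structured 6×4 real matrix here is determined by only 2·3·2 = 12 free parameters, whereas an unconstrained real layer of the same shape would have 24, and the restriction is exactly what encodes the intercorrelation between the two feature channels.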
Pages: 49-58
Page count: 10