A comparison of quaternion neural network backpropagation algorithms

Cited by: 2
Authors
Bill, Jeremiah [1]
Cox, Bruce A. [1]
Champagne, Lance [1]
Affiliation
[1] Air Force Inst Technol, Dept Operat Sci, 2950 Hobson Way, Wright Patterson AFB, OH 45433 USA
Keywords
Clifford algebra; Clifford neural networks; Design of experiments; Geometric algebra; GHR calculus; Hypercomplex numbers; Latin hypercube sampling; Quaternion neural networks; Space filling designs; CLASSIFICATION; DESCENT;
DOI
10.1016/j.eswa.2023.120448
Chinese Library Classification: TP18 [Artificial Intelligence Theory]
Discipline Classification Codes: 081104; 0812; 0835; 1405
Abstract
This research paper focuses on quaternion neural networks (QNNs), a type of neural network wherein the weights, biases, and input values are all represented as quaternion numbers. Previous studies have shown that QNNs outperform real-valued neural networks on basic tasks and show promise in high-dimensional problem spaces. However, research on QNNs has been fragmented, with contributions from different mathematical and engineering domains leading to unintentional overlap in the QNN literature. This work aims to unify existing research by evaluating four distinct QNN backpropagation algorithms, including the novel GHR-calculus backpropagation algorithm, and by providing concise, scalable implementations of each algorithm in a modern compiled programming language. Additionally, the authors apply a robust Design of Experiments (DoE) methodology to compare the accuracy and runtime of each algorithm. The experiments demonstrate that the Clifford Multilayer Perceptron (CMLP) learning algorithm yields statistically significant improvements in network test-set accuracy while maintaining runtime comparable to the other three algorithms across four distinct regression tasks. By unifying existing research and comparing different QNN training algorithms, this work establishes a state-of-the-art baseline and provides important insights into the potential of QNNs for solving high-dimensional problems.
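For context on the abstract's core idea, a quaternion neuron replaces each real-valued weight and input with a quaternion and uses the non-commutative Hamilton product in place of scalar multiplication. The following is a minimal illustrative sketch, not the paper's implementation (the paper uses a compiled language, and the "split" componentwise activation here is an assumed, though common, QNN convention):

```python
import numpy as np

def hamilton(q, p):
    """Hamilton product of two quaternions given as (w, x, y, z) arrays."""
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = p
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def quaternion_neuron(inputs, weights, bias):
    """Forward pass of one quaternion neuron: accumulate Hamilton
    products of weights and inputs, add a quaternion bias, then apply
    a 'split' tanh activation to each component independently."""
    s = bias.copy()
    for w, x in zip(weights, inputs):
        s += hamilton(w, x)  # quaternion weight times quaternion input
    return np.tanh(s)

# Sanity check on the algebra: i * j = k
i = np.array([0.0, 1.0, 0.0, 0.0])
j = np.array([0.0, 0.0, 1.0, 0.0])
print(hamilton(i, j))  # -> [0. 0. 0. 1.]
```

Because the Hamilton product is non-commutative (j * i = -k), the ordering of weights and inputs matters, which is one reason quaternion backpropagation rules such as the GHR calculus differ from their real-valued counterparts.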
Pages: 18