Adaptive orthogonal gradient descent algorithm for fully complex-valued neural networks

Cited by: 2
Authors
Zhao, Weijing [1 ]
Huang, He [1 ]
Affiliations
[1] Soochow Univ, Sch Elect & Informat Engn, Suzhou 215006, Peoples R China
Keywords
Fully complex-valued neural networks; Adaptive complex-valued stepsize; Orthogonal directions; Decoupling design; Gradient descent; Operator
DOI
10.1016/j.neucom.2023.126358
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
For optimization algorithms of fully complex-valued neural networks, a complex-valued stepsize helps the training escape from saddle points. In this paper, an adaptive orthogonal gradient descent algorithm with complex-valued stepsize is proposed for the efficient training of fully complex-valued neural networks. The basic idea is that, at each iteration, the search direction is constructed as a combination of two orthogonal gradient directions by using the algebraic representation of the complex-valued stepsize. It is then shown that the determination of a suitable complex-valued stepsize is facilitated by a decoupling method, so that the computational complexity of the training process is greatly reduced. Experiments on pattern classification, nonlinear channel equalization and signal prediction confirm the advantages of the proposed algorithm. © 2023 Elsevier B.V. All rights reserved.
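The abstract's basic idea can be illustrated with a minimal NumPy sketch: for a complex stepsize mu = a + ib and gradient g, the update step -mu*g decomposes algebraically into a*(-g) plus b*(-i*g), and i*g is orthogonal to g when complex numbers are viewed as vectors in the real plane. The cost function and stepsize values below are illustrative assumptions, not the paper's algorithm or its adaptive decoupling scheme.

```python
import numpy as np

TARGET = 1.0 + 2.0j  # hypothetical minimizer of the toy cost

def cost(w):
    # simple real-valued cost of a complex variable: |w - TARGET|^2
    return np.abs(w - TARGET) ** 2

def grad(w):
    # Wirtinger-style gradient of |w - TARGET|^2 w.r.t. conj(w)
    return w - TARGET

# complex-valued stepsize mu = a + i*b (values chosen for illustration)
mu = 0.1 + 0.05j

w = 0.0 + 0.0j
for _ in range(200):
    g = grad(w)
    # -mu*g = a*(-g) + b*(-1j*g): a step along the negative gradient
    # plus a step along the direction orthogonal to it
    w = w - mu * g

# orthogonality of the two directions: Re(conj(g) * (i*g)) = Re(i*|g|^2) = 0
g = grad(0.5 + 0.5j)
print(abs((np.conj(g) * (1j * g)).real))  # ~0
print(abs(w - TARGET))                    # iterate approaches the minimizer
```

Because |1 - mu| < 1 here, the error contracts at every iteration; the orthogonal component b*(-i*g) rotates the step off the pure gradient ray, which is the mechanism the paper credits with helping training escape saddle points.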
Pages: 8