Double-exponential sigmoidal functions for neural networks (Doppelexponentielle Sigmoidfunktionen für Neuronale Netze)

Authors
M. Heiss
S. Kampl
Keywords
neural network; activation function; sigmoid; no multiplication; double-exponential function; numeric simplicity; Neuronale Netze (neural networks); Transferfunktion (transfer function); Sigmoidfunktion (sigmoid function); multiplikationsfrei (multiplication-free); doppel-exponentiell (double-exponential); Recheneffizienz (computational efficiency)
DOI
10.1007/BF03159051
Abstract
A computationally efficient sigmoidal activation function, called a double-exponential signal function, is presented, and its properties are compared with those of other signal functions. The sigmoidal function is monotonically increasing, continuous in all derivatives, and its output is 0.5 for zero input. The weight multiplication can be replaced by an addition when the training of the network is performed offline. We also present an approximation of this signal function, called a polygonal signal function, which reduces the computational effort to bit-set and shift operations.
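The abstract states the key properties but not the formula itself. As a hedged illustration only, the following C sketch assumes one base-2 double-exponential form that satisfies the stated properties (monotonically increasing, output 0.5 at zero input) and shows how such a form lets an offline-trained weight be applied by adding log2(w) to the exponent instead of multiplying, together with a generic shift-and-add (piecewise-linear) evaluation of 2^u in fixed point. The formula and all names here (dexp_sigmoid, weighted_activation, poly_exp2_q16) are assumptions for illustration, not taken from the paper.

/*
 * Illustrative sketch only; the exact double-exponential signal function of
 * Heiss and Kampl is not given in the abstract.  Assumed form:
 *     f(x) = 2^(x-1)        for x <= 0
 *     f(x) = 1 - 2^(-x-1)   for x >  0
 * so f(0) = 0.5 and f is monotonically increasing.
 */
#include <math.h>
#include <stdint.h>
#include <stdio.h>

/* Assumed base-2 double-exponential sigmoid (illustrative, not from the paper). */
static double dexp_sigmoid(double x)
{
    return (x <= 0.0) ? exp2(x - 1.0) : 1.0 - exp2(-x - 1.0);
}

/* w * f(x) for x <= 0 and w > 0, computed without a multiplication:
 * the offline-trained weight enters only as the precomputed offset log2(w)
 * added to the exponent. */
static double weighted_activation(double x, double log2_w)
{
    return exp2(x - 1.0 + log2_w);   /* equals w * dexp_sigmoid(x) for x <= 0 */
}

/* Generic shift-and-add (piecewise-linear) approximation of 2^u for u <= 0
 * in Q16 fixed point, using 2^u ~= 2^floor(u) * (1 + frac(u)); relative error
 * stays within roughly 6 %.  This is a Mitchell-style sketch, not necessarily
 * the paper's polygonal signal function. */
static uint32_t poly_exp2_q16(int32_t u_q16)    /* u_q16 = round(u * 65536), u <= 0 */
{
    uint32_t m = (uint32_t)(-(int64_t)u_q16);   /* magnitude of u in Q16            */
    uint32_t k = m >> 16;                       /* integer bits of -u               */
    uint32_t r = m & 0xFFFFu;                   /* fractional bits of -u            */
    if (k >= 17) return 0;                      /* result smaller than 1 LSB        */
    return (r == 0) ? (0x10000u >> k) : ((0x20000u - r) >> (k + 1));
}

int main(void)
{
    double w = 0.75;
    double log2_w = log2(w);                    /* precomputed after offline training */

    for (double x = -3.0; x <= 3.0; x += 1.0)
        printf("f(%5.1f) = %.4f\n", x, dexp_sigmoid(x));

    printf("w*f(-1)  direct: %.4f   via exponent addition: %.4f\n",
           w * dexp_sigmoid(-1.0), weighted_activation(-1.0, log2_w));

    printf("2^(-0.5) exact: %.4f   shift-and-add approx: %.4f\n",
           exp2(-0.5), poly_exp2_q16((int32_t)(-0.5 * 65536)) / 65536.0);
    return 0;
}

With a base-2 form, the piecewise-linear branch maps directly onto shift and add operations in fixed point, which is the kind of hardware-friendly evaluation the abstract alludes to for the polygonal signal function.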
Pages: 360-363
Number of pages: 3