On derivation of stagewise second-order backpropagation by invariant imbedding for multi-stage neural-network learning
Cited by: 0
Authors:
Mizutani, Eiji [1]
Dreyfus, Stuart [2]
Affiliations:
[1] Natl Tsing Hua Univ, Dept Comp Sci, Hsinchu 300, Taiwan
[2] Univ Calif Berkeley, Dept Ind Engn & Operat Res, Berkeley, CA 94720 USA
Keywords:
DOI:
Not available
Chinese Library Classification (CLC):
TP18 [Artificial Intelligence Theory];
Discipline Codes:
081104 ;
0812 ;
0835 ;
1405 ;
Abstract:
We present a simple, intuitive argument based on "invariant imbedding" in the spirit of dynamic programming to derive a stagewise second-order backpropagation (BP) algorithm. The method evaluates the Hessian matrix of a general objective function efficiently by exploiting the multi-stage structure embedded in a given neural-network model such as a multilayer perceptron (MLP). In consequence, for instance, our stagewise BP can compute the full Hessian matrix "faster" than the standard method that evaluates the Gauss-Newton Hessian matrix alone by rank updates in nonlinear least squares learning. Through our derivation, we also show how the procedure serves to develop advanced learning algorithms; in particular, we explain how the introduction of "stage costs" leads to alternative systematic implementations of multi-task learning and weight decay.
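The abstract contrasts the full Hessian of the learning objective with the Gauss-Newton term assembled by rank updates in nonlinear least-squares learning. As a rough illustrative sketch of that distinction only, not the paper's stagewise invariant-imbedding derivation, the following JAX snippet computes both quantities for a hypothetical toy MLP by generic automatic differentiation; the network size, parameter packing, and variable names are assumptions made here for illustration.

```python
import jax
import jax.numpy as jnp

# Toy MLP for illustration: 2 inputs -> 3 tanh hidden units -> 1 output,
# with all weights packed into one flat parameter vector (13 entries).
n_in, n_hid = 2, 3

def unpack(w):
    W1 = w[:n_hid * n_in].reshape(n_hid, n_in)
    b1 = w[n_hid * n_in : n_hid * n_in + n_hid]
    W2 = w[n_hid * n_in + n_hid : n_hid * n_in + 2 * n_hid]
    b2 = w[-1]
    return W1, b1, W2, b2

def mlp(w, x):
    W1, b1, W2, b2 = unpack(w)
    return jnp.dot(W2, jnp.tanh(W1 @ x + b1)) + b2

def residuals(w, X, y):
    # Residual vector r_i = f(w, x_i) - y_i of the least-squares problem.
    return jax.vmap(lambda x: mlp(w, x))(X) - y

def loss(w, X, y):
    return 0.5 * jnp.sum(residuals(w, X, y) ** 2)

# Random toy data (purely illustrative).
w = jax.random.normal(jax.random.PRNGKey(0), (n_hid * n_in + 2 * n_hid + 1,)) * 0.1
X = jax.random.normal(jax.random.PRNGKey(1), (8, n_in))
y = jax.random.normal(jax.random.PRNGKey(2), (8,))

# Full Hessian of the objective (the quantity stagewise second-order BP evaluates exactly).
H_full = jax.hessian(loss)(w, X, y)

# Gauss-Newton approximation J^T J, i.e. the sum of rank-one updates;
# it drops the residual-curvature term sum_i r_i * Hessian(r_i).
J = jax.jacobian(residuals)(w, X, y)   # shape: (num_samples, num_params)
H_gn = J.T @ J

print(jnp.linalg.norm(H_full - H_gn))  # norm of the neglected curvature term
```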
Pages: 4762 / +
Number of pages: 2