Deep Learning;
Deep Neural Networks;
Number Representation;
DOI:
None
CLC Number:
TP3 [Computing Technology, Computer Technology];
Subject Classification Number:
0812;
Abstract:
Practical deep neural networks have a large number of weight parameters, and dynamic fixed-point formats have been used to represent them efficiently. A dynamic fixed-point representation shares a scaling factor among a group of numbers; conventionally, all the weights in a layer form one such group. In this paper, we first explore the design space of dynamic fixed-point neuromorphic computing systems and show that a small group size is indispensable in neuromorphic architectures, because it is natural to group together the weights associated with a single neuron. We then present a dynamic fixed-point representation designed for neuromorphic computing systems. Our experimental results show that the predictive performance of recent deep neural networks such as AlexNet improves significantly with the proposed representation compared with the traditional fixed-point format.
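To illustrate the core idea of the abstract, the following is a minimal sketch of dynamic fixed-point quantization in which each group of weights shares one power-of-two scaling factor (exponent), and groups are formed per neuron (one group per row of the weight matrix) rather than per layer. The function names, bit width, and layer shape are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

BITS = 8  # assumed word length; each value stored as a signed BITS-bit integer

def quantize_group(weights, bits=BITS):
    """Quantize one group of weights to dynamic fixed point.

    All values in the group share a single power-of-two scaling
    factor chosen from the group's largest magnitude.
    """
    max_abs = np.max(np.abs(weights))
    if max_abs == 0.0:
        return np.zeros(weights.shape, dtype=np.int32), 0
    # Shared exponent: smallest power of two covering the largest magnitude.
    exp = int(np.ceil(np.log2(max_abs)))
    scale = 2.0 ** (exp - (bits - 1))
    q = np.clip(np.round(weights / scale),
                -(2 ** (bits - 1)), 2 ** (bits - 1) - 1)
    return q.astype(np.int32), exp

def dequantize_group(q, exp, bits=BITS):
    """Recover approximate real values from integers and the shared exponent."""
    return q * 2.0 ** (exp - (bits - 1))

# Per-neuron grouping: each neuron's incoming weights form one group,
# so each row of W gets its own shared exponent (hypothetical 4x16 layer).
rng = np.random.default_rng(0)
W = rng.standard_normal((4, 16))
recovered = np.vstack([dequantize_group(*quantize_group(row)) for row in W])
```

With per-layer grouping a single outlier weight would force a coarse scale on the whole layer; per-neuron groups keep the quantization step matched to each neuron's own weight range, which is the motivation for small group sizes stated above.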