Trainable Preprocessing for Reduced Precision Neural Networks

Cited by: 0
Authors
Csordas, Gabor [3 ]
Denolf, Kristof [1 ,2 ]
Fraser, Nicholas [1 ,2 ]
Pappalardo, Alessandro [1 ,2 ]
Vissers, Kees [1 ,2 ]
Affiliations
[1] Xilinx Res Labs, Longmont, CO USA
[2] Xilinx Res Labs, Dublin, Ireland
[3] Ecole Polytech Fed Lausanne, Lausanne, Switzerland
Source
29TH EUROPEAN SIGNAL PROCESSING CONFERENCE (EUSIPCO 2021) | 2021
Keywords
Data Preprocessing; Quantized Neural Networks;
DOI
Not available
CLC Classification Number
O42 [Acoustics];
Discipline Code
070206 ; 082403 ;
Abstract
Applications of neural networks are emerging in many fields and are frequently implemented in embedded environments, introducing power, throughput, and latency constraints alongside accuracy. Although practical computer vision solutions always involve some form of preprocessing, most research focuses on the network itself. As a result, the preprocessing remains optimized for human perception and is not tuned to neural networks. We propose optimizing the preprocessing along with the network using backpropagation and gradient descent. This opens up the accuracy-versus-implementation-cost design space toward more cost-efficient implementations by exploiting reduced-precision input. In particular, we evaluate the effect of two preprocessing techniques, color conversion and dithering, on the CIFAR10 and ImageNet datasets with different networks.
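The abstract describes training the preprocessing jointly with the network via backpropagation. The sketch below illustrates that general idea under assumptions; it is not the authors' implementation. It prepends a learnable color-conversion layer (a 3x3 channel-mixing matrix initialized to the BT.601 RGB-to-YCbCr transform) and a uniform low-precision quantizer with a straight-through estimator, so gradients reach the color matrix while the network sees reduced-precision input.

```python
# Hypothetical sketch (PyTorch): trainable color conversion + input quantization,
# optimized end to end with the downstream network. Not the paper's code.
import torch
import torch.nn as nn


class TrainableColorConversion(nn.Module):
    """Learnable per-pixel linear color transform, initialized to RGB->YCbCr (BT.601)."""
    def __init__(self):
        super().__init__()
        init = torch.tensor([[ 0.299,  0.587,  0.114],
                             [-0.169, -0.331,  0.500],
                             [ 0.500, -0.419, -0.081]])
        self.weight = nn.Parameter(init)                      # trained jointly with the network
        self.bias = nn.Parameter(torch.tensor([0.0, 0.5, 0.5]))

    def forward(self, x):                                     # x: (N, 3, H, W) in [0, 1]
        return torch.einsum('oc,nchw->nohw', self.weight, x) + self.bias[None, :, None, None]


class STEQuantize(nn.Module):
    """Uniform quantizer to `bits` bits; straight-through estimator in the backward pass."""
    def __init__(self, bits=4):
        super().__init__()
        self.levels = 2 ** bits - 1

    def forward(self, x):
        x = x.clamp(0.0, 1.0)
        q = torch.round(x * self.levels) / self.levels
        return x + (q - x).detach()                           # identity gradient (STE)


# Preprocessing followed by a toy CNN, all trained with ordinary backprop/SGD.
model = nn.Sequential(
    TrainableColorConversion(),
    STEQuantize(bits=4),                                      # reduced-precision network input
    nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(32, 10),                                        # e.g. CIFAR10 classes
)

if __name__ == "__main__":
    x = torch.rand(8, 3, 32, 32)                              # dummy CIFAR10-sized batch
    y = torch.randint(0, 10, (8,))
    loss = nn.functional.cross_entropy(model(x), y)
    loss.backward()                                           # gradients reach the color matrix too
    print(model[0].weight.grad.abs().mean())
```

A learned dithering stage could be treated analogously, with its parameters exposed to the same optimizer; the key design choice is that the quantizer's straight-through estimator keeps the preprocessing trainable despite the non-differentiable rounding.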
Pages: 1546 - 1550
Page count: 5