StressedNets: Efficient feature representations via stress-induced evolutionary synthesis of deep neural networks

Cited: 4
Authors
Shafiee, Mohammad Javad [1 ]
Chwyl, Brendan [1 ]
Li, Francis [1 ]
Chen, Rongyan [2 ]
Karg, Michelle [2 ]
Scharfenberger, Christian [2 ]
Wong, Alexander [1 ]
Affiliations
[1] Univ Waterloo, Dept Syst Design Engn, Waterloo, ON, Canada
[2] ADC Automot Distance Control Syst GmbH Continental, Lindau, Germany
Funding
Natural Sciences and Engineering Research Council of Canada;
Keywords
Deep neural networks; Stress-induced evolutionary synthesis; Evolutionary synthesis;
DOI
10.1016/j.neucom.2019.03.028
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The computational complexity of leveraging deep neural networks for extracting deep feature representations is a significant barrier to their widespread adoption. This is particularly a bottleneck for use in embedded devices and applications such as self-driving cars. One promising strategy for addressing the complexity issue is the notion of evolutionary synthesis of deep neural networks, which has been demonstrated to successfully produce highly efficient deep neural networks while retaining modeling performance. Here, we further extend the evolutionary synthesis strategy to achieve efficient feature extraction. A stress-induced evolutionary synthesis framework is proposed in which stress signals are imposed upon the synapses of a deep neural network during the training step. This process induces stress and steers the synthesis process towards the production of more efficient deep neural networks over successive generations, improving model fidelity at greater efficiency. Applying stress during the training phase helps a network adapt itself to the changes that occur at the evolution step. The proposed stress-induced evolutionary synthesis approach is evaluated on a variety of deep neural network architectures (LeNet5, AlexNet, and YOLOv2) and different tasks (object classification and object detection) to synthesize efficient StressedNets over multiple generations. Experimental results demonstrate the efficacy of the proposed framework in synthesizing StressedNets with significant improvements in network architecture efficiency (e.g., 40x for AlexNet and 33x for YOLOv2). The synthesized networks also deliver notable speed improvements (e.g., 5.5x inference speed-up for YOLOv2 on an Nvidia Tegra X1 mobile processor). (C) 2019 Elsevier B.V. All rights reserved.
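As a rough illustration of the idea summarized in the abstract, the generational loop can be sketched as follows. The paper's actual stress signal and synthesis rules are not reproduced in this record, so everything below is an assumption for illustration only: stress is modeled as random multiplicative attenuation of a fraction of synaptic weights during training, and synthesis of the next generation is modeled as magnitude-based pruning. The function names `apply_stress` and `synthesize_offspring` are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def apply_stress(weights, stress_rate=0.2, stress_scale=0.5):
    """Impose stress on a random fraction of synapses by attenuating them.

    This stands in for the training-phase stress signal: a stressed
    synapse must stay strong despite attenuation to survive synthesis.
    """
    mask = rng.random(weights.shape) < stress_rate
    stressed = weights.copy()
    stressed[mask] *= stress_scale
    return stressed

def synthesize_offspring(weights, keep_fraction=0.5):
    """Produce the next generation by keeping only the strongest synapses."""
    threshold = np.quantile(np.abs(weights), 1.0 - keep_fraction)
    return weights * (np.abs(weights) >= threshold)

# Toy "network": a single weight matrix evolved over three generations.
w = rng.normal(size=(8, 8))
for generation in range(3):
    w = apply_stress(w)            # stress imposed during the training phase
    w = synthesize_offspring(w)    # prune weak synapses for the next generation

density = np.count_nonzero(w) / w.size
print(f"surviving synapse density: {density:.2f}")
```

In this toy setting, each generation retains the strongest half of the synapses, so the surviving density settles at the `keep_fraction`; in the paper the efficiency gains (e.g., 40x for AlexNet) arise from the interplay of stress and synthesis over real training, not from a fixed pruning quota.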
Pages: 93-105
Page count: 13