Model reduction of feed forward neural networks for resource-constrained devices

Cited by: 0
Authors
Evangelia Fragkou
Marianna Koultouki
Dimitrios Katsaros
Affiliations
[1] University of Thessaly, Department of Electrical and Computer Engineering
Source
Applied Intelligence | 2023 / Volume 53
Keywords
Scale-free; Network science; Model reduction; Training; Feed forward neural networks; Deep learning
DOI
Not available
Abstract
Multilayer neural architectures with a complete bipartite topology between layers have very high training time and memory requirements. Solid evidence suggests that not every connection contributes to performance; thus, network sparsification techniques have emerged. We draw inspiration from the topology of real biological neural networks, which are scale-free. Instead of the usual complete bipartite topology between layers, we start from structured sparse topologies known from network science, e.g., scale-free, and end up again in a structured sparse topology, e.g., scale-free. Moreover, we apply smart link-rewiring methods to construct these sparse topologies. Thus, the number of trainable parameters is reduced, which directly lowers both training time and memory requirements. We design several variants of our concept (SF2SFrand, SF2SFba, SF2SF5, SF2SW, and SW2SW), treating the neural network topology as scale-free or small-world in each case. In our experiments, 30% of the links in the network are cut and replaced at every epoch according to a prescribed rewiring method. Our best-performing method, the one that starts from a scale-free topology and produces a scale-free-like topology (SF2SFrand), reduces training time without sacrificing neural network accuracy, while also cutting the memory required to store the network.
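
To make the epoch-wise "cut 30% of the links and rewire" idea from the abstract concrete, below is a minimal NumPy sketch of a masked sparse layer whose weakest links are pruned and replaced with new links drawn preferentially toward well-connected neurons, so the topology stays scale-free-like. The magnitude-based pruning criterion, the degree-proportional rewiring rule, the rewire_scale_free function, and the layer sizes are all illustrative assumptions, not the authors' implementation.

# Hypothetical sketch: prune 30% of active links per epoch and rewire them
# with preferential attachment; not the paper's exact SF2SFrand procedure.
import numpy as np

rng = np.random.default_rng(0)

def rewire_scale_free(weights, mask, fraction=0.30):
    """Cut `fraction` of the active links (smallest |weight|) and re-add the
    same number of links, preferring already well-connected neurons so that
    the resulting topology remains scale-free-like."""
    active = np.argwhere(mask)                      # indices of current links
    n_cut = int(fraction * len(active))
    if n_cut == 0:
        return weights, mask

    # 1) Prune: drop the n_cut active links with the smallest magnitude.
    mags = np.abs(weights[mask.astype(bool)])
    order = np.argsort(mags)
    for i, j in active[order[:n_cut]]:
        mask[i, j] = 0
        weights[i, j] = 0.0

    # 2) Rewire: add n_cut new links with probability proportional to the
    #    degrees of the input and output neurons (+1 to avoid zero weights).
    row_deg = mask.sum(axis=1) + 1.0
    col_deg = mask.sum(axis=0) + 1.0
    p = np.outer(row_deg, col_deg)
    p[mask.astype(bool)] = 0.0                      # only currently empty slots
    p = p.ravel() / p.sum()
    new = rng.choice(p.size, size=n_cut, replace=False, p=p)
    rows, cols = np.unravel_index(new, mask.shape)
    mask[rows, cols] = 1
    weights[rows, cols] = rng.normal(0.0, 0.01, size=n_cut)  # small re-init
    return weights, mask

# Usage: a 784 -> 300 layer kept at ~5% density, rewired once per epoch.
n_in, n_out, density = 784, 300, 0.05
mask = (rng.random((n_in, n_out)) < density).astype(np.int8)
weights = rng.normal(0.0, 0.1, (n_in, n_out)) * mask
for epoch in range(10):
    # ... forward/backward passes would update `weights * mask` here ...
    weights, mask = rewire_scale_free(weights, mask, fraction=0.30)

Because the mask stays at a fixed density, only the surviving links need to be stored and trained, which is the source of the training-time and memory savings described above.
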
Pages: 14102 - 14127
Number of pages: 25
Related papers
50 records in total
  • [1] Model reduction of feed forward neural networks for resource-constrained devices
    Fragkou, Evangelia
    Koultouki, Marianna
    Katsaros, Dimitrios
    [J]. APPLIED INTELLIGENCE, 2023, 53 (11) : 14102 - 14127
  • [2] Iterative neural networks for adaptive inference on resource-constrained devices
    Leroux, Sam
    Verbelen, Tim
    Simoens, Pieter
    Dhoedt, Bart
    [J]. NEURAL COMPUTING & APPLICATIONS, 2022, 34 (13): 10321 - 10336
  • [3] Iterative neural networks for adaptive inference on resource-constrained devices
    Sam Leroux
    Tim Verbelen
    Pieter Simoens
    Bart Dhoedt
    [J]. Neural Computing and Applications, 2022, 34 : 10321 - 10336
  • [4] Soft Error Reliability Assessment of Neural Networks on Resource-constrained IoT Devices
    Abich, Geancarlo
    Gaya, Jonas
    Reis, Ricardo
    Ost, Luciano
    [J]. 2020 27TH IEEE INTERNATIONAL CONFERENCE ON ELECTRONICS, CIRCUITS AND SYSTEMS (ICECS), 2020
  • [5] Achieving High Efficiency: Resource sharing techniques in artificial neural networks for resource-constrained devices
    Gorbounov, Y.
    Chen, H.
    [J]. 1ST WORKSHOP ON SOLITON THEORY, NONLINEAR DYNAMICS AND MACHINE LEARNING, 2024, 2719
  • [6] Designing resource-constrained neural networks using neural architecture search targeting embedded devices
    Cassimon, Amber
    Vanneste, Simon
    Bosmans, Stig
    Mercelis, Siegfried
    Hellinckx, Peter
    [J]. INTERNET OF THINGS, 2020, 12
  • [7] Resource-Constrained Neural Architecture Search on Edge Devices
    Lyu, Bo
    Yuan, Hang
    Lu, Longfei
    Zhang, Yunye
    [J]. IEEE TRANSACTIONS ON NETWORK SCIENCE AND ENGINEERING, 2022, 9 (01): 134 - 142
  • [8] A multilayer feed-forward neural network (MLFNN) for the resource-constrained project scheduling problem (RCPSP)
    Golab, Amir
    Gooya, Ehsan Sedgh
    Al Falou, Ayman
    Cabon, Mikael
    [J]. DECISION SCIENCE LETTERS, 2022, 11 (04) : 407 - 418
  • [9] A Design Strategy for the Efficient Implementation of Random Basis Neural Networks on Resource-Constrained Devices
    Edoardo Ragusa
    Christian Gianoglio
    Rodolfo Zunino
    Paolo Gastaldo
    [J]. Neural Processing Letters, 2020, 51 : 1611 - 1629
  • [10] A Design Strategy for the Efficient Implementation of Random Basis Neural Networks on Resource-Constrained Devices
    Ragusa, Edoardo
    Gianoglio, Christian
    Zunino, Rodolfo
    Gastaldo, Paolo
    [J]. NEURAL PROCESSING LETTERS, 2020, 51 (02) : 1611 - 1629