Optimal pruning in neural networks

Cited: 5
Authors
Barbato, DML
Kinouchi, O
Institutions
[1] Univ Sao Paulo, Fac Filosofia Ciencias & Letras Ribeirao Pret, Dept Fis & Matemat, BR-14040901 Ribeirao Preto, SP, Brazil
[2] Univ Paulista, BR-13043055 Campinas, SP, Brazil
Source
PHYSICAL REVIEW E | 2000, Vol. 62, No. 06
Keywords
DOI
10.1103/PhysRevE.62.8387
Chinese Library Classification (CLC)
O35 [Fluid Mechanics]; O53 [Plasma Physics];
Subject Classification Codes
070204 ; 080103 ; 080704 ;
Abstract
We study pruning strategies in simple perceptrons subjected to supervised learning. Our analytical results, obtained through the statistical mechanics approach to learning theory, are independent of the learning algorithm used in the training process. We calculate the post-training distribution $P(J)$ of synaptic weights, which depends only on the overlap $\rho_0$ achieved by the learning algorithm before pruning and on the fraction $\kappa$ of relevant weights in the teacher network. From this distribution, we calculate the optimal strategy for deleting small weights. The optimal pruning threshold grows from zero as $\theta_{\mathrm{opt}}(\rho_0,\kappa) \propto [\rho_0 - \rho_c(\kappa)]^{1/2}$ above a critical value $\rho_c(\kappa)$. Thus, the elimination of weak synapses enhances network performance only after a critical learning period. Possible implications for biological pruning phenomena are discussed.
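As a concrete illustration of the thresholded weight deletion studied here, the sketch below builds a teacher perceptron in which only a fraction $\kappa$ of the weights are relevant, synthesizes a student weight vector with a prescribed pre-pruning overlap $\rho_0$, and scans pruning thresholds to find the one that maximizes the post-pruning overlap with the teacher. This is a minimal numerical sketch for intuition only: the way the student vector is generated and helper names such as `prune_small_weights` are assumptions made for the example; the paper itself derives $\theta_{\mathrm{opt}}$ analytically from the weight distribution $P(J)$.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 1000      # number of synapses
kappa = 0.5   # fraction of relevant (nonzero) teacher weights
rho_0 = 0.6   # overlap reached by the (unspecified) learning algorithm

# Teacher perceptron: only a fraction kappa of its weights are relevant.
B = rng.normal(size=N)
B[rng.random(N) > kappa] = 0.0
B *= np.sqrt(N) / np.linalg.norm(B)          # normalize so |B|^2 = N

# Student weights J with prescribed overlap rho_0 with the teacher
# (a stand-in for the outcome of any supervised training run).
noise = rng.normal(size=N)
noise -= (noise @ B) / (B @ B) * B           # keep only the part orthogonal to B
J = rho_0 * B + np.sqrt(1.0 - rho_0**2) * np.sqrt(N) * noise / np.linalg.norm(noise)

def overlap(J, B):
    """Normalized teacher-student overlap rho = J.B / (|J| |B|)."""
    nJ = np.linalg.norm(J)
    return 0.0 if nJ == 0.0 else float(J @ B / (nJ * np.linalg.norm(B)))

def prune_small_weights(J, theta):
    """Delete (set to zero) every weight whose magnitude is below theta."""
    Jp = J.copy()
    Jp[np.abs(Jp) < theta] = 0.0
    return Jp

# Scan thresholds and report the one giving the best post-pruning overlap.
thetas = np.linspace(0.0, 2.0, 101)
rhos = [overlap(prune_small_weights(J, t), B) for t in thetas]
best = int(np.argmax(rhos))
print(f"rho before pruning: {overlap(J, B):.3f}")
print(f"best threshold    : {thetas[best]:.2f} -> rho after pruning: {rhos[best]:.3f}")
```

Consistent with the result quoted above, pruning only pays off once $\rho_0$ exceeds the critical value $\rho_c(\kappa)$; below it the optimal threshold is zero and eliminating weak synapses does not improve the overlap.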
Pages: 8387 - 8394
Page count: 8
Related Papers
50 records in total
  • [1] Pruning Deep Neural Networks by Optimal Brain Damage
    Liu, Chao
    Zhang, Zhiyong
    Wang, Dong
    [J]. 15TH ANNUAL CONFERENCE OF THE INTERNATIONAL SPEECH COMMUNICATION ASSOCIATION (INTERSPEECH 2014), VOLS 1-4, 2014, : 1092 - 1095
  • [2] Optimal pruning of feedforward neural networks based upon the Schmidt procedure
    Maldonado, FJ
    Manry, MT
    [J]. THIRTY-SIXTH ASILOMAR CONFERENCE ON SIGNALS, SYSTEMS & COMPUTERS - CONFERENCE RECORD, VOLS 1 AND 2, CONFERENCE RECORD, 2002, : 1024 - 1028
  • [3] A node pruning algorithm based on optimal brain surgeon for feedforward neural networks
    Xu, Jinhua
    Ho, Daniel W. C.
    [J]. ADVANCES IN NEURAL NETWORKS - ISNN 2006, PT 1, 2006, 3971 : 524 - 529
  • [4] An optimal-score-based filter pruning for deep convolutional neural networks
    Sawant, Shrutika S.
    Bauer, J.
    Erick, F. X.
    Ingaleshwar, Subodh
    Holzer, N.
    Ramming, A.
    Lang, E. W.
    Götz, Th.
    [J]. APPLIED INTELLIGENCE, 2022, 52 (15) : 17557 - 17579
  • [5] HeadStart: Enforcing Optimal Inceptions in Pruning Deep Neural Networks for Efficient Inference on GPGPUs
    Lin, Ning
    Lu, Hang
    Wei, Xin
    Li, Xiaowei
    [J]. PROCEEDINGS OF THE 2019 56TH ACM/EDAC/IEEE DESIGN AUTOMATION CONFERENCE (DAC), 2019,
  • [6] On the use of a pruning prior for neural networks
    Goutte, C
    [J]. NEURAL NETWORKS FOR SIGNAL PROCESSING VI, 1996, : 52 - 61
  • [7] Spectral Pruning for Recurrent Neural Networks
    Furuya, Takashi
    Suetake, Kazuma
    Taniguchi, Koichi
    Kusumoto, Hiroyuki
    Saiin, Ryuji
    Daimon, Tomohiro
    [J]. INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 151, 2022, 151
  • [8] Pruning product unit neural networks
    Ismail, A
    Engelbrecht, AP
    [J]. PROCEEDING OF THE 2002 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, VOLS 1-3, 2002, : 257 - 262
  • [9] Automatic Pruning for Quantized Neural Networks
    Guerra, Luis
    Drummond, Tom
    [J]. 2021 INTERNATIONAL CONFERENCE ON DIGITAL IMAGE COMPUTING: TECHNIQUES AND APPLICATIONS (DICTA 2021), 2021, : 290 - 297